ML Katas

Implementing a Custom Loss Function with `torch.autograd`

medium (<30 mins) · regression · loss · custom function · huber

Create a custom loss function that inherits from `torch.nn.Module` and performs a non-standard calculation, for example a Huber loss, which is less sensitive to outliers than mean squared error. The key is to define the `forward` method using torch operations so that gradients are computed automatically by autograd. There is no need for a custom `torch.autograd.Function` unless the operation is truly non-differentiable.

$$
L_\delta(y, \hat{y}) =
\begin{cases}
\frac{1}{2}(y - \hat{y})^2 & \text{if } |y - \hat{y}| \le \delta \\
\delta\left(|y - \hat{y}| - \frac{1}{2}\delta\right) & \text{otherwise}
\end{cases}
$$
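A minimal sketch of one possible implementation, assuming a `delta` hyperparameter passed to the constructor. Since every operation below is a differentiable torch op, autograd handles the backward pass for free. PyTorch's built-in `torch.nn.HuberLoss` can serve as a reference to check your values against.

```python
import torch
import torch.nn as nn


class HuberLoss(nn.Module):
    """Custom Huber loss built from differentiable torch ops.

    Quadratic for small residuals (|r| <= delta), linear for large ones,
    so outliers contribute less to the gradient than with MSE.
    """

    def __init__(self, delta: float = 1.0):
        super().__init__()
        self.delta = delta

    def forward(self, y_pred: torch.Tensor, y_true: torch.Tensor) -> torch.Tensor:
        residual = torch.abs(y_true - y_pred)
        quadratic = 0.5 * residual ** 2
        linear = self.delta * (residual - 0.5 * self.delta)
        # Element-wise branch; torch.where keeps the computation graph intact.
        loss = torch.where(residual <= self.delta, quadratic, linear)
        return loss.mean()
```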

Verification: Train a simple linear regression model with your custom Huber loss and another with `torch.nn.MSELoss`. On a dataset containing outliers, the Huber-trained model should be less affected by them, resulting in a more robust fit. You can plot the predictions of both models to verify visually; a minimal comparison sketch follows.
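One possible verification sketch, assuming the `HuberLoss` class from the previous snippet is in scope. The synthetic data, injected outliers, and hyperparameters here are illustrative; instead of plotting, it compares the fitted parameters against the true line y = 2x + 1.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic data: y = 2x + 1 plus noise, with a few large outliers injected.
x = torch.linspace(-3, 3, 200).unsqueeze(1)
y = 2 * x + 1 + 0.3 * torch.randn_like(x)
y[::25] += 15.0  # every 25th target becomes an outlier


def fit(loss_fn, epochs=500, lr=0.05):
    """Fit a 1-D linear model with plain SGD under the given loss."""
    model = nn.Linear(1, 1)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return model


huber_model = fit(HuberLoss(delta=1.0))  # custom loss from the sketch above
mse_model = fit(nn.MSELoss())

# The Huber-trained weight and bias should stay closer to the true (2, 1).
print("Huber:", huber_model.weight.item(), huber_model.bias.item())
print("MSE:  ", mse_model.weight.item(), mse_model.bias.item())
```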