Implementing a Custom Loss Function with `torch.autograd`
Create a custom loss function that inherits from `torch.nn.Module` and performs a non-standard calculation, for example a custom Huber loss, which is less sensitive to outliers than mean squared error. The key is to define the `forward` method using `torch` operations so that gradients can be computed automatically; there is no need for a custom `autograd.Function` unless the operation cannot be expressed in differentiable `torch` ops.
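A minimal sketch of such a module, assuming the standard Huber definition (quadratic below a threshold `delta`, linear above it); the class name and `delta` default are illustrative choices:

```python
import torch
import torch.nn as nn

class HuberLoss(nn.Module):
    """Huber loss: quadratic for small residuals, linear for large ones."""

    def __init__(self, delta: float = 1.0):
        super().__init__()
        self.delta = delta

    def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        residual = (pred - target).abs()
        quadratic = 0.5 * residual ** 2
        linear = self.delta * (residual - 0.5 * self.delta)
        # torch.where is differentiable w.r.t. both branches, so autograd
        # builds the backward pass for us -- no custom Function needed.
        return torch.where(residual <= self.delta, quadratic, linear).mean()
```

Because every operation here (`abs`, `where`, arithmetic, `mean`) is a standard `torch` op, calling `.backward()` on the result propagates gradients through the module automatically.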
Verification: Train one simple linear regression model with your custom loss and another with `torch.nn.MSELoss`. On a dataset containing outliers, the model trained with Huber loss should be less affected by them, resulting in a more robust fit. Plot the predictions of both models to verify this visually.
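The verification could be sketched as follows; the synthetic line `y = 2x + 1`, the injected outlier magnitude, and the optimizer settings are all arbitrary choices for illustration:

```python
import torch

torch.manual_seed(0)

# Synthetic data: y = 2x + 1 with noise, plus a few gross outliers.
x = torch.linspace(-3, 3, 100).unsqueeze(1)
y = 2 * x + 1 + 0.1 * torch.randn_like(x)
y[::20] += 15.0  # every 20th point is an outlier

def huber(pred, target, delta=1.0):
    r = (pred - target).abs()
    return torch.where(r <= delta, 0.5 * r ** 2, delta * (r - 0.5 * delta)).mean()

def fit(loss_fn, steps=500, lr=0.05):
    model = torch.nn.Linear(1, 1)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    return model

mse_model = fit(torch.nn.MSELoss())
huber_model = fit(huber)

# The Huber-trained intercept should sit closer to the true value of 1,
# since the MSE fit is dragged upward by the +15 outliers.
print(mse_model.bias.item(), huber_model.bias.item())
```

Plotting `mse_model(x)` and `huber_model(x)` against the scatter of `(x, y)` (e.g. with matplotlib) makes the difference visible: the MSE line is pulled toward the outliers while the Huber line tracks the bulk of the data.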