Custom `nn.Module` with a Non-standard Initialization
Create a custom `nn.Module` for a simple feed-forward layer. Instead of relying on PyTorch's default initialization, apply a specific, non-standard scheme: for example, initialize the weights from a uniform distribution over a range of your choosing and set the biases to a fixed constant. This exercise focuses on a part of model design that is often taken for granted. You'll need to define an `__init__` method and a `reset_parameters` method where you implement your custom logic.
Verification: Print the weights and biases of your custom module after initialization. Confirm that the values are within the range you specified and that the biases are set to your chosen constant.
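One possible solution might look like the sketch below. The range `[-0.1, 0.1]` for the weights and the constant `0.5` for the biases are arbitrary illustrative choices, not values mandated by the exercise; substitute your own.

```python
import torch
import torch.nn as nn


class CustomLinear(nn.Module):
    """A feed-forward layer with a non-standard initialization scheme."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # Allocate uninitialized parameters; reset_parameters fills them in.
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.empty(out_features))
        self.reset_parameters()

    def reset_parameters(self):
        # Illustrative scheme: weights uniform in [-0.1, 0.1], biases fixed at 0.5.
        with torch.no_grad():
            self.weight.uniform_(-0.1, 0.1)
            self.bias.fill_(0.5)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x @ self.weight.t() + self.bias


# Verification: inspect the parameters and confirm the scheme was applied.
layer = CustomLinear(4, 3)
print(layer.weight)
print(layer.bias)
assert layer.weight.min() >= -0.1 and layer.weight.max() <= 0.1
assert torch.all(layer.bias == 0.5)
```

Calling `reset_parameters()` from `__init__` mirrors the convention used by PyTorch's built-in layers, so the module can also be re-initialized later (e.g. via `model.apply(...)`).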