Updating model parameters
In this exercise, you will implement a full training step for the regression problem that you have been working on.
- Instantiate your model parameters, `W` and `b`, and your data `x` and `y_true`. Remember to use a PRNG key.
- Compute the gradients of your loss function with respect to your model parameters.
- Update your parameters using gradient descent. That is, $W \leftarrow W - \eta \nabla_W \mathcal{L}$ and $b \leftarrow b - \eta \nabla_b \mathcal{L}$ for some learning rate $\eta$. You can set $\eta$ to a small constant.
- Wrap this all in a function that you `jit`-compile. How much of a speedup do you see? A sketch of one possible solution follows the list below.
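Below is a minimal sketch of one possible solution. It assumes the linear regression model `y = x @ W + b` and a mean-squared-error loss from the earlier exercises; the data shapes, the "true" parameters used to synthesize `y_true`, and the learning rate value are all assumptions, not the original exercise's exact setup.

```python
import time

import jax
import jax.numpy as jnp

# Instantiate parameters and data with a PRNG key.
key = jax.random.PRNGKey(42)
w_key, b_key, x_key, noise_key = jax.random.split(key, 4)

n_samples, n_features = 128, 3
W = jax.random.normal(w_key, (n_features,))
b = jax.random.normal(b_key, ())
x = jax.random.normal(x_key, (n_samples, n_features))

# Synthetic targets from assumed "true" parameters plus a little noise.
W_true = jnp.array([2.0, -1.0, 0.5])
b_true = 1.0
y_true = x @ W_true + b_true + 0.1 * jax.random.normal(noise_key, (n_samples,))


def loss_fn(W, b, x, y_true):
    """Mean squared error of the linear model y = x @ W + b."""
    y_pred = x @ W + b
    return jnp.mean((y_pred - y_true) ** 2)


LEARNING_RATE = 1e-2  # assumed value; use whatever the exercise suggests


def train_step(W, b, x, y_true):
    """One gradient-descent update of W and b."""
    # Gradients with respect to the first two arguments (W and b).
    grad_W, grad_b = jax.grad(loss_fn, argnums=(0, 1))(W, b, x, y_true)
    W = W - LEARNING_RATE * grad_W
    b = b - LEARNING_RATE * grad_b
    return W, b


train_step_jit = jax.jit(train_step)

# Warm up the jitted version so compilation time is not counted below.
W1, b1 = train_step_jit(W, b, x, y_true)
W1.block_until_ready()

# Rough timing comparison; block_until_ready avoids measuring only dispatch.
start = time.perf_counter()
W2, b2 = train_step(W, b, x, y_true)
W2.block_until_ready()
print("eager :", time.perf_counter() - start)

start = time.perf_counter()
W3, b3 = train_step_jit(W, b, x, y_true)
W3.block_until_ready()
print("jitted:", time.perf_counter() - start)
```

The speedup comes from `jit` tracing the whole step once and fusing it into a single compiled XLA computation, instead of dispatching each operation separately. For a more reliable measurement, run each version many times (for example with `timeit`) rather than timing a single call.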