Implementing a Custom Learning Rate Scheduler
Implement a custom learning rate scheduler that follows a cosine annealing schedule: the learning rate starts high, decreases smoothly to a minimum value, then resets and repeats. Your scheduler should be a subclass of torch.optim.lr_scheduler.LRScheduler and should update the learning rate based on the current epoch.
The schedule is given by

    eta_t = eta_min + (1/2) * (eta_max - eta_min) * (1 + cos(pi * T_cur / T_max))

where T_cur is the current epoch within the cycle and T_max is the total number of epochs for one cycle.
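One possible sketch of such a scheduler is below. The class name and the t_max/eta_min parameter names are illustrative choices, not prescribed by the exercise; note that LRScheduler is the public base-class name in PyTorch 2.x (older releases expose it as _LRScheduler).

```python
import math

import torch
from torch.optim.lr_scheduler import LRScheduler


class CosineAnnealingWithRestarts(LRScheduler):
    """Cosine annealing from each param group's base LR down to `eta_min`,
    restarting from the base LR every `t_max` epochs."""

    def __init__(self, optimizer, t_max, eta_min=0.0, last_epoch=-1):
        self.t_max = t_max
        self.eta_min = eta_min
        super().__init__(optimizer, last_epoch)

    def get_lr(self):
        # Position inside the current cycle; the modulo produces the reset.
        t_cur = self.last_epoch % self.t_max
        return [
            self.eta_min
            + 0.5 * (base_lr - self.eta_min)
            * (1 + math.cos(math.pi * t_cur / self.t_max))
            for base_lr in self.base_lrs
        ]
```

The base class records each param group's initial learning rate in self.base_lrs and calls get_lr() on every step(), so the subclass only needs to implement the formula itself.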
Verification: Train a small model for several epochs and print the learning rate at each epoch. The values should follow the expected cosine curve, with resets at the end of each cycle.