**Implementing the Adam Optimizer from Scratch**

Implement the **Adam optimizer from scratch** as a subclass of `torch.optim.Optimizer`. You'll need to manage the first-moment vector (a moving average of the gradients) and the second-moment vector (a moving average of the squared gradients), and use them to compute the parameter update.
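Below is a minimal sketch of one way to approach this, assuming the standard Adam update rule (Kingma & Ba, 2015) with hyperparameters `lr`, `betas`, and `eps`, and omitting extras such as weight decay and AMSGrad:

```python
import torch
from torch.optim import Optimizer


class Adam(Optimizer):
    """Minimal Adam optimizer, written as a torch.optim.Optimizer subclass."""

    def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
        defaults = dict(lr=lr, betas=betas, eps=eps)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()

        for group in self.param_groups:
            lr = group["lr"]
            beta1, beta2 = group["betas"]
            eps = group["eps"]

            for p in group["params"]:
                if p.grad is None:
                    continue
                grad = p.grad

                state = self.state[p]
                # Lazily initialize per-parameter state on the first step.
                if len(state) == 0:
                    state["step"] = 0
                    state["exp_avg"] = torch.zeros_like(p)      # first moment m_t
                    state["exp_avg_sq"] = torch.zeros_like(p)   # second moment v_t

                state["step"] += 1
                t = state["step"]
                m, v = state["exp_avg"], state["exp_avg_sq"]

                # Update biased moment estimates:
                # m_t = beta1 * m_{t-1} + (1 - beta1) * g_t
                # v_t = beta2 * v_{t-1} + (1 - beta2) * g_t^2
                m.mul_(beta1).add_(grad, alpha=1 - beta1)
                v.mul_(beta2).addcmul_(grad, grad, value=1 - beta2)

                # Bias-corrected estimates.
                m_hat = m / (1 - beta1 ** t)
                v_hat = v / (1 - beta2 ** t)

                # Parameter update: theta <- theta - lr * m_hat / (sqrt(v_hat) + eps)
                p.addcdiv_(m_hat, v_hat.sqrt().add_(eps), value=-lr)

        return loss
```

Usage mirrors the built-in optimizers: construct it with `Adam(model.parameters(), lr=1e-3)`, call `loss.backward()`, then `optimizer.step()` and `optimizer.zero_grad()` in the training loop.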
            
            
                