- **Custom `nn.Module` with a Non-standard Initialization**: Create a **custom `nn.Module`** for a simple feed-forward layer. Instead of the default PyTorch initialization, you'll apply a specific, non-standard initialization scheme. For example, you could...
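  A minimal sketch of one way this could look; the single linear layer and the uniform initialization scaled by a small constant are illustrative assumptions, not part of the prompt:

  ```python
  import torch
  import torch.nn as nn


  class CustomInitLinear(nn.Module):
      """A feed-forward layer whose weights use a non-standard init:
      uniform in [-init_scale, init_scale] instead of PyTorch's default."""

      def __init__(self, in_features, out_features, init_scale=0.05):
          super().__init__()
          self.weight = nn.Parameter(torch.empty(out_features, in_features))
          self.bias = nn.Parameter(torch.zeros(out_features))
          # Non-standard initialization applied explicitly at construction time
          nn.init.uniform_(self.weight, -init_scale, init_scale)

      def forward(self, x):
          return x @ self.weight.t() + self.bias


  layer = CustomInitLinear(16, 4)
  print(layer(torch.randn(2, 16)).shape)  # torch.Size([2, 4])
  ```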
- **Implementing a Custom Loss Function with `torch.autograd`**: Create a **custom loss function** that inherits from `torch.nn.Module` and performs a non-standard calculation. For example, a custom Huber loss. This loss is less sensitive to outliers than Mean...
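  One possible shape for such a Huber-style loss, with the threshold `delta` as an assumed hyperparameter; autograd differentiates through the branching automatically:

  ```python
  import torch
  import torch.nn as nn


  class CustomHuberLoss(nn.Module):
      """Quadratic for small errors, linear for large ones, which makes
      it less sensitive to outliers than plain MSE."""

      def __init__(self, delta=1.0):
          super().__init__()
          self.delta = delta

      def forward(self, pred, target):
          error = pred - target
          abs_error = error.abs()
          quadratic = 0.5 * error ** 2
          linear = self.delta * (abs_error - 0.5 * self.delta)
          return torch.where(abs_error <= self.delta, quadratic, linear).mean()


  loss_fn = CustomHuberLoss(delta=1.0)
  pred = torch.randn(8, requires_grad=True)
  target = torch.randn(8)
  loss = loss_fn(pred, target)
  loss.backward()  # gradients flow through torch.where without extra work
  ```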
- **Custom `DataLoader` for On-the-Fly Image Generation**: Create a **custom `torch.utils.data.Dataset`** that doesn't load data from disk. Instead, the `__getitem__` method should **generate** an image on the fly (e.g., a simple geometric shape, a random...
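  A sketch under the assumption that each generated image is a white square at a random position, with that position used as the label; the image size and square size are arbitrary:

  ```python
  import torch
  from torch.utils.data import Dataset, DataLoader


  class RandomSquareDataset(Dataset):
      """Generates each sample on the fly: a white square at a random
      position on a black background, labelled with that position."""

      def __init__(self, length=1000, size=32, square=6):
          self.length, self.size, self.square = length, size, square

      def __len__(self):
          return self.length

      def __getitem__(self, idx):
          img = torch.zeros(1, self.size, self.size)
          y, x = torch.randint(0, self.size - self.square, (2,)).tolist()
          img[:, y:y + self.square, x:x + self.square] = 1.0
          return img, torch.tensor([y, x], dtype=torch.float32)


  loader = DataLoader(RandomSquareDataset(), batch_size=16)
  images, targets = next(iter(loader))
  print(images.shape, targets.shape)  # torch.Size([16, 1, 32, 32]) torch.Size([16, 2])
  ```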
- **Implementing a Custom `nn.Module` for a Gated Recurrent Unit (GRU)**: Implement a **custom GRU cell** as a subclass of `torch.nn.Module`. Your implementation should handle the reset gate, update gate, and the new hidden state computation from scratch, using...
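  A possible implementation of the gating equations; packing the three gates into two stacked linear layers is just one convenient way to organize the weights:

  ```python
  import torch
  import torch.nn as nn


  class CustomGRUCell(nn.Module):
      """A GRU cell built from plain linear layers: reset gate r,
      update gate z, candidate state n, and the blended new hidden state."""

      def __init__(self, input_size, hidden_size):
          super().__init__()
          # Each map produces the stacked [reset | update | candidate] pre-activations.
          self.x2h = nn.Linear(input_size, 3 * hidden_size)
          self.h2h = nn.Linear(hidden_size, 3 * hidden_size)

      def forward(self, x, h):
          x_r, x_z, x_n = self.x2h(x).chunk(3, dim=-1)
          h_r, h_z, h_n = self.h2h(h).chunk(3, dim=-1)
          r = torch.sigmoid(x_r + h_r)   # reset gate
          z = torch.sigmoid(x_z + h_z)   # update gate
          n = torch.tanh(x_n + r * h_n)  # candidate hidden state
          return (1 - z) * n + z * h     # new hidden state


  cell = CustomGRUCell(input_size=10, hidden_size=20)
  h = torch.zeros(4, 20)
  for t in range(5):                     # unroll over a short random sequence
      h = cell(torch.randn(4, 10), h)
  print(h.shape)  # torch.Size([4, 20])
  ```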
- **Custom Data Augmentation Pipeline**: Create a **custom data augmentation pipeline** using PyTorch's `transforms`. For a given dataset (e.g., a custom image dataset), implement a series of transformations like random rotation,...
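  An illustrative pipeline assuming `torchvision` and Pillow are available; the specific transforms, their parameters, and the custom noise step are placeholder choices:

  ```python
  import torch
  from PIL import Image
  from torchvision import transforms


  class AddGaussianNoise:
      """A custom transform: adds zero-mean Gaussian noise to a tensor image."""

      def __init__(self, std=0.05):
          self.std = std

      def __call__(self, img):
          return img + torch.randn_like(img) * self.std


  # PIL-level transforms first, then tensor-level ones after ToTensor.
  augment = transforms.Compose([
      transforms.RandomRotation(degrees=15),
      transforms.RandomHorizontalFlip(p=0.5),
      transforms.ColorJitter(brightness=0.2, contrast=0.2),
      transforms.ToTensor(),
      AddGaussianNoise(std=0.05),
      transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),
  ])

  img = Image.new("RGB", (64, 64), color=(120, 60, 30))  # stand-in for a real image
  print(augment(img).shape)  # torch.Size([3, 64, 64])
  ```

  The same `augment` object can be passed as the `transform=` argument of a custom image `Dataset`, so augmentation happens per sample inside `__getitem__`.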
- **Implementing a Custom Learning Rate Scheduler**: Implement a **custom learning rate scheduler** that follows a cosine annealing schedule. The learning rate starts high and decreases smoothly to a minimum value, then resets and repeats. Your...
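  One way the scheduler could look, written as a plain class that rewrites the optimizer's `param_groups`; the cycle length and learning-rate bounds below are made up for the example:

  ```python
  import math
  import torch


  class CosineAnnealingWithRestarts:
      """Cosine annealing with periodic restarts: the learning rate decays
      smoothly from lr_max to lr_min over `period` steps, then jumps back up."""

      def __init__(self, optimizer, lr_max, lr_min, period):
          self.optimizer = optimizer
          self.lr_max, self.lr_min, self.period = lr_max, lr_min, period
          self.step_count = 0

      def step(self):
          t = self.step_count % self.period  # position inside the current cycle
          lr = self.lr_min + 0.5 * (self.lr_max - self.lr_min) * (
              1 + math.cos(math.pi * t / self.period)
          )
          for group in self.optimizer.param_groups:
              group["lr"] = lr
          self.step_count += 1


  model = torch.nn.Linear(4, 1)
  opt = torch.optim.SGD(model.parameters(), lr=0.1)
  sched = CosineAnnealingWithRestarts(opt, lr_max=0.1, lr_min=1e-4, period=50)
  for epoch in range(120):
      sched.step()  # call once per epoch (or per batch) after opt.step()
  ```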
- **Building a Custom `Dataset` and `DataLoader`**: Create a **custom `torch.utils.data.Dataset` class** to load a simple, non-image dataset (e.g., from a CSV file). The `__init__` method should read the data, `__len__` should return the total...
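  A sketch assuming a headerless, purely numeric CSV whose last column is the target; the file name is hypothetical:

  ```python
  import csv
  import torch
  from torch.utils.data import Dataset, DataLoader


  class CSVDataset(Dataset):
      """Reads a numeric CSV once in __init__; every column but the last
      is a feature and the last column is the target."""

      def __init__(self, csv_path):
          with open(csv_path, newline="") as f:
              rows = [list(map(float, row)) for row in csv.reader(f)]
          data = torch.tensor(rows, dtype=torch.float32)
          self.features, self.targets = data[:, :-1], data[:, -1]

      def __len__(self):
          return len(self.features)

      def __getitem__(self, idx):
          return self.features[idx], self.targets[idx]


  # dataset = CSVDataset("data.csv")  # hypothetical file path
  # loader = DataLoader(dataset, batch_size=32, shuffle=True)
  ```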
- **Implementing Layer Normalization from Scratch**: Implement **Layer Normalization** as a custom `torch.nn.Module`. Unlike `BatchNorm`, `LayerNorm` normalizes across the features of a single sample, not a batch. Your implementation should...
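  A possible from-scratch version that normalizes over the last (feature) dimension and applies a learnable scale and shift, checked against `nn.LayerNorm` for the same feature size:

  ```python
  import torch
  import torch.nn as nn


  class CustomLayerNorm(nn.Module):
      """Normalizes each sample across its features (last dim),
      then applies a learnable per-feature scale and shift."""

      def __init__(self, num_features, eps=1e-5):
          super().__init__()
          self.eps = eps
          self.gamma = nn.Parameter(torch.ones(num_features))
          self.beta = nn.Parameter(torch.zeros(num_features))

      def forward(self, x):
          mean = x.mean(dim=-1, keepdim=True)
          var = x.var(dim=-1, unbiased=False, keepdim=True)  # biased variance, as in nn.LayerNorm
          x_hat = (x - mean) / torch.sqrt(var + self.eps)
          return self.gamma * x_hat + self.beta


  x = torch.randn(4, 10)
  print(torch.allclose(CustomLayerNorm(10)(x), nn.LayerNorm(10)(x), atol=1e-5))  # True
  ```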