- **Custom `nn.Module` with a Non-standard Initialization**
  Create a **custom `nn.Module`** for a simple feed-forward layer. Instead of the default PyTorch initialization, you'll apply a specific, non-standard initialization scheme. For example, you could... (one possible scheme is sketched in the first example after this list).
- **Implementing a Simple Attention Mechanism**
  Implement a **simple attention mechanism** for a sequence-to-sequence model. Given a sequence of encoder outputs and a single decoder hidden state, your attention module should compute attention... (one possible layout is sketched in the second example after this list).
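A minimal sketch of the first exercise. The prompt leaves the exact scheme open, so the choice here (weights drawn from `U(-0.1, 0.1)`, bias fixed at `0.01`) and the class name `CustomLinear` are illustrative assumptions, not the required answer; the point is overriding `reset_parameters` with your own in-place initialization.

```python
import torch
import torch.nn as nn


class CustomLinear(nn.Module):
    """A simple feed-forward layer with a hand-rolled initialization."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.empty(out_features))
        self.reset_parameters()

    def reset_parameters(self) -> None:
        # Assumed, non-standard scheme for illustration only:
        # weights ~ U(-0.1, 0.1), bias set to a small constant.
        with torch.no_grad():
            self.weight.uniform_(-0.1, 0.1)
            self.bias.fill_(0.01)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x @ self.weight.t() + self.bias


layer = CustomLinear(16, 32)
out = layer(torch.randn(4, 16))  # -> shape (4, 32)
```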
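A minimal sketch of the second exercise. The prompt does not fix a scoring function or tensor layout, so this sketch assumes plain dot-product scoring and batch-first shapes (`encoder_outputs`: `(batch, src_len, hidden)`, `decoder_hidden`: `(batch, hidden)`); an additive (Bahdanau-style) scorer would be an equally valid choice.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DotProductAttention(nn.Module):
    """Scores each encoder output against the decoder hidden state,
    normalizes the scores with softmax, and returns a context vector."""

    def forward(self, decoder_hidden: torch.Tensor, encoder_outputs: torch.Tensor):
        # decoder_hidden:  (batch, hidden)
        # encoder_outputs: (batch, src_len, hidden)
        scores = torch.bmm(encoder_outputs,
                           decoder_hidden.unsqueeze(2)).squeeze(2)   # (batch, src_len)
        attn_weights = F.softmax(scores, dim=1)                       # (batch, src_len)
        context = torch.bmm(attn_weights.unsqueeze(1),
                            encoder_outputs).squeeze(1)               # (batch, hidden)
        return context, attn_weights


attn = DotProductAttention()
enc = torch.randn(4, 10, 256)      # encoder outputs for a 10-step source
dec = torch.randn(4, 256)          # current decoder hidden state
context, weights = attn(dec, enc)  # context: (4, 256), weights: (4, 10)
```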