ML Katas

Implementing a Simple VAE for Text (Sentence VAE)

hard (>1 hr) rnn vae generative text nlp

Implement a Variational Autoencoder (VAE) for text, often called a Sentence VAE. The encoder will be an RNN (e.g., a GRU) that outputs the parameters of a latent Gaussian distribution, and the decoder will be another RNN that generates a sequence of tokens from a sample drawn from that distribution. You'll need nn.Embedding for token representation.
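A minimal sketch of one possible model in PyTorch, assuming a toy vocabulary with a padding token at index 0; the hyperparameters (embed_dim, hidden_dim, latent_dim) and the choice to condition the decoder through its initial hidden state are illustrative, not prescribed by the kata:

```python
import torch
import torch.nn as nn

class SentenceVAE(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, latent_dim=32, pad_idx=0):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=pad_idx)
        # Encoder RNN summarizes the sentence into a single hidden state.
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        # Decoder RNN is conditioned on the latent code via its initial hidden state.
        self.latent_to_hidden = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def encode(self, tokens):
        _, h = self.encoder(self.embedding(tokens))   # h: (1, B, hidden_dim)
        h = h.squeeze(0)
        return self.to_mu(h), self.to_logvar(h)

    def reparameterize(self, mu, logvar):
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)       # z ~ N(mu, sigma^2)

    def decode(self, z, tokens):
        h0 = torch.tanh(self.latent_to_hidden(z)).unsqueeze(0)
        out, _ = self.decoder(self.embedding(tokens), h0)
        return self.out(out)                          # logits: (B, T, vocab_size)

    def forward(self, src, tgt_in):
        mu, logvar = self.encode(src)
        z = self.reparameterize(mu, logvar)
        return self.decode(z, tgt_in), mu, logvar

def vae_loss(logits, tgt_out, mu, logvar, pad_idx=0, kl_weight=1.0):
    # Reconstruction term: per-token cross-entropy, ignoring padding.
    recon = nn.functional.cross_entropy(
        logits.reshape(-1, logits.size(-1)), tgt_out.reshape(-1),
        ignore_index=pad_idx, reduction="sum")
    # KL divergence between q(z|x) and the standard normal prior.
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl_weight * kl, recon, kl
```

The decoder is fed teacher-forced inputs (the target shifted right to start with an SOS token), and kl_weight can be annealed from 0 toward 1 during training, a common trick for avoiding posterior collapse in Sentence VAEs.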

Verification: After training, sample from the latent space and use the decoder to generate new sentences. These sentences should be grammatically coherent and semantically plausible, reflecting the style of the training data. For example, if trained on a corpus of movie reviews, the generated sentences should resemble reviews.
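A hedged sketch of the verification step, assuming the SentenceVAE class above and hypothetical sos_idx/eos_idx values for the start- and end-of-sentence tokens: draw z from the standard normal prior and decode greedily, one token at a time.

```python
@torch.no_grad()
def sample_sentence(model, sos_idx=1, eos_idx=2, max_len=30, device="cpu"):
    model.eval()
    z = torch.randn(1, model.to_mu.out_features, device=device)  # z ~ N(0, I)
    h = torch.tanh(model.latent_to_hidden(z)).unsqueeze(0)
    token = torch.tensor([[sos_idx]], device=device)
    generated = []
    for _ in range(max_len):
        out, h = model.decoder(model.embedding(token), h)
        next_token = model.out(out[:, -1]).argmax(dim=-1)         # greedy decoding
        if next_token.item() == eos_idx:
            break
        generated.append(next_token.item())
        token = next_token.unsqueeze(0)
    return generated  # token ids; map back to words with your vocabulary
```

Sampling several z vectors and inspecting the decoded sentences, or interpolating between the latent codes of two training sentences, is a quick qualitative check that the latent space is being used.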