[P] Generating Words From Embeddings
I made a blog post a while back on Generating Words From Embeddings. It’s a simple project that aims to create new, meaningful-sounding words by generating them character by character, conditioned on a word embedding.
Now I’ve finally gotten around to making a simple Colab notebook that makes it easy to play around with the model and sample new words in a matter of minutes. I’d love to see what weird and interesting words you encounter when messing around with it!
Also, I made this quite a while back, so I only experimented with a simple decoder RNN (GRU/LSTM). Given how far NLP research has advanced since then, it might be worth trying other models (transformers, perhaps) to see whether they can generate qualitatively more pleasing words.
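The core idea can be sketched in a few lines. This is a minimal, untrained toy version, not the post’s actual implementation: it assumes the embedding is conditioned on by concatenating it to each character’s one-hot input at every GRU step, a boundary token `$` marks the end of a word, and decoding is greedy. All names (`GRUDecoder`, `sample_word`, the dimensions) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

CHARS = "abcdefghijklmnopqrstuvwxyz$"  # '$' doubles as start/end-of-word token
VOCAB, EMB_DIM, HID = len(CHARS), 8, 16

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUDecoder:
    """Tiny GRU that sees [char one-hot ; word embedding] at every step."""
    def __init__(self):
        in_dim = VOCAB + EMB_DIM
        s = 0.1
        self.Wz = rng.normal(0, s, (HID, in_dim + HID))  # update gate
        self.Wr = rng.normal(0, s, (HID, in_dim + HID))  # reset gate
        self.Wh = rng.normal(0, s, (HID, in_dim + HID))  # candidate state
        self.Wo = rng.normal(0, s, (VOCAB, HID))         # output projection

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)
        r = sigmoid(self.Wr @ xh)
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        h = (1 - z) * h + z * h_tilde
        return h, self.Wo @ h  # new hidden state, next-char logits

def sample_word(decoder, embedding, max_len=12):
    h = np.zeros(HID)
    ch = CHARS.index("$")  # begin from the word-boundary token
    out = []
    for _ in range(max_len):
        # conditioning: the word embedding is appended to every input
        x = np.concatenate([one_hot(ch, VOCAB), embedding])
        h, logits = decoder.step(x, h)
        ch = int(np.argmax(logits))  # greedy; a softmax sample works too
        if CHARS[ch] == "$":
            break
        out.append(CHARS[ch])
    return "".join(out)

emb = rng.normal(size=EMB_DIM)  # stand-in for a trained word embedding
print(sample_word(GRUDecoder(), emb))
```

With random weights the output is gibberish, of course; the point is only to show where the embedding enters the loop. Training would fit the four weight matrices on (embedding, spelling) pairs from a real vocabulary.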