[P] GlyphNet: Training neural networks to communicate with a visual language
[Image: visualization of glyphs generated by the network]

I ran an experiment over winter break to see what would happen if I trained two neural networks to communicate with each other in a noisy environment. The first network's task is to generate unique symbols, and the second's is to tell them apart. The result is a pretty cool visual language that looks kind of alien. Notably, I got the best results by dynamically increasing the noise parameters as the networks became more competent (pulling inspiration from Automatic Domain Randomization and POET). Please take a look and let me know what you think! https://github.com/noahtren/GlyphNet

submitted by /u/noahtren
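For anyone curious what "dynamically increasing the noise parameters" could look like in practice, here is a minimal sketch of an ADR-style noise curriculum: once the decoder reads the glyphs reliably, the channel noise is bumped up. All names, thresholds, and step sizes here are hypothetical illustrations, not the actual schedule used in the GlyphNet repo.

```python
class AdaptiveNoise:
    """Raise channel noise as the communicating networks improve,
    loosely inspired by Automatic Domain Randomization (ADR).
    A sketch with made-up defaults, not GlyphNet's actual code."""

    def __init__(self, start=0.05, step=0.05, max_noise=0.9, threshold=0.95):
        self.noise = start          # current noise level (e.g. std of additive noise)
        self.step = step            # how much to raise noise per promotion
        self.max_noise = max_noise  # upper bound on the noise level
        self.threshold = threshold  # decoder accuracy needed to raise noise

    def update(self, accuracy):
        # Only make the channel harder once the decoder is competent
        # at the current difficulty; otherwise keep the noise fixed.
        if accuracy >= self.threshold:
            self.noise = min(self.noise + self.step, self.max_noise)
        return self.noise

sched = AdaptiveNoise()
sched.update(0.80)  # below threshold: noise level stays put
sched.update(0.97)  # above threshold: noise level is raised one step
```

In a training loop, `sched.noise` would parameterize whatever corruption sits between the generator and the decoder (Gaussian noise, blur, dropout of pixels, etc.), so the "language" is forced to stay legible under progressively harsher conditions.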