[D] Most outlandish application of the Transformer architecture
I’m conducting some independent research on the effectiveness of Transformer-based models (attention mechanisms, GPT, BERT) on tasks outside the language domain. I’m curious: what’s the most outlandish implementation you’ve built, or the coolest cross-domain application you can think of? Let’s see how much we can Transform the Transformer!
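For anyone wondering what "outside the language domain" looks like in practice: the core mechanism is just scaled dot-product self-attention over a sequence of vectors, so the input doesn't have to be token embeddings at all. Here's a minimal numpy sketch (not any particular library's implementation; the weight matrices are randomly initialized for illustration) that runs self-attention over a toy sine-wave time series instead of text:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # x: (seq_len, d_model) -- any sequence of vectors, not just word embeddings
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])   # scaled dot-product
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ v                        # (seq_len, d_model)

rng = np.random.default_rng(0)
d = 8
# Toy non-language input: a sine wave projected into d dimensions
t = np.linspace(0, 2 * np.pi, 16)
x = np.sin(t)[:, None] * rng.normal(size=(1, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (16, 8)
```

Anything you can serialize into a sequence of vectors (sensor readings, image patches, protein residues, audio frames) can be fed through the same machinery, which is why the cross-domain applications get so creative.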
submitted by /u/ThatAi_guy