[D] Neural Computers for Text Generation

Has anyone ever applied neural-computer-based models, e.g. the Neural Turing Machine or the Differentiable Neural Computer, to text generation? In a brief search I couldn't find any such work.

In general these models should do a good job: they are essentially RNNs with a (theoretically) unlimited external memory to read from and write to, and they have indeed been shown to generalize better with respect to sequence length.
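To make the "RNN with external read/write memory" idea concrete, here is a minimal NumPy sketch of the content-based addressing used by NTM/DNC-style models: a controller emits a key, attention weights over memory rows are computed from cosine similarity, and reads/writes are soft, weighted operations. Function names and the sharpness parameter `beta` are illustrative, not from any particular implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def content_weights(memory, key, beta=5.0):
    """Attention over memory rows via sharpened cosine similarity."""
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    return softmax(beta * sims)

def read(memory, weights):
    """Soft read: weighted sum of memory rows."""
    return weights @ memory

def write(memory, weights, erase, add):
    """NTM-style write: erase then add, scaled by the attention weights."""
    memory = memory * (1.0 - np.outer(weights, erase))
    return memory + np.outer(weights, add)

# Toy usage: a 4-slot memory with 4-dimensional rows.
memory = np.eye(4)
w = content_weights(memory, key=memory[2], beta=50.0)  # focuses on row 2
r = read(memory, w)                                    # ~ row 2's contents
memory = write(memory, w, erase=np.ones(4), add=np.full(4, 0.5))
```

Because both addressing and the erase/add write are differentiable, the whole memory mechanism can be trained end-to-end with the controller by backpropagation, which is what distinguishes these models from an RNN bolted onto a lookup table.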

One reason such models haven't been applied to text generation yet could be that they are simply too complex and a burden to work with. It's not string-theory-level difficulty, of course, but the barrier to understanding and actually working with them is much higher than for the usual CNN or RNN. Combined with the fact that the NLP community is much smaller than the CV or RL communities, there is probably just a low chance that someone with the necessary skills and interest comes across these ideas.

submitted by /u/trashcoder