[D] Neural Computers for Text Generation
In general, these models should do a good job at this task: they are essentially RNNs augmented with a (theoretically) unbounded external memory they can read from and write to, and they have indeed been shown to generalize better with respect to sequence length.
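To make the idea concrete, here is a minimal sketch of that "RNN plus external memory" setup, not the actual DNC/NTM architecture (which also has allocation, temporal linkage, and multiple read heads) but just the core loop: an LSTM controller that reads from and writes to a memory matrix via content-based (softmax) addressing. All names, dimensions, and the interface layout are my own illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryAugmentedRNN(nn.Module):
    def __init__(self, input_size, hidden_size, mem_slots=32, mem_width=16):
        super().__init__()
        # Controller sees the current input plus what was read last step.
        self.controller = nn.LSTMCell(input_size + mem_width, hidden_size)
        # Interface: a write key, a write vector, and a read key per step.
        self.interface = nn.Linear(hidden_size, 3 * mem_width)
        self.output = nn.Linear(hidden_size + mem_width, input_size)
        self.mem_slots, self.mem_width = mem_slots, mem_width

    def forward(self, x):  # x: (batch, seq, input_size)
        b = x.size(0)
        h = torch.zeros(b, self.controller.hidden_size)
        c = torch.zeros_like(h)
        memory = torch.zeros(b, self.mem_slots, self.mem_width)
        read = torch.zeros(b, self.mem_width)
        outputs = []
        for t in range(x.size(1)):
            h, c = self.controller(torch.cat([x[:, t], read], dim=-1), (h, c))
            write_key, write_vec, read_key = self.interface(h).chunk(3, dim=-1)
            # Content-based write: soft attention decides where write_vec lands.
            w_attn = F.softmax(memory @ write_key.unsqueeze(-1), dim=1)
            memory = memory + w_attn * write_vec.unsqueeze(1)
            # Content-based read: weighted sum of memory rows.
            r_attn = F.softmax(memory @ read_key.unsqueeze(-1), dim=1)
            read = (r_attn * memory).sum(dim=1)
            outputs.append(self.output(torch.cat([h, read], dim=-1)))
        return torch.stack(outputs, dim=1)

# Toy usage: e.g. next-token prediction over embedded sequences.
model = MemoryAugmentedRNN(input_size=64, hidden_size=128)
y = model(torch.randn(8, 20, 64))  # -> (8, 20, 64)
```

The point is that the memory size is a hyperparameter decoupled from the number of parameters, which is where the better length generalization is supposed to come from.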
One reason I can see why such models haven't been applied to the problem of text generation yet is that they are simply too complex, which makes them a burden to work with. It's not string-theory-level difficulty, of course, but the barrier to understanding and actually working with them is much higher than for the usual CNN or RNN. Combined with the fact that the NLP community is much smaller than the CV or RL communities, there is probably just a low chance that someone with the necessary skills and interest comes across these ideas.