[D] The state of transfer learning in NLP
http://ruder.io/state-of-transfer-learning-in-nlp/
This blog post by Sebastian Ruder is a quick review of how natural language processing has benefited from transfer learning. He ties together how recent advances (pretrained models such as BERT, optimization schemes, multitask fine-tuning, and more) can work together to improve language modeling, and poses some open problems in the field. See also the (somewhat empty) HN discussion.
submitted by /u/jwuphysics