[P] PyTorch library of pre-trained NLP models has a new model to offer: RoBERTa
Hugging Face has released a new version of their open-source library of pre-trained transformer models for NLP: pytorch-transformers 1.1.0.
On top of the already-integrated architectures (Google's BERT, OpenAI's GPT and GPT-2, Google/CMU's Transformer-XL and XLNet, and Facebook's XLM), they have added Facebook's RoBERTa, which uses a slightly different pre-training approach than BERT while keeping the original model architecture.
The RoBERTa model achieves state-of-the-art results on the SuperGLUE benchmark.
pip install pytorch-transformers