[P] PyTorch library of pre-trained NLP models has a new model to offer: RoBERTa
Hugging Face has released a new version of their open-source library of pre-trained transformer models for NLP: pytorch-transformers 1.1.0.
On top of the already integrated architectures (Google's BERT, OpenAI's GPT and GPT-2, Google/CMU's Transformer-XL and XLNet, and Facebook's XLM), they have added Facebook's RoBERTa, which keeps the original BERT architecture but modifies the pre-training procedure: longer training on more data with larger batches, dynamic masking, and no next-sentence-prediction objective.
At the time of release, RoBERTa achieves state-of-the-art results on the SuperGLUE benchmark.
Install: pip install pytorch-transformers
Quickstart: https://huggingface.co/pytorch-transformers/quickstart.html
Release notes: https://github.com/huggingface/pytorch-transformers/releases/tag/1.1.0
Documentation: https://huggingface.co/pytorch-transformers/
submitted by /u/jikkii