[P] A library of pretrained models for NLP: Bert, GPT, GPT-2, Transformer-XL, XLNet, XLM
Hugging Face has released a new version of its open-source library of pretrained transformer models for NLP: PyTorch-Transformers 1.0 (formerly known as pytorch-pretrained-bert).
The library now comprises six architectures:
- Google’s BERT,
- OpenAI’s GPT & GPT-2,
- Google/CMU’s Transformer-XL & XLNet and
- Facebook’s XLM,
and a total of 27 pretrained model weights for these architectures (see the loading sketch below).
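To give a feel for the unified API, here is a minimal sketch, assuming pytorch-transformers 1.0 and the bert-base-uncased weights (downloaded on first use); it loads a tokenizer and model, encodes a sentence, and runs a forward pass:

```python
# Minimal sketch, assuming pytorch-transformers 1.0 and the
# 'bert-base-uncased' pretrained weights.
import torch
from pytorch_transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
model.eval()  # disable dropout for deterministic inference

# encode() maps text straight to vocabulary ids (unified tokenizer API).
input_ids = torch.tensor([tokenizer.encode("Here is some text to encode")])

with torch.no_grad():
    last_hidden_state = model(input_ids)[0]  # (batch, seq_len, hidden_size)
print(last_hidden_state.shape)
```

The same from_pretrained / tokenizer.encode pattern applies to the other five architectures, which is the point of the unified API.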
The library focuses on:
- being superfast to learn & use (almost no abstractions),
- providing SOTA example scripts as starting points (text classification with GLUE, question answering with SQuAD and text generation using GPT, GPT-2, Transformer-XL, XLNet); a simplified generation sketch follows below.
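As a rough illustration of generation outside the bundled scripts, here is a hedged sketch assuming the gpt2 weights; the greedy decoding loop is my own simplification (the example scripts use more careful sampling):

```python
# Hedged sketch: greedy next-token generation with GPT-2,
# assuming pytorch-transformers 1.0 and the 'gpt2' weights.
import torch
from pytorch_transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')
model.eval()

input_ids = torch.tensor([tokenizer.encode("The Transformer architecture")])

with torch.no_grad():
    for _ in range(20):  # extend the prompt by 20 greedy tokens
        logits = model(input_ids)[0]             # (batch, seq_len, vocab)
        next_id = logits[0, -1].argmax().unsqueeze(0).unsqueeze(0)
        input_ids = torch.cat([input_ids, next_id], dim=1)

print(tokenizer.decode(input_ids[0].tolist()))
```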
It also provides:
- a unified API for models and tokenizers,
- access to the hidden-states and attention weights,
- compatibility with TorchScript… (see the sketch below)
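A sketch of the last two points together — exposing hidden states and attention weights via config flags, and tracing for TorchScript. The output_hidden_states, output_attentions and torchscript flags follow the 1.0 documentation; treat the exact output-tuple ordering as an assumption:

```python
# Sketch of hidden-state/attention access and TorchScript tracing,
# assuming the config flags documented for the 1.0 release.
import torch
from pytorch_transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased',
                                  output_hidden_states=True,
                                  output_attentions=True)
model.eval()

input_ids = torch.tensor([tokenizer.encode("Attention is all you need")])
with torch.no_grad():
    outputs = model(input_ids)
# With the flags above, the output tuple is extended with all layer
# hidden states and all attention maps.
hidden_states, attentions = outputs[2], outputs[3]
print(len(hidden_states), len(attentions))  # embeddings + 12 layers; 12 layers

# TorchScript: reload with torchscript=True (makes the model traceable),
# then trace and save.
traced = torch.jit.trace(
    BertModel.from_pretrained('bert-base-uncased', torchscript=True).eval(),
    (input_ids,))
torch.jit.save(traced, 'bert_traced.pt')
```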
Install: pip install pytorch-transformers
Quickstart: https://github.com/huggingface/pytorch-transformers#quick-tour
Release notes: https://github.com/huggingface/pytorch-transformers/releases/tag/v1.0.0
Documentation (work in progress): https://huggingface.co/pytorch-transformers/
submitted by /u/Thomjazz