
[P] A library of pretrained models for NLP: Bert, GPT, GPT-2, Transformer-XL, XLNet, XLM

Hugging Face has released a new version of its open-source library of pretrained transformer models for NLP: PyTorch-Transformers 1.0 (formerly known as pytorch-pretrained-bert).

The library now comprises six architectures:

  • Google’s BERT,
  • OpenAI’s GPT & GPT-2,
  • Google/CMU’s Transformer-XL & XLNet and
  • Facebook’s XLM,

and a total of 27 pretrained model weights for these architectures.

The library focuses on:

  • being superfast to learn & use (almost no abstractions),
  • providing SOTA example scripts as starting points (text classification with GLUE, question answering with SQuAD, and text generation using GPT, GPT-2, Transformer-XL, and XLNet).

It also provides:

  • a unified API for models and tokenizers,
  • access to the hidden-states and attention weights,
  • compatibility with TorchScript…

Install: pip install pytorch-transformers

Quickstart: https://github.com/huggingface/pytorch-transformers#quick-tour

Release notes: https://github.com/huggingface/pytorch-transformers/releases/tag/v1.0.0

Documentation (work in progress): https://huggingface.co/pytorch-transformers/

submitted by /u/Thomjazz