
[N] HuggingFace releases Transformers 2.0, a library for state-of-the-art NLP in TensorFlow 2.0 and PyTorch

HuggingFace has just released Transformers 2.0, a library for Natural Language Processing in TensorFlow 2.0 and PyTorch that provides state-of-the-art pretrained models for the most recent NLP architectures (BERT, GPT-2, XLNet, RoBERTa, DistilBERT, XLM…), including several multilingual models.

An interesting feature is that the library provides deep interoperability between TensorFlow 2.0 and PyTorch.

You can move a full model seamlessly from one framework to the other during its lifetime (instead of just exporting a static computation graph at the end, as with ONNX). This makes it possible to get the best of both worlds by choosing the better-suited framework for each stage (training, evaluation, production): e.g., train on TPUs in TensorFlow, fine-tune and test in PyTorch, and finally deploy with TFX.
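As a hedged sketch of that round trip (the checkpoint directory name below is invented for the example; `bert-base-uncased` is one of the library's public pretrained checkpoints), the same weights can be saved from the TensorFlow side and reloaded on the PyTorch side with `from_tf=True`:

```python
from transformers import TFBertForSequenceClassification, BertForSequenceClassification

# Start in TensorFlow 2.0 (e.g. to train on TPUs)...
tf_model = TFBertForSequenceClassification.from_pretrained("bert-base-uncased")
tf_model.save_pretrained("./bert-tf-checkpoint")  # writes tf_model.h5 + config.json

# ...then load the very same weights into PyTorch for inspection and debugging.
pt_model = BertForSequenceClassification.from_pretrained(
    "./bert-tf-checkpoint", from_tf=True
)
```

No intermediate export format is involved: the PyTorch loader reads the TF checkpoint directly and maps the weights name-by-name, so the round trip can happen at any point in the model's lifetime, not just at the end.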

An example in the readme shows how BERT can be fine-tuned on GLUE in a few lines of code using the high-level tf.keras.Model.fit() API and then loaded in PyTorch for quick and easy inspection and debugging.

As TensorFlow and PyTorch are getting closer, this kind of deep interoperability between the two frameworks could become the norm for multi-backend libraries.

Repo: https://github.com/huggingface/transformers

submitted by /u/Thomjazz