
[P] Pytorch library of NLP pre-trained models has a new model to offer: RoBERTa

Huggingface has released a new version of their open-source library of pre-trained transformer models for NLP: pytorch-transformers 1.1.0.

On top of the already integrated architectures (Google's BERT, OpenAI's GPT and GPT-2, Google/CMU's Transformer-XL and XLNet, and Facebook's XLM), they have added Facebook's RoBERTa, which uses a slightly different pre-training approach than BERT while keeping the original model architecture.

The RoBERTa model gets SOTA results on SuperGLUE.

Install: pip install pytorch-transformers

Quickstart: https://huggingface.co/pytorch-transformers/quickstart.html

Release notes: https://github.com/huggingface/pytorch-transformers/releases/tag/1.1.0

Documentation: https://huggingface.co/pytorch-transformers/

submitted by /u/jikkii

[P] GPT-2 small fine-tuned on The Stig intros from Top Gear.

I was struck by how funny the Stig intros generated from this project by u/fsaifdiwq were yesterday. So I wanted to see if I could get GPT-2 to generate only Stig intros. Given the small size of the dataset, the model shows signs of overfitting, but many results are still novel and quite fun.

Some say that he is confused by clouds and thinks the wind is behind the clouds. All we know is hes called the Stig.

Some say he has the worlds largest collection of toe nails that he has harvested off his bee hive and that he has a full body tattoo that could make a black hole. All we know is hes called the Stig.

Some say that he once slipped on an orange peel, some say that he ate a porcelain mug, without any tea. All we know is hes called the Stig.

Some say he tastes exactly the same as leprechauns, and that his favourite pastime is crushing other peoples skulls with his glass of milk. All we know is hes called the Stig.

Some say he won the Nobel Peace Prize, and that hes convinced Jesus and The Lion King are The Lion King 1 and 2. All we know is hes called the Stig.

Some say, ”There is a white paper on the problem with trees being bad for him but not for his skin”. Others say his eyelids are white, and that in 2007 he was convicted of the theft of The Lamps. All we know is hes called the Stig.

Some say he once punched a lady in the face and he was confused until he realised people arent that good at sitting still; and that his favorite language is Chinese. All we know is hes called the Stig.

You can also seed it with a topic yourself and see how it finishes the introduction. Here I made it mention reddit:

Some say he uses reddit to his own detriment, by posting anti-Donald tweeters. All we know is hes called the Stig.

Some say he uses reddit to look at children’s books and that he is lucky starstruck with Justin Bieber. All we know is hes called the Stig.

Some say he uses reddit to look at nipples and that when he watches children on cell phone video he sees little to no reason to change them. All we know is hes called the Stig.

Some say reddit is his new bathroom partner, and that when he goes to the beach to have a bonfire. All we know is hes called the Stig.

Some say reddit is his social media following and that has not bought the entire worlds supply of earthworms for his new home planet on pluto just yet. All we know is hes called the Stig.

I hope you find them as amusing as I do!

submitted by /u/lilsmacky

[R] FoveaBox object detection code is available

https://github.com/taokong/FoveaBox

FoveaBox is an accurate, flexible, and completely anchor-free object detection framework, presented in our paper https://arxiv.org/abs/1904.03797. Unlike previous anchor-based methods, FoveaBox directly learns the probability that an object exists and the bounding box coordinates, without anchor references. This is achieved by (a) predicting category-sensitive semantic maps for the object existence probability, and (b) producing a category-agnostic bounding box for each position that potentially contains an object.
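The two per-position predictions can be sketched as a pair of convolutional heads. This is a simplified illustration in PyTorch, not the authors' actual code; the channel counts and class count are assumptions:

```python
import torch
import torch.nn as nn

class FoveaHeadSketch(nn.Module):
    """Simplified per-position prediction heads in the spirit of FoveaBox."""
    def __init__(self, in_channels=256, num_classes=80):
        super().__init__()
        # (a) category-sensitive semantic map: one existence score per class
        self.cls_head = nn.Conv2d(in_channels, num_classes, kernel_size=3, padding=1)
        # (b) category-agnostic box: 4 coordinates per spatial position
        self.box_head = nn.Conv2d(in_channels, 4, kernel_size=3, padding=1)

    def forward(self, feature_map):
        return self.cls_head(feature_map), self.box_head(feature_map)

head = FoveaHeadSketch()
feat = torch.randn(1, 256, 32, 32)       # one feature-pyramid level
cls_map, box_map = head(feat)
print(cls_map.shape, box_map.shape)
```

Because both heads predict densely at every spatial position, no anchor boxes need to be enumerated or matched.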

submitted by /u/taokongcn

[P] Pytorch Implementation of Autoregressive Language Model

A step-by-step tutorial on how to implement an autoregressive language model and adapt it to Wikipedia text.

Pre-trained BERT and XLNet models are publicly available! But for NLP beginners, they can be hard to use or adapt without a full understanding. For them, I cover the whole end-to-end implementation process for language modeling, using the unidirectional/bidirectional LSTM networks we already know.

  • does not use the torchtext library!
  • includes the trained model file and training logs

I hope this repo can be a good starting point for people who want to build their own language model 🙂

https://github.com/lyeoni/pretraining-for-language-understanding
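The core of an autoregressive LSTM language model is small; a minimal PyTorch sketch (the names and layer sizes are my own, not taken from the repo) might look like:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LSTMLanguageModel(nn.Module):
    """Autoregressive LM: at each position, predict the next token."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Unidirectional LSTM: position t only sees tokens up to t
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        hidden, _ = self.lstm(self.embed(tokens))
        return self.head(hidden)  # (batch, seq_len, vocab_size)

vocab_size = 1000
model = LSTMLanguageModel(vocab_size)
x = torch.randint(0, vocab_size, (2, 16))   # a toy batch of token ids
logits = model(x)

# Next-token objective: targets are the inputs shifted left by one
loss = F.cross_entropy(
    logits[:, :-1].reshape(-1, vocab_size),
    x[:, 1:].reshape(-1),
)
```

The shifted cross-entropy at the end is the standard autoregressive training objective: the prediction at position t is scored against the token at position t+1.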

submitted by /u/lyeoni

[P] CLI tool to run DL machines on AWS

I created an open-source tool to spin up deep learning EC2 machines with a single command. The goal is to make it easy to use EC2 machines for development without fiddling with the AWS Console or managing SSH keys.

90-second demo of the tool: https://www.youtube.com/watch?v=lXEeteH3-So

Link to the project: https://github.com/narenst/infinity

It takes less than a minute to set up and use your first Deep Learning machine. I would love to hear your feedback 🙂

submitted by /u/narenst


Toronto AI is a social and collaborative hub to unite AI innovators of Toronto and surrounding areas. We explore AI technologies in digital art and music, healthcare, marketing, fintech, vr, robotics and more. Toronto AI was founded by Dave MacDonald and Patrick O'Mara.