HuggingFace released DistilBERT, a distilled transformer model based on the BERT architecture: it has only 66 million parameters (instead of BERT's 110 million) while keeping 95% of BERT's performance on the GLUE benchmark.
They published a blog post detailing the distillation procedure, along with a hands-on walkthrough.
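For readers unfamiliar with the idea, knowledge distillation trains a small "student" model to match the softened output distribution of a larger "teacher". The sketch below shows a generic distillation loss in the style of Hinton et al.; it is an illustration only, not HuggingFace's exact training objective (their blog post describes a combined loss that also includes a masked language modeling term).

```python
import numpy as np

def softmax(logits, T=1.0):
    # Softmax with temperature T; higher T produces a softer distribution.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence from the softened teacher distribution to the
    # softened student distribution, scaled by T^2 so gradients keep
    # a comparable magnitude across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# The loss is zero when student and teacher agree, positive otherwise.
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(distillation_loss([3.0, 1.0, 0.0], [0.0, 1.0, 3.0]))
```

Minimizing this loss over a training corpus pushes the small student toward the teacher's behavior, which is how DistilBERT retains most of BERT's accuracy with far fewer parameters.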
The model is also available in their pytorch-transformers repository alongside seven other transformer models.
submitted by /u/jikkii