[P] Fitting (almost) any PyTorch module with just one line, including easy BERT fine-tuning

Hi everyone,

My name is Dima and I wanted to tell you about an open-source library we work on called TamnunML.

Our goal is to provide an easy-to-use library (with an sklearn-style interface) for complex model training and fine-tuning. For example, with tamnun you can train any PyTorch module like this:

```python
from torch import nn
from tamnun.core import TorchEstimator

module = nn.Linear(128, 2)
clf = TorchEstimator(module, task_type='classification').fit(train_X, train_y)
```
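
Since TorchEstimator exposes the usual sklearn fit/predict interface, the same one-liner should extend to deeper modules as well. Here is a minimal sketch, assuming random toy data and that the estimator accepts torch tensors directly (both the data and the multi-layer module below are illustrative assumptions, not taken from the snippet above):

```python
import torch
from torch import nn
from tamnun.core import TorchEstimator

# Toy data: 1000 random 128-dimensional examples with binary labels (purely illustrative).
train_X = torch.randn(1000, 128)
train_y = torch.randint(0, 2, (1000,))

# Any nn.Module whose input/output shapes match the data can be wrapped the same way.
module = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)

clf = TorchEstimator(module, task_type='classification').fit(train_X, train_y)
predicted = clf.predict(train_X)  # sklearn-style predict, as in the BERT example below
```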

Or you can fine-tune BERT on your task as easily as:

```python
from tamnun.bert import BertClassifier, BertVectorizer
from sklearn.pipeline import make_pipeline

clf = make_pipeline(BertVectorizer(), BertClassifier(num_of_classes=2)).fit(train_X, train_y)
predicted = clf.predict(test_X)
```

At the moment, tamnun supports training (almost) any PyTorch module using just a `fit` method, easy BERT fine-tuning, and model distillation.
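
The distillation support isn't shown in the snippets above, and its exact API is not reproduced here. As a rough illustration of the idea only, a hard-label variant of distillation can be approximated with the documented pieces alone: fine-tune the BERT pipeline as a teacher, let it label an unlabeled corpus, and fit a much smaller student on those predictions. In the sketch below, `unlabeled_texts` and the TF-IDF + logistic regression student are assumptions for illustration; this is not tamnun's built-in distillation API.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from tamnun.bert import BertClassifier, BertVectorizer

# Teacher: the BERT pipeline from the example above, trained on the labeled set.
teacher = make_pipeline(BertVectorizer(), BertClassifier(num_of_classes=2)).fit(train_X, train_y)

# Pseudo-label a (hypothetical) pool of raw, unlabeled texts with the teacher.
pseudo_labels = teacher.predict(unlabeled_texts)

# Student: a far cheaper model trained to imitate the teacher's predictions.
student = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
student.fit(unlabeled_texts, pseudo_labels)
```

Proper distillation usually trains the student on the teacher's soft outputs (logits) rather than hard labels, which transfers more information; the library's own distillation component is the place to look for that.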

You can read more about how to train (almost) any PyTorch module with tamnun here.

The library's GitHub page.

The introduction to TamnunML we published on our blog.

submitted by /u/sudo_su_