I’ve typically only performed transfer learning via fine-tuning with neural networks (e.g., image classifiers built on pre-trained MobileNet), but does the same idea hold for a model like logistic regression or a CRF? I’d argue yes, because you’re essentially just training a new model with non-randomized initial weights (a prior). But am I missing something?
I’m currently looking into cross-domain transfer learning for non-neural NER models, and I wanted to fine-tune the weights of a pre-trained CRF with some newly annotated user-generated data.
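One way to sketch this idea for logistic regression is scikit-learn's `warm_start` option, which reuses the coefficients from a previous `fit` call as the solver's starting point (the datasets and parameters below are made up for illustration). One caveat worth noting: because the logistic loss is convex, a fully converged fine-tune reaches the same optimum regardless of initialization, so the source-domain "prior" only matters if you limit iterations or add regularization pulling the weights toward the pre-trained ones.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Hypothetical source domain (plenty of data) and target domain (scarce data).
X_src, y_src = make_classification(n_samples=500, n_features=20, random_state=0)
X_tgt, y_tgt = make_classification(n_samples=50, n_features=20, random_state=1)

# warm_start=True makes subsequent fit() calls initialize the solver
# from the current coef_ instead of from scratch.
clf = LogisticRegression(warm_start=True, max_iter=200)

clf.fit(X_src, y_src)          # "pre-train" on the source domain
src_coef = clf.coef_.copy()    # snapshot the pre-trained weights

clf.fit(X_tgt, y_tgt)          # fine-tune: optimization starts from src_coef
```

A CRF trained with L-BFGS can be warm-started the same way in principle, though whether a given library (e.g. sklearn-crfsuite) exposes initial weights varies, so check the API before relying on it.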
submitted by /u/Lewba