[P] CURL: How to learn better sentence embeddings

Hi there,

I’ve been working on a project about SOTA methods for learning sentence embeddings. In particular, I took a look at QuickThoughts, which uses a word2vec-like objective that seeks to identify “related” sentences.
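To give a sense of what that objective looks like, here is a rough sketch in PyTorch. It isn't copied from my repo; the encoder sizes, GRU choice, and function names are just placeholders. The idea is that two encoders score sentence pairs by dot product, and the model is trained to pick out the neighbouring sentences within a batch of consecutive sentences:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentenceEncoder(nn.Module):
    """Toy sentence encoder: embed tokens, run a GRU, keep the final state."""
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=1000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, token_ids):               # token_ids: (batch, seq_len)
        _, h = self.gru(self.embed(token_ids))  # h: (1, batch, hidden_dim)
        return h.squeeze(0)                     # (batch, hidden_dim)

def quickthoughts_loss(enc_f, enc_g, token_ids):
    """Contrastive loss over a batch of consecutive sentences from one document."""
    f, g = enc_f(token_ids), enc_g(token_ids)   # "input" and "candidate" encodings
    scores = f @ g.t()                          # (batch, batch) dot-product scores
    batch = token_ids.size(0)
    # Mask self-scores so a sentence cannot be its own prediction target.
    scores = scores.masked_fill(torch.eye(batch, dtype=torch.bool), -1e9)

    # Uniform target over the previous and next sentence of each example.
    targets = torch.diag(torch.ones(batch - 1), 1) + torch.diag(torch.ones(batch - 1), -1)
    targets = targets / targets.sum(dim=1, keepdim=True)

    return -(targets * F.log_softmax(scores, dim=1)).sum(dim=1).mean()
```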

To that end, I’ve produced an open-source PyTorch implementation of QuickThoughts here: https://github.com/jcaip/quickthoughts

In addition, I tried to use the theoretical framework described by Arora et al. for contrastive unsupervised representation learning to examine the effect of changing the context_size of QuickThoughts, and to offer a possible modification that should increase performance. Unfortunately this was unsuccessful, but hopefully you’ll still find the work interesting.
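To make the context_size knob concrete, here is a toy helper (illustrative only, not from the repo or the post) that builds the target distribution over a batch of consecutive sentences for a given context_size. Larger values treat more neighbouring sentences as positives, which is the quantity the Arora et al. framework lets you reason about (more positives per latent class, but noisier ones):

```python
import torch

def context_targets(batch_size, context_size):
    """Hypothetical helper: uniform target over all sentences within
    `context_size` positions of each anchor sentence in the batch."""
    targets = torch.zeros(batch_size, batch_size)
    for offset in range(1, context_size + 1):
        targets += torch.diag(torch.ones(batch_size - offset), offset)
        targets += torch.diag(torch.ones(batch_size - offset), -offset)
    return targets / targets.sum(dim=1, keepdim=True)

print(context_targets(5, 1))  # context_size=1 reproduces the usual prev/next setup
print(context_targets(5, 2))  # context_size=2 also counts sentences two steps away
```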

Post: https://jcaip.github.io/Quickthoughts/

Please lmk if you have any questions/comments, thanks for reading!

submitted by /u/kingcai