As far as I understand, BERT can work as a kind of embedding, but a context-sensitive one.
Can I use pretrained BERT like a pretrained embedding layer in my model?
If so, what is the simplest way to do it?
In general, I want something like a context-sensitive replacement for the default char/word-level embeddings in my models.
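For illustration, here is a minimal sketch of the idea, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint (neither is named above, so treat both as assumptions): run tokens through a frozen pretrained BERT and take its last hidden states as context-sensitive token embeddings for a downstream model.

```python
# Sketch: using a frozen pretrained BERT as a contextual embedding layer.
# Assumes the Hugging Face transformers library and bert-base-uncased.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()  # freeze BERT; use it purely as a feature extractor

sentence = "The bank raised interest rates."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():  # no gradients needed while BERT stays frozen
    outputs = model(**inputs)

# One vector per (sub)word token, conditioned on the whole sentence:
# shape (batch_size, sequence_length, hidden_size).
token_embeddings = outputs.last_hidden_state
print(token_embeddings.shape)
```

These vectors can then be fed into a downstream model in place of a standard embedding lookup; one caveat is that BERT emits subword tokens, so they may need to be pooled back to word level first.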
submitted by /u/hadaev