
[D] Does BERT give word embeddings or sentence embeddings by default?

Hey all

Since BERT is a language model, do we obtain sentence embeddings or word embeddings from it by default?

I plan to use these embeddings for various NLP tasks such as sentence similarity, NMT, summarization, etc.

Also:

  • If it gives sentence-level embeddings by default, what is the process to get word embeddings? (Any reference might help here.)
  • If we obtain word embeddings, do we simply do mean/max pooling to get a sentence embedding, or are there better approaches? (See the sketch below.)
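
For reference, here is a minimal sketch of the mean-pooling approach asked about above. It assumes the Hugging Face transformers library and the bert-base-uncased checkpoint, neither of which is specified in the post; by default the model returns one hidden vector per token, and a sentence vector has to be pooled from them.

```python
# Sketch: token-level vs. mean-pooled sentence embeddings from BERT.
# Assumes Hugging Face `transformers` and the `bert-base-uncased`
# checkpoint; both are illustrative choices, not from the post.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentence = "BERT produces one embedding per token by default."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Word/subword level: one vector per token (includes [CLS] and [SEP]).
token_embeddings = outputs.last_hidden_state       # shape: (1, seq_len, 768)

# Sentence level via mean pooling over real tokens (mask out padding).
mask = inputs["attention_mask"].unsqueeze(-1)       # shape: (1, seq_len, 1)
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
# shape: (1, 768)
```

Pooling over the attention mask avoids averaging in padding vectors when sentences are batched; the [CLS] vector is another common sentence representation, though for an un-fine-tuned BERT it is often a weaker one than mean pooling.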

submitted by /u/amil123123