[D] Does BERT give word embeddings or sentence embeddings by default?
Hey all
Since BERT is a language model, do we obtain sentence embeddings or word embeddings by default?
I plan to use these embeddings for various NLP tasks like sentence similarity, NMT, summarization, etc.
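To make the question concrete, here's a minimal sketch of what BERT actually returns, assuming the HuggingFace transformers API (bert-base-uncased and the example sentence are just placeholders I picked, nothing canonical):

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT gives one vector per token.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dim vector per (sub)word token, i.e. word/token-level embeddings
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768])

# A single 768-dim vector: the [CLS] token passed through a pooler layer
print(outputs.pooler_output.shape)      # torch.Size([1, 768])
```

So as far as I can tell it exposes both token-level vectors and a pooled one, which is exactly why I'm confused about which counts as "the" embedding.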
Also:
- If it gives sentence-level embeddings by default, what is the process to get word embeddings? (Any reference might help here.)
- If we obtain word embeddings, do we just do mean/max pooling to get a sentence embedding, or are there better approaches? (See the pooling sketch below for what I mean.)
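Here's what I mean by mean pooling, continuing from the snippet above. The `mean_pool` helper is just a name I made up for illustration; the masking is so padding tokens don't skew the average:

```python
def mean_pool(last_hidden_state, attention_mask):
    # Expand the mask to the hidden dimension so padded positions contribute zero
    mask = attention_mask.unsqueeze(-1).float()     # (batch, seq_len, 1)
    summed = (last_hidden_state * mask).sum(dim=1)  # (batch, hidden)
    counts = mask.sum(dim=1).clamp(min=1e-9)        # (batch, 1), avoid div by zero
    return summed / counts                          # (batch, hidden)

sentence_embedding = mean_pool(outputs.last_hidden_state, inputs["attention_mask"])
print(sentence_embedding.shape)  # torch.Size([1, 768])
```

Is something like this reasonable, or is there a better way?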
submitted by /u/amil123123