[P] The age of transformers & Understanding text with BERT
This is a two-part blog post on a project that does question answering using a pretrained BERT.
The first part covers Transformers and the history leading up to this architecture. -> https://blog.scaleway.com/2019/building-a-machine-reading-comprehension-system-using-the-latest-advances-in-deep-learning-for-nlp/
The second part focuses on using a pretrained BERT (in PyTorch) for question answering. There's code, and you can easily try it on your own dataset 🙂
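For a quick taste of what extractive question answering with a pretrained BERT looks like, here's a minimal sketch using the Hugging Face `transformers` library (this is an illustration, not the blog's exact code; the checkpoint name is a publicly available distilled BERT fine-tuned on SQuAD, chosen here for size):

```python
# Minimal extractive QA sketch with a pretrained (distilled) BERT.
# Assumes: pip install transformers torch
from transformers import pipeline

# Load a QA pipeline; the model is a distilled BERT fine-tuned on SQuAD.
qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",
)

context = "Scaleway is a European cloud provider headquartered in Paris."
result = qa(question="Where is Scaleway headquartered?", context=context)

# result is a dict with 'answer', 'score', 'start', and 'end' keys.
print(result["answer"])
```

Swapping in your own dataset is mostly a matter of feeding your own `question`/`context` pairs to the pipeline.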