A simple tutorial on how to apply a pre-trained BERT model to a Korean QA task.
A pre-trained BERT model is publicly available!
The huggingface/pytorch-pretrained-BERT repository contains op-for-op PyTorch reimplementations, pre-trained models, and fine-tuning examples for Google’s BERT model. A BERT model fine-tuned with the default hyper-parameters and submitted to the KorQuAD leaderboard ranked 30th, with EM = 71.47 and F1 = 89.71.
This tutorial covers the process of fine-tuning BERT and submitting it for official evaluation on KorQuAD. Once your BERT model has been officially evaluated, its scores are added to the leaderboard.
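For an extractive QA benchmark like KorQuAD (or SQuAD, which it mirrors), the fine-tuned BERT head produces a start score and an end score for each token of the passage, and the predicted answer is the highest-scoring valid span. Below is a minimal sketch of that span-selection step in plain Python; the function name and the per-token scores are made-up illustrations, not output from an actual model:

```python
def best_span(start_scores, end_scores, max_span_len=30):
    """Return (start, end, score) of the highest-scoring valid answer span.

    A span (i, j) is valid when j >= i and it contains at most
    max_span_len tokens; its score is start_scores[i] + end_scores[j].
    """
    best = (0, 0, float("-inf"))
    for i, s in enumerate(start_scores):
        # Only consider end positions at or after the start, within the limit.
        for j in range(i, min(i + max_span_len, len(end_scores))):
            score = s + end_scores[j]
            if score > best[2]:
                best = (i, j, score)
    return best

# Hypothetical per-token scores for a 6-token passage.
start_scores = [0.1, 2.3, 0.2, 0.1, 1.0, 0.0]
end_scores   = [0.0, 0.3, 1.9, 2.5, 0.1, 0.2]

start, end, score = best_span(start_scores, end_scores)
print(start, end)  # tokens 1 through 3 form the best-scoring span
```

In the real fine-tuning examples this search runs over model logits and additionally skips spans that fall inside the question or special tokens, but the core argmax-over-valid-spans logic is the same.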