Do you know the SQuAD dataset? Try KorQuAD (the Korean SQuAD dataset) with pre-trained BERT!
A simple tutorial on how to apply a pre-trained BERT model to a Korean QA task.
A pre-trained BERT model is publicly available!
The huggingface/pytorch-pretrained-BERT repository contains op-for-op PyTorch reimplementations, pre-trained models, and fine-tuning examples for Google's BERT model. Submitting a BERT model fine-tuned with the default hyper-parameters, I ranked 30th with EM = 71.47 and F1 = 89.71 on the KorQuAD leaderboard.
In this post, I cover the process of fine-tuning BERT on KorQuAD and submitting the result for official evaluation. Once your BERT model has been evaluated officially, its scores are added to the leaderboard.
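As a rough sketch, fine-tuning on KorQuAD with the repository's `run_squad.py` script might look like the command below. The checkpoint name, file paths, and hyper-parameter values here are illustrative assumptions modeled on the repo's English SQuAD example, not the exact configuration behind the leaderboard submission:

```shell
# Hypothetical fine-tuning command based on the SQuAD example in
# huggingface/pytorch-pretrained-BERT. The KorQuAD JSON paths, the
# multilingual checkpoint name, and the hyper-parameters are assumptions.
python run_squad.py \
  --bert_model bert-base-multilingual-cased \
  --do_train \
  --do_predict \
  --train_file KorQuAD_v1.0_train.json \
  --predict_file KorQuAD_v1.0_dev.json \
  --learning_rate 3e-5 \
  --num_train_epochs 2 \
  --max_seq_length 384 \
  --doc_stride 128 \
  --output_dir /tmp/korquad_output
```

Since KorQuAD follows the SQuAD v1.0 JSON format, the SQuAD fine-tuning script can be reused largely as-is; the multilingual checkpoint is used because its vocabulary covers Korean.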