[P] I created a Transformer model package in TensorFlow 2.0 that is extensible and can be used to rebuild GPT-2, BERT, and XLNet.
pip install transformer-model
I recently took some time to build an extensible Transformer model in TF2, mostly for my own future use cases, but I thought I’d share it here and possibly get some feedback as well. I haven’t created many Python packages, so if something is missing or seems out of place, feel free to open an issue on the repo.
The goal of this project was to implement all of the core pieces of the Transformer model described in the “Attention Is All You Need” paper in a way that lets me reuse them to build newer, state-of-the-art models like BERT and XLNet. I’ve included instructions on how to use this package to train a Transformer model, and I’ll be publishing it to PyPI later today.
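For anyone curious what the core building block looks like, here is a minimal NumPy sketch of scaled dot-product attention, the central operation from “Attention Is All You Need.” This is illustrative math only, not the package’s actual API (which I haven’t shown here); the function name and mask convention are my own for this example.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Illustrative sketch of attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V.

    q, k, v: arrays of shape (..., seq_len, d_k).
    mask:    optional boolean array broadcastable to the score matrix;
             True means "attend", False means "block" (my convention here).
    """
    d_k = q.shape[-1]
    # Raw attention scores, scaled to keep softmax gradients stable.
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(d_k)
    if mask is not None:
        # Push blocked positions toward -inf so softmax gives them ~0 weight.
        scores = np.where(mask, scores, -1e9)
    # Numerically stable softmax over the last axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights
```

The package implements this (and multi-head attention, positional encodings, etc.) as TF2 layers, but the underlying computation is the same.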
My hope is that this package saves someone some dev time. If it does, please give the repo a star!