Since our recent release of Transformers (previously known as pytorch-pretrained-BERT and pytorch-transformers), we’ve been working on a comparison between the implementation of our models in PyTorch and in TensorFlow.
We’ve released a detailed report benchmarking each of the architectures hosted on our repository (BERT, GPT-2, DistilBERT, …) in PyTorch with and without TorchScript, and in TensorFlow with and without XLA. We benchmarked them for inference, and the results are available in the following spreadsheet.
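For context, here is a minimal sketch of the kind of PyTorch-side comparison described (not the released benchmark script): it loads a bert-base-uncased checkpoint with the `torchscript=True` flag, traces it with `torch.jit.trace`, and times eager inference against the traced model. The input text, number of runs, and timing helper are illustrative choices, not part of the report.

```python
# Minimal sketch: eager PyTorch vs. TorchScript-traced BERT inference timing.
import time

import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
inputs = tokenizer("Benchmarking Transformers inference.", return_tensors="pt")

# torchscript=True configures the model to return trace-friendly outputs.
model = BertModel.from_pretrained("bert-base-uncased", torchscript=True)
model.eval()

# Trace the model once with example inputs to get a TorchScript module.
traced = torch.jit.trace(model, (inputs["input_ids"], inputs["attention_mask"]))


def benchmark(fn, n_runs=100):
    # One warm-up call, then average wall-clock time over n_runs forward passes.
    with torch.no_grad():
        fn()
        start = time.time()
        for _ in range(n_runs):
            fn()
    return (time.time() - start) / n_runs


eager_time = benchmark(lambda: model(inputs["input_ids"], inputs["attention_mask"]))
traced_time = benchmark(lambda: traced(inputs["input_ids"], inputs["attention_mask"]))
print(f"eager:  {eager_time * 1e3:.2f} ms/batch")
print(f"traced: {traced_time * 1e3:.2f} ms/batch")
```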
We would love to hear your thoughts on the process.
submitted by /u/jikkii