In recent years I have heard about key advances made in NLP by transformer models and BERT. Then there was the paper on neural ODEs from the University of Toronto / Vector Institute group.
I have been wanting to dig deeper into the details and understand the key ideas behind these hot topics. They are not directly relevant to my work (which focuses mainly on images/videos), but I still feel it is important as an ML engineer to keep up to date with key developments in diverse areas. However, because these topics do not directly relate to my work, I find it hard to make the time to gain a deeper understanding of the papers.
Has anyone found themselves in a similar situation? How do you deal with it?
submitted by /u/nivter