[D] How do you keep up with latest advances in ML which are not directly relevant to your work?
In recent years I have heard about key advances in NLP driven by transformer models and BERT, and then there was the NeurIPS paper on neural ODEs.
I have been wanting to dig deeper into these hot topics and understand the key ideas behind them. They are not directly relevant to my work (which focuses mainly on images/videos), but I still feel it is important as an ML engineer to stay up to date with key developments across diverse areas. However, because these papers do not directly relate to my work, I find it hard to set aside the time to understand them deeply.
Has anyone been in a similar situation? How do you deal with it?