I wanted to share a new project I just stumbled upon from Facebook AI Research.
Implementing gradient-based hyperparameter optimization and meta-learning has always been hard because of non-differentiable optimizers and stateful, non-functional models. This library is supposed to make things easier by automatically replacing existing stateful models with stateless ones at run-time. It also implements differentiable versions of most of the torch.optim optimizers (although you can't use third-party optimizers out of the box).
This means that we can finally differentiate through the usual training loop code with very few changes!
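To make the idea concrete, here is a minimal pure-Python sketch of what "differentiating through a training step" means, on a 1-D quadratic toy problem. This is just an illustration of the concept, not the library's API: it computes the gradient of a validation loss with respect to the learning rate, by applying the chain rule through one SGD update, and checks the result against a finite difference.

```python
# Toy hypergradient: differentiate a validation loss w.r.t. the learning
# rate, *through* one SGD step on a 1-D quadratic.
# train loss: 0.5 * (w - a)^2  -> gradient = (w - a)
# val loss:   0.5 * (w - b)^2
# (Hypothetical toy example, not the library's actual API.)

def hypergrad_lr(w0, lr, a=2.0, b=3.0):
    """Analytic d(val_loss)/d(lr) through one SGD step."""
    g_train = w0 - a            # training gradient at w0
    w1 = w0 - lr * g_train      # one SGD step, treated as differentiable
    # chain rule: d(val)/d(lr) = (w1 - b) * d(w1)/d(lr) = (w1 - b) * (-g_train)
    return (w1 - b) * (-g_train)

def hypergrad_lr_fd(w0, lr, eps=1e-6, a=2.0, b=3.0):
    """Finite-difference check of the same hypergradient."""
    def val_after_step(lr_):
        w1 = w0 - lr_ * (w0 - a)
        return 0.5 * (w1 - b) ** 2
    return (val_after_step(lr + eps) - val_after_step(lr - eps)) / (2 * eps)

analytic = hypergrad_lr(w0=0.0, lr=0.1)
numeric = hypergrad_lr_fd(w0=0.0, lr=0.1)
```

In a real model, an autograd engine does the chain-rule step for you, but only if the optimizer update itself is differentiable and the model's parameters are handled functionally, which is exactly the gap this library is meant to close.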
I haven't tried the library myself, but it looks really easy to use and, judging from the stars, quite promising. Let me know what you think.
submitted by /u/rikkajounin