Hyperband is a state-of-the-art algorithm for hyperparameter tuning that focuses on resource efficiency by incorporating early stopping into its strategy. For more, see: http://www.argmin.net/2016/06/23/hyperband/

I was unable to find any great implementations of Hyperband, so I implemented it myself: https://gist.github.com/PetrochukM/2c5fae9daf0529ed589018c6353c9f7b The implementation is commented and documented to help ensure correctness and improve code readability.

I believe I improve on Hyperband by adding support for model checkpoints. The original Hyperband assumed each model was trained from scratch rather than resumed from a checkpoint; we don't need to retrain the same model with the same hyperparameters over and over again. Finally, I also explored other improvements to Hyperband, such as splitting the search space at the largest performance gap instead of halving it every time.

submitted by /u/Deepblue129
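To make the algorithm concrete, here is a minimal sketch of Hyperband's outer loop of brackets with successive halving inside each bracket, loosely following the pseudocode in the argmin post linked above. The function names `get_config` and `run_then_return_loss`, and the toy objective, are illustrative assumptions, not the gist's actual API:

```python
import math
import random

def hyperband(get_config, run_then_return_loss, max_iter=81, eta=3):
    """Sketch of Hyperband.

    get_config() -> a random hyperparameter configuration.
    run_then_return_loss(num_iters, config) -> validation loss after
    training `config` with `num_iters` units of resource.
    """
    s_max = int(math.log(max_iter) / math.log(eta))
    B = (s_max + 1) * max_iter          # approximate budget per bracket
    best = (float("inf"), None)          # (loss, config) seen so far

    # Each bracket trades off number of configs vs. resource per config.
    for s in reversed(range(s_max + 1)):
        n = int(math.ceil(B / max_iter / (s + 1) * eta ** s))  # configs
        r = max_iter * eta ** (-s)                             # initial resource
        configs = [get_config() for _ in range(n)]

        # Successive halving: train all, keep the best 1/eta, repeat.
        for i in range(s + 1):
            n_i = int(n * eta ** (-i))
            r_i = int(r * eta ** i)
            losses = [run_then_return_loss(r_i, c) for c in configs]
            ranked = sorted(zip(losses, configs), key=lambda p: p[0])
            if ranked and ranked[0][0] < best[0]:
                best = ranked[0]
            configs = [c for _, c in ranked[: max(1, n_i // eta)]]
    return best

# Toy usage: find x minimising (x - 0.3)^2; the resource argument is
# ignored here, whereas a real objective would train for `iters` steps.
random.seed(0)
loss, cfg = hyperband(
    get_config=lambda: random.uniform(0, 1),
    run_then_return_loss=lambda iters, x: (x - 0.3) ** 2,
)
```

In this view, the checkpointing improvement described above amounts to memoising the trained state per configuration, so that each successive-halving rung resumes a surviving model from its previous resource level instead of retraining it from scratch.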