[P] Feature Engineer Optimization in HyperparameterHunter 3.0

A full description of the new feature engineering optimization capabilities can be found in this Medium story.

TL;DR: HyperparameterHunter 3.0 adds support for feature engineering optimization. Define feature engineering steps as ordinary functions, and HyperparameterHunter records which steps each Experiment performs, so you can optimize them just like other hyperparameters and learn from past Experiments automatically.
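To make the idea concrete, here is a minimal, self-contained sketch of the concept, not HyperparameterHunter's actual API: feature engineering steps are plain functions, and the choice of step is searched over as a categorical hyperparameter alongside an ordinary numeric one. All names (`log_scale`, `run_experiment`, the toy scoring rule) are hypothetical stand-ins.

```python
import math
from itertools import product

# Hypothetical feature engineering steps, written as ordinary functions
# that map a feature dict to a transformed feature dict.
def log_scale(features):
    return {k: math.log1p(v) for k, v in features.items()}

def min_max(features):
    lo, hi = min(features.values()), max(features.values())
    span = (hi - lo) or 1.0
    return {k: (v - lo) / span for k, v in features.items()}

def identity(features):
    return dict(features)

# Treat the choice of feature step as a categorical hyperparameter and
# search over it alongside a normal numeric hyperparameter.
search_space = {
    "scaler": [log_scale, min_max, identity],  # feature step candidates
    "alpha": [0.1, 1.0, 10.0],
}

def run_experiment(scaler, alpha, features):
    transformed = scaler(features)
    # Stand-in "score": a real setup would fit and cross-validate a model.
    return sum(transformed.values()) / alpha

features = {"a": 1.0, "b": 4.0, "c": 9.0}
results = []
for scaler, alpha in product(search_space["scaler"], search_space["alpha"]):
    score = run_experiment(scaler, alpha, features)
    # Record the step *by name* so past Experiments stay comparable.
    results.append({"scaler": scaler.__name__, "alpha": alpha, "score": score})

best = max(results, key=lambda r: r["score"])
print(best["scaler"], best["alpha"])
```

Recording each step by function name, rather than by position in an ad-hoc pipeline, is what lets results from earlier runs be matched against a later search space.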

HyperparameterHunter provides scaffolding for ML experimentation and optimization. Run one-off Experiments or perform hyperparameter optimization, and HH automatically saves the model, hyperparameters, data, CV scheme, and now feature engineering steps, along with much more. Future optimization will scour your saved Experiments for those compatible with the current search space and use them to automatically jump-start learning.

  • Stop keeping janky lists of all your Experiments’ conditions and results
  • Ensure optimization actually has sufficient data to be useful
  • Let no Experiment be wasted
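The "jump-start" idea above can be sketched in a few lines. This is an illustrative assumption about how such reuse could work, not HyperparameterHunter's internals: saved Experiment records are filtered for compatibility with the current search space, and the survivors could seed the optimizer as initial observations.

```python
# Hypothetical saved Experiment records (real HH persists far more:
# model, data, CV scheme, feature engineering steps, etc.).
saved_experiments = [
    {"params": {"max_depth": 3, "learning_rate": 0.1}, "score": 0.81},
    {"params": {"max_depth": 7, "learning_rate": 0.3}, "score": 0.78},
    {"params": {"max_depth": 5, "n_estimators": 200}, "score": 0.84},  # different space
]

# Current search space: parameter name -> allowed values.
search_space = {"max_depth": [3, 5, 7], "learning_rate": [0.05, 0.1, 0.3]}

def compatible(experiment, space):
    """An Experiment can seed optimization only if it used exactly the
    parameters in the current space, with values the space allows."""
    params = experiment["params"]
    return (set(params) == set(space)
            and all(v in space[k] for k, v in params.items()))

seed_points = [e for e in saved_experiments if compatible(e, search_space)]
# These (params, score) pairs could be handed to the optimizer as its
# initial observations instead of starting the search from scratch.
print(len(seed_points))
```

Here the third record is rejected because it was run against a different parameter set, so only matching past work informs the new search.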

If you love HyperparameterHunter, I’d like to ask you for your support (yes, you, the attractive one reading this). Starring our GitHub repo, applauding the Medium story, and telling your friends (or enemies) about HyperparameterHunter would be very much appreciated!

If you’d like to do more and offer some feedback, open an issue, or contribute code, I would treasure the opportunity to learn from experts such as yourselves!

submitted by /u/HunterMcGushion