
[D] Hyperparam optimisation using RandomSearch with argparse scripts?

I tend to write complex model/training scripts in pure Python, using argparse to pass a large number of hyperparameters to the model, and then run these scripts on multi-GPU EC2 instances.
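For concreteness, here's a minimal sketch of the kind of script I mean (the flag names `--lr`, `--batch-size`, `--dropout` are just placeholders); it prints its metric to stdout so a driver process could scrape it back:

```python
# train.py -- minimal sketch of an argparse-driven training script.
# The flags (--lr, --batch-size, --dropout) are placeholder names.
import argparse


def main(argv=None):
    parser = argparse.ArgumentParser(description="toy training script")
    parser.add_argument("--lr", type=float, default=1e-3)
    parser.add_argument("--batch-size", type=int, default=32)
    parser.add_argument("--dropout", type=float, default=0.1)
    args = parser.parse_args(argv)

    # Stand-in for real training: compute a fake metric and print it
    # on stdout in a fixed format so a driver process can parse it.
    fake_loss = args.lr * args.batch_size * (1 - args.dropout)
    print(f"val_loss={fake_loss:.6f}")
    return fake_loss


if __name__ == "__main__":
    main()
```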

I was wondering if anyone knows of any tools that would allow me to do hyperparam optimisation by passing different sets of hyperparams to these scripts via the argparse/command-line interface. Imagine a process that generates hyperparam sets, kicks off a subprocess (the Python training script, with the hyperparams passed in on the command line), collects the metrics back, stores them, then kicks off the next set, and so on.

For basic grid search you could easily accomplish this with a Unix shell script, but for random search it's trickier. One possible solution would be to write a Python script that uses sklearn's ParameterSampler and the subprocess module to accomplish all this, but I was curious whether there's an existing solution out there I could use? Would hate to reinvent the wheel if this particular wheel already exists out there somewhere.
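To make that idea concrete, here's a rough sketch of what the ParameterSampler + subprocess driver might look like. It assumes a training script (here called `train.py`) that accepts hyperparams as `--flags` and prints a final line like `val_loss=0.1234`; those names and the output format are hypothetical, so adapt them to your own scripts:

```python
# driver.py -- sketch of a random-search driver over a command-line
# training script. Assumes train.py prints "val_loss=<float>" as its
# last stdout line (a made-up convention for this example).
import subprocess
import sys

from sklearn.model_selection import ParameterSampler


def build_cmd(script, params):
    """Turn a hyperparameter dict into an argparse-style command line."""
    cmd = [sys.executable, script]
    for name, value in sorted(params.items()):
        cmd += [f"--{name.replace('_', '-')}", str(value)]
    return cmd


def parse_metric(stdout):
    """Pull the metric off the last 'val_loss=<float>' line printed."""
    return float(stdout.strip().splitlines()[-1].split("=")[1])


def random_search(script, param_distributions, n_iter=20, seed=0):
    """Sample n_iter hyperparam sets, run the script once per set,
    and return (best_metric, best_params)."""
    results = []
    sampler = ParameterSampler(param_distributions, n_iter=n_iter,
                               random_state=seed)
    for params in sampler:
        proc = subprocess.run(build_cmd(script, params),
                              capture_output=True, text=True, check=True)
        results.append((parse_metric(proc.stdout), params))
    return min(results, key=lambda r: r[0])
```

Usage would be something like `random_search("train.py", {"lr": loguniform(1e-5, 1e-1), "batch_size": [16, 32, 64]})`, where continuous distributions come from `scipy.stats`. ParameterSampler accepts either scipy distributions (sampled) or lists (chosen uniformly), so the same driver covers both.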

Would greatly appreciate any help/tips.

submitted by /u/trias10