[D] Too many hyperparameters to tune, too little time

I’m working on a model with heaps of hyperparameters. It is infeasible to test all combinations, so I’ve come up with an attempt at tuning, but I don’t know whether the method is valid. Say I have hyperparameters A, B, and C, with 3, 4, and 5 options respectively. My plan is to set a baseline, say A:1, B:1, C:1. Then I vary the options of A while keeping B and C constant. Hypothetically, A:3, B:1, C:1 beats the initial baseline. Now I set A:3, B:1, C:1 as my new baseline and vary hyperparameter B. I repeat this process until all hyperparameters have been varied, then start over with A again. The assumption here is that the hyperparameters influence performance independently of one another, which I know is not true.
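For concreteness, here is a minimal Python sketch of the coordinate-wise procedure described above. The search space, the toy `train_and_evaluate` objective, and the number of sweeps are hypothetical placeholders, not anything from the original post; you would swap in your own training and validation code.

```python
def train_and_evaluate(config):
    # Stand-in for a real training + validation run (hypothetical).
    # A toy objective is used here only so the sketch runs end to end;
    # higher return values are treated as better.
    return -((config["A"] - 2) ** 2 + (config["B"] - 3) ** 2 + (config["C"] - 4) ** 2)

# Candidate options for each hyperparameter (3, 4, and 5 options, as in the post).
search_space = {
    "A": [1, 2, 3],
    "B": [1, 2, 3, 4],
    "C": [1, 2, 3, 4, 5],
}

def coordinate_search(search_space, n_sweeps=2):
    # Baseline: the first option of every hyperparameter (e.g. A:1, B:1, C:1).
    best = {name: options[0] for name, options in search_space.items()}
    best_score = train_and_evaluate(best)

    for _ in range(n_sweeps):                 # repeat the full sweep, starting over with A
        for name, options in search_space.items():
            for value in options:             # vary one hyperparameter, hold the rest fixed
                candidate = dict(best, **{name: value})
                score = train_and_evaluate(candidate)
                if score > best_score:        # the winner becomes the new baseline
                    best, best_score = candidate, score
    return best, best_score

if __name__ == "__main__":
    best, score = coordinate_search(search_space)
    print(best, score)
```

One sweep costs roughly 3 + 4 + 5 runs instead of the 3 × 4 × 5 of a full grid, which is what makes the approach attractive; how well it works depends on how strongly the hyperparameters interact.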

Can this method be seen as a genuine attempt at tuning? If so, does anyone know of any references where this or a similar tuning method is used? If not, is there a better method? Furthermore, I’d like to know how you deal with having a lot of hyperparameters.

submitted by /u/matigekunst