
[D] – Bayesian Optimization with dynamic number of parameters

Hi Everyone,

I would like to use Bayesian optimization to tune some hyperparameters. The problem is that the number of hyperparameters I want to tune is itself a hyperparameter. These hyperparameters are all of the same kind, i.e. they are all learning rates, but I am not sure how many learning rate hyperparameters I need.

So I want to tune both the number of hyperparameters n and the n hyperparameters themselves. I was thinking of just fixing a maximum number of hyperparameters, adding a parameter for the number of hyperparameters, and selectively using only the first n of them. Since the number of hyperparameters is an integer, I know this is not ideal. Is there a more principled approach? Would I need to implement a specialized kernel function for this?
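For what it's worth, the padding idea described above can be sketched like this. Everything here is hypothetical: `MAX_N` is an assumed upper bound, the quadratic `objective` is a stand-in for a real training run, and plain random search stands in for the BO loop (a real surrogate such as a GP would see the same fixed-length encoding, with the inactive dimensions simply having no effect on the score):

```python
import random

MAX_N = 5  # assumed upper bound on the number of learning rates


def sample_config(rng):
    # Fixed-dimension encoding: always sample MAX_N learning rates
    # (log-uniform in [1e-5, 1e-1]) plus an integer n; only the
    # first n entries are "active".
    n = rng.randint(1, MAX_N)
    lrs = [10 ** rng.uniform(-5, -1) for _ in range(MAX_N)]
    return {"n": n, "lrs": lrs}


def objective(config):
    # Hypothetical objective: only the first n learning rates
    # influence the score, so unused (padded) dimensions are
    # ignored even though they are always part of the vector.
    active = config["lrs"][: config["n"]]
    return sum((lr - 1e-3) ** 2 for lr in active)


rng = random.Random(0)
best = min((sample_config(rng) for _ in range(200)), key=objective)
```

One known downside of this encoding, as the post already hints, is that the surrogate wastes capacity modelling the inactive dimensions; frameworks with conditional search spaces handle this more cleanly.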

submitted by /u/ktessera