[D] – Bayesian Optimization with a dynamic number of parameters
Hi Everyone,
I would like to use Bayesian optimization to tune some hyperparameters. The problem is that the number of hyperparameters I want to tune is itself a hyperparameter. These hyperparameters all relate to the same thing, e.g. all of them are learning rates, but I am not sure how many learning rate hyperparameters I need.
So I want to tune both the number of hyperparameters n and the n hyperparameters themselves. I was thinking of just fixing a maximum number of hyperparameters, adding a parameter for the number of hyperparameters, and then selectively using only the first n of them (roughly the idea sketched below). Since the number of hyperparameters is an integer, I know this is not ideal. Is there a more principled approach? Would I need to implement a specialized kernel function for this?
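To make the idea concrete, here is a minimal sketch of the "suggest n, then only use n parameters" approach using Optuna's define-by-run API. Optuna is just my assumption here (its default TPE sampler handles conditional search spaces natively, though it is not a GP-based Bayesian optimizer), and the objective below is a stand-in for a real training run:

    # Minimal sketch: the number of learning rates is itself a tuned parameter.
    # Assumes Optuna; the objective is a toy stand-in for training + validation.
    import optuna

    MAX_N = 5  # assumed upper bound on the number of learning rates

    def objective(trial: optuna.Trial) -> float:
        # First suggest how many learning rates this trial uses...
        n = trial.suggest_int("n_learning_rates", 1, MAX_N)
        # ...then only suggest the n parameters that are actually active,
        # so inactive slots are never padded in at all.
        lrs = [trial.suggest_float(f"lr_{i}", 1e-5, 1e-1, log=True)
               for i in range(n)]
        # Stand-in: train a model with these learning rates and return
        # a validation metric to minimize.
        return sum((lr - 1e-3) ** 2 for lr in lrs)

    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=50)
    print(study.best_params)

One nice side effect of define-by-run is that the inactive lr_i parameters are simply never suggested in a given trial, which sidesteps the explicit max-length padding, but it still leaves my original question about a principled GP/kernel treatment open.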
submitted by /u/ktessera