But I get
ValueError: SkOpt does not support parameters of type Function with samplers of type NoneType
Alternatively, it would also achieve what I want if I could set up a nested space like this:
lda_param_grid = {
    "solver": tune.choice([
        {"solver": "svd"},
        {"solver": "lsqr", "shrinkage": tune.choice(["auto", tune.uniform(0, 1)])},
        {"solver": "eigen", "shrinkage": tune.choice(["auto", tune.uniform(0, 1)])},
    ])
}
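For reference, the conditional structure that nested space is trying to encode can be sketched in plain Python. This is a hypothetical sampling helper, not a Tune API; it just shows the intended dependency between solver and shrinkage:

```python
import random

def sample_lda_params(rng=random):
    """Sample one LDA config: pick a solver, then shrinkage only where valid."""
    solver = rng.choice(["svd", "lsqr", "eigen"])
    params = {"solver": solver}
    if solver in ("lsqr", "eigen"):
        # shrinkage is either the string "auto" or a float drawn from [0, 1]
        params["shrinkage"] = rng.choice(["auto", rng.uniform(0, 1)])
    return params
```

Every sample is internally consistent: "svd" configs never carry a shrinkage key, which is exactly what a flat search space cannot express.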
cc @Yard1 who is looking into improving conditional search spaces soon.
However, conditional search spaces are unlikely to work with custom searchers (such as BayesOpt, which you are using) anytime soon or ever. Can you try just passing search_optimization="random" instead?
I’m hoping for a “smarter” search method than random. I saw in the documentation other tune.suggest searchers can be passed for search_optimization=…, but when I try something like
from ray.tune.suggest.bayesopt import BayesOptSearch
…
search_optimization=BayesOptSearch
…
I get
AttributeError: type object 'BayesOptSearch' has no attribute 'lower'
Hey @rmelvin, as Kai pointed out, we do not really support conditional parameters yet, and adding support for them will be tricky (and we won't be able to add support for them to search algorithms that don't support them internally). In the meantime, I was thinking about adding support for the Optuna define-by-run API, which would allow you to specify your own function with conditions directly in the code. Would that be helpful in your use case? Optuna supports Bayesian optimization via TPE.