Nested and conditional search spaces in TuneSearchCV

Hello! I want to run a conditional or nested search space through TuneSearchCV. Here’s an example of what I’d like to do

import numpy as np
from ray import tune
from tune_sklearn import TuneSearchCV

lda_solver = tune.choice(["svd", "lsqr", "eigen"])
lda_shrinkage = tune.sample_from(
    lambda spec: None if spec.config.classify__solver == "svd"
    else np.random.uniform(0, 1)
)

lda_param_grid = {
    "classify__solver": lda_solver,
    "classify__shrinkage": lda_shrinkage,
}

tuned = TuneSearchCV(
    lda_pipe,
    param_distributions=lda_param_grid,
    n_trials=300,
    scoring="average_precision",
    max_iters=1,
    search_optimization="bayesian",
    n_jobs=-1,
    refit=False,
    cv=10,
    verbose=1,
    # loggers="tensorboard",
    random_state=42,
    use_gpu=False,
)
tuned.fit(X_train, y_train)

But I get
ValueError: SkOpt does not support parameters of type Function with samplers of type NoneType

Alternatively, it would also achieve what I want if I could set up a nested space like this:
lda_param_grid = {
    "solver": tune.choice([
        {"solver": "svd"},
        {"solver": "lsqr", "shrinkage": tune.choice(["auto", tune.uniform(0, 1)])},
        {"solver": "eigen", "shrinkage": tune.choice(["auto", tune.uniform(0, 1)])},
    ])
}

cc @Yard1 who is looking into improving conditional search spaces soon.

However, conditional search spaces are unlikely to work with custom searchers (such as the BayesOpt searcher you are using) anytime soon, if ever. Can you try just passing search_optimization="random" instead?
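
For example, keeping your pipeline and the conditional lda_param_grid from above exactly as they are, something along these lines (just a sketch, not tested against your data; whether random search resolves the tune.sample_from parameter is exactly what I am suggesting you try):

tuned = TuneSearchCV(
    lda_pipe,                            # pipeline from your first post
    param_distributions=lda_param_grid,  # includes the tune.sample_from conditional
    n_trials=300,
    scoring="average_precision",
    search_optimization="random",        # random search instead of "bayesian"
    cv=10,
    refit=False,
    random_state=42,
)
tuned.fit(X_train, y_train)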

I’m hoping for a “smarter” search method than random. I saw in the documentation that other tune.suggest searchers can be passed as search_optimization=…, but when I try something like
from ray.tune.suggest.bayesopt import BayesOptSearch

search_optimization=BayesOptSearch

I get
AttributeError: type object 'BayesOptSearch' has no attribute 'lower'

Hey @rmelvin, as Kai pointed out, we do not really support conditional parameters yet, and adding support for them will be tricky (we won’t be able to add support for them to search algorithms that don’t support them internally). In the meantime, I was thinking about adding support for the Optuna define-by-run API, which would allow you to specify your own function with conditions directly in the code. Would that be helpful in your use case? Optuna supports Bayesian optimization with TPE.
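
To give a rough idea of what define-by-run looks like, here is a sketch in plain Optuna (not wired into TuneSearchCV yet), using a bare LinearDiscriminantAnalysis instead of your pipeline for brevity and assuming the X_train/y_train from your earlier post; the shrinkage parameter is only suggested when the chosen solver supports it:

import optuna
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def objective(trial):
    # shrinkage only exists for the lsqr/eigen solvers, so it is suggested conditionally
    solver = trial.suggest_categorical("solver", ["svd", "lsqr", "eigen"])
    shrinkage = None if solver == "svd" else trial.suggest_float("shrinkage", 0.0, 1.0)
    clf = LinearDiscriminantAnalysis(solver=solver, shrinkage=shrinkage)
    return cross_val_score(clf, X_train, y_train,
                           scoring="average_precision", cv=10).mean()

study = optuna.create_study(direction="maximize")  # Optuna's default sampler is TPE
study.optimize(objective, n_trials=50)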

Hi @Yard1, that would help. My goal is to do a Bayesian search on a conditional search space. Thanks!