It will probably work. I was just curious to see if it could be done automatically without changing the `param_space`.
The way I currently use custom `hyperopt` spaces does not work with Tune out of the box. For example, assume the following conditional search space:
```python
from hyperopt import hp

space = {
    ##### Augmentation ######
    # Cutmix: sample beta/prob only when cutmix is enabled
    "cutmix": hp.choice(
        "ctmx",
        [
            {
                "cutmix": True,
                "cutmix_beta": 1.0,
                "cutmix_prob": hp.quniform("cutmix_prob", 0.10, 0.50, 0.05),
            },
            {"cutmix": False},
        ],
    ),
}
```
The configuration that Tune passes to workers looks like this:

```python
{"cutmix": {"cutmix": False}}
```
If your code is designed to receive `cutmix`, `cutmix_beta`, and `cutmix_prob` as top-level keys, this config will not work. In other words, you have to process spaces defined natively in Tune, which yield flat configs with only the expected keys, differently from spaces defined with `hyperopt`, which may yield nested dictionaries.
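For contrast, here is a minimal sketch of what a flat, Tune-native definition of the same parameters might look like. Note that it cannot express the conditional coupling (`cutmix_beta` and `cutmix_prob` would be sampled even when `cutmix` is `False`), which is why the `hyperopt` space is needed in the first place:

```python
from ray import tune

# Flat Tune-native space: every key appears at the top level of the
# config, but the conditional structure of the hyperopt space is lost.
param_space = {
    "cutmix": tune.choice([True, False]),
    "cutmix_beta": 1.0,
    "cutmix_prob": tune.quniform(0.10, 0.50, 0.05),
}
```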
So I have written a helper function that first flattens the nested dictionaries, so the worker can use the updated `config`:
```python
def hyperopt_to_ray(config):
    """Flatten nested dicts produced by a custom HyperOpt search space."""
    # Collect keys whose values are nested dicts (from hp.choice branches)
    outer_keys = [key for key in config if isinstance(config[key], dict)]
    # Replace each nested dict with its contents at the top level
    for key in outer_keys:
        inner_dict = config.pop(key)
        config.update(inner_dict)
    return config
```
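For example, applied to a config sampled from the branch where cutmix is enabled (values here are just illustrative), the helper lifts the nested keys to the top level:

```python
nested = {"cutmix": {"cutmix": True, "cutmix_beta": 1.0, "cutmix_prob": 0.25}}
flat = hyperopt_to_ray(nested)
print(flat)  # {"cutmix": True, "cutmix_beta": 1.0, "cutmix_prob": 0.25}
```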
Changing the `param_space` definition to something like `param_space = {"train_loop_config": param_space}` would make this whole process of dealing with spaces more complicated.
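For completeness, here is a minimal sketch of how the helper fits into a function trainable when the custom space is passed via `HyperOptSearch`. The `train_fn` body and the reported metric are placeholders, and the exact reporting call may differ across Ray versions:

```python
from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch

def train_fn(config):
    # Flatten the nested hyperopt branches before using the config
    config = hyperopt_to_ray(config)
    # config["cutmix"] is now a plain bool at the top level
    loss = 0.0  # placeholder for the real training loop
    tune.report({"loss": loss})

# The custom hyperopt space is given to the searcher, so no param_space
# (and no train_loop_config wrapping) is needed on the Tuner.
searcher = HyperOptSearch(space=space, metric="loss", mode="min")
tuner = tune.Tuner(
    train_fn,
    tune_config=tune.TuneConfig(search_alg=searcher, num_samples=10),
)
results = tuner.fit()
```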