I’ve been trying to run hyperparameter optimisation using Tune for RLlib, where many of the configuration parameters live in nested dictionaries inside the config. With both HyperOpt and BayesOpt this works at first, but after some number of trials both of them eventually crash. I’ve opened an issue on GitHub, but I also wanted to check whether it’s something I’m doing that is causing this. If I want to optimise nested hyperparameters like `config["exploration_config"]["initial_epsilon"]`, do I need to do anything special?
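For context, here is a stripped-down sketch of what I’m running (the env, metric, trainer, and value ranges are placeholders, not my real setup):

```python
from ray import tune
from ray.tune.suggest.hyperopt import HyperOptSearch

# Nested search space inside the RLlib config (values are placeholders).
config = {
    "env": "CartPole-v1",
    "lr": tune.loguniform(1e-5, 1e-2),
    "exploration_config": {
        # The nested hyperparameter I want to optimise.
        "initial_epsilon": tune.uniform(0.5, 1.0),
    },
}

tune.run(
    "DQN",
    config=config,
    search_alg=HyperOptSearch(metric="episode_reward_mean", mode="max"),
    num_samples=50,
)
```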
Re-posting Kai’s answer from the GitHub issue:

As a workaround for HyperOpt, try this:
```python
configs_to_try = [
    {
        "nested_config/lr": 0.008,
    }
]
```
HyperOpt internally flattens the config names, joining nested keys with "/". I guess we should do this automatically within Tune when preprocessing the configs to try.
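Put together with the nested search space above, the workaround would look roughly like the sketch below. I’m assuming `configs_to_try` is passed as `HyperOptSearch`’s `points_to_evaluate` argument; the trainer, metric, and values are again placeholders:

```python
from ray import tune
from ray.tune.suggest.hyperopt import HyperOptSearch

# Initial points use flattened "outer/inner" keys instead of nested dicts,
# matching how HyperOpt stores nested parameters internally.
configs_to_try = [
    {
        "lr": 0.008,
        "exploration_config/initial_epsilon": 0.9,
    }
]

search_alg = HyperOptSearch(
    metric="episode_reward_mean",
    mode="max",
    points_to_evaluate=configs_to_try,
)

tune.run(
    "DQN",
    config={
        "env": "CartPole-v1",
        "lr": tune.loguniform(1e-5, 1e-2),
        "exploration_config": {"initial_epsilon": tune.uniform(0.5, 1.0)},
    },
    search_alg=search_alg,
    num_samples=50,
)
```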
That does seem to fix it. Fantastic, thank you (and Kai).