TuneBOHB does not search

TuneBOHB keeps suggesting the same values.

from ray import tune
from ray.tune.schedulers import HyperBandForBOHB
from ray.tune.suggest.bohb import TuneBOHB


def my_trainable(config, checkpoint_dir=None):
    print(">>>>", config)

    def func(x, a, b, c, d):
        return (a + b + c + d)**2 + ((x - 5)**2) / 100

    for x in range(10):
        score = func(x, config["a"], config["b"], config["c"], config["d"])
        tune.report(score=score, epoch=x)


config = {
    "a": tune.uniform(-50, 50),
    "b": tune.uniform(-50, 50),
    "c": tune.uniform(-50, 50),
    "d": tune.uniform(-50, 50),
}

algo = TuneBOHB(metric="score", mode="min")
bohb = HyperBandForBOHB(
    time_attr="training_iteration",
    metric="score",
    mode="min",
    max_t=100,
)

tune.run(my_trainable, config=config, scheduler=bohb, search_alg=algo)

Here are some of the printed logs as an example:

(pid=15470) >>>> {'a': -43.98335462989923, 'b': -35.03938329680405, 'c': 18.273180510715463, 'd': -25.365756183735975}
(pid=15469) >>>> {'a': -43.98335462989923, 'b': -35.03938329680405, 'c': 18.273180510715463, 'd': -25.365756183735975}
(pid=15471) >>>> {'a': -43.98335462989923, 'b': -35.03938329680405, 'c': 18.273180510715463, 'd': -25.365756183735975}
(pid=15468) >>>> {'a': -43.98335462989923, 'b': -35.03938329680405, 'c': 18.273180510715463, 'd': -25.365756183735975}

I’m using ray==1.2.0.

algo.suggest("xxx") looks fine; it suggests a different config each time I call it. Using a ConfigurationSpace directly, as below, didn't work either (all processes got the same config).

import ConfigSpace as CS
from ray import tune
from ray.tune.schedulers import HyperBandForBOHB
from ray.tune.suggest.bohb import TuneBOHB


def my_trainable(config, checkpoint_dir=None):
    print(">>>>", config)

    def func(x, a, b, c, d):
        return (a + b + c + d)**2 + ((x - 5)**2) / 100

    for x in range(10):
        score = func(x, config["a"], config["b"], config["c"], config["d"])
        tune.report(score=score, epoch=x)


config_space = CS.ConfigurationSpace()
config_space.add_hyperparameter(CS.UniformFloatHyperparameter("a", lower=-50, upper=50))
config_space.add_hyperparameter(CS.UniformFloatHyperparameter("b", lower=-50, upper=50))
config_space.add_hyperparameter(CS.UniformFloatHyperparameter("c", lower=-50, upper=50))
config_space.add_hyperparameter(CS.UniformFloatHyperparameter("d", lower=-50, upper=50))

algo = TuneBOHB(config_space, metric="score", mode="min")
bohb = HyperBandForBOHB(
    time_attr="training_iteration",
    metric="score",
    mode="min",
    max_t=100,
)

tune.run(my_trainable, scheduler=bohb, search_alg=algo)

Hi @Multiwatts,

in your current code you are only running one sample at a time, since tune.run() defaults to num_samples=1. If you pass e.g. num_samples=10 into tune.run(), it should generate ten different configurations and start a trial for each.
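
For example, a minimal sketch of your first script with only the tune.run() call changed (ten samples is just an example value):

tune.run(
    my_trainable,
    config=config,
    scheduler=bohb,
    search_alg=algo,
    num_samples=10,  # generate 10 configurations instead of the default 1
)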

Most likely you'll also want to use a ConcurrencyLimiter to limit the number of trials running in parallel. This way the configurations for the later trials are informed by the results from the first trials.
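
Something like this (a sketch against the ray==1.2.0 API; max_concurrent=4 is an arbitrary choice):

from ray.tune.suggest import ConcurrencyLimiter

algo = ConcurrencyLimiter(
    TuneBOHB(metric="score", mode="min"),
    max_concurrent=4,  # at most 4 trials run at once, so later trials can use earlier results
)
tune.run(my_trainable, config=config, scheduler=bohb, search_alg=algo, num_samples=10)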

Usually search algorithms require you to set a random seed explicitly in order to get reproducible suggestions like the ones you are seeing. It might be that BOHB/ConfigSpace falls back to a fixed random seed when none is provided.
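
If you want to control this yourself, ConfigSpace lets you pass a seed to the ConfigurationSpace constructor. A sketch for your second script (seed=42 is arbitrary):

config_space = CS.ConfigurationSpace(seed=42)  # set an explicit seed instead of relying on the default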
