Enqueuing trial for PBT / specifying starting parameters / `points_to_evaluate`

TL;DR: Is it possible to enqueue a trial for PBT to start with? That is, tell PBT to start at least one trial with specified hyperparameters?

Context: I have a large space of hyperparameters, so I would like PBT to start off with a set of hyperparameters that work "pretty well" (as determined by previous experiments). When doing "classical" hyperparameter optimization with the Optuna backend, I can pass in `points_to_evaluate` as starting points. However, there doesn't seem to be a way of specifying "initial guesses" for good hyperparameters with PBT. (Yes, they would get mutated later anyway, but starting "in the dark" doesn't work very well for me.)

Hi @kemok,

You should be able to use the `BasicVariantGenerator` and pass in `points_to_evaluate` to seed samples with specific configurations.

Here’s an example:

from ray import tune
from ray.tune import Tuner, TuneConfig
from ray.tune.search.basic_variant import BasicVariantGenerator
from ray.air import RunConfig, FailureConfig  # ray.train in newer Ray versions

tuner = Tuner(
    train_func,  # your trainable function
    param_space={
        "lr": tune.loguniform(5e-4, 1e-2),
        "h0": tune.grid_search([0.0, 1.0]),
        "h1": tune.sample_from(lambda spec: 1. - spec.config["h0"]),
    },
    tune_config=TuneConfig(
        num_samples=2,
        metric="Q", mode="max",
        scheduler=pbt_scheduler,  # your PopulationBasedTraining instance
        reuse_actors=True,
        search_alg=BasicVariantGenerator(points_to_evaluate=[
            {"lr": 0.01, "h0": 0.9},
        ]),
    ),
    run_config=RunConfig(
        stop={"training_iteration": 70},
        failure_config=FailureConfig(max_failures=3),
    ),
)

I set `num_samples=2` here, but this will create 3 trials in total:

  1. The first sample will use the `lr=0.01, h0=0.9` I specified in `points_to_evaluate`. Since 0.9 is not in the `grid_search` set `[0.0, 1.0]`, you'll get a warning about it, which is fine if that value is intentional.
  2. The second sample will spawn 2 trials according to the `grid_search` over `h0`.

Let me know if this solves your problem!
