Grouped param space?

How severely does this issue affect your experience of using Ray?

  • Medium: It contributes significant difficulty to completing my task, but I can work around it.

Hello,

I’m basically trying to condition the existence of a hyperparameter A on the value of another hyperparameter B. Practically, hyperparameter B is the choice of optimizer, and hyperparameter A is the set of settings that configure that optimizer.

For instance, if hyperparameter A is the beta values, that would only apply to the Adam optimizer.

How can I write a configuration so that, if hyperparameter B is Adam, different values of betas are tried, but if it is not Adam, no values for betas are sampled at all? Instead, if hyperparameter B is SGD, for instance, I want to sample values for momentum.

Hi @lthiet,

You may want to check out Tune’s integration with Optuna, which lets you define a search space programmatically, including branching logic.

Take a look at the guide in the Ray docs here:

https://docs.ray.io/en/latest/tune/examples/optuna_example.html#conditional-search-spaces
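
For concreteness, here is a minimal sketch of that define-by-run pattern. The parameter names, ranges, and the toy loss are illustrative, and the exact reporting API differs slightly across Ray versions:

from ray import tune
from ray.tune.search.optuna import OptunaSearch

def train_fn(config):
    # Toy objective just so there is a metric to optimize; replace with
    # your real training loop. On older Ray versions, report via
    # ray.air.session.report instead.
    tune.report({"loss": config["lr"]})

def define_search_space(trial):
    # The branching happens here: betas are only suggested for Adam,
    # momentum only for SGD.
    optimizer = trial.suggest_categorical("optimizer", ["adam", "sgd"])
    if optimizer == "adam":
        trial.suggest_float("beta1", 0.8, 0.95)
        trial.suggest_float("beta2", 0.99, 0.9999)
    else:
        trial.suggest_float("momentum", 0.0, 0.99)
    trial.suggest_float("lr", 1e-4, 1e-1, log=True)

tuner = tune.Tuner(
    train_fn,
    tune_config=tune.TuneConfig(
        search_alg=OptunaSearch(space=define_search_space, metric="loss", mode="min"),
        num_samples=20,
    ),
)
results = tuner.fit()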

If you want to stay with Tune’s built-in search spaces, you can also handle this branching logic inside your training function:

from ray import tune
from ray.tune import Tuner
from torch.optim import SGD, Adam

def train_fn(config):
    # Branch on the sampled optimizer; config keys that a branch does not
    # use (e.g. "betas" for SGD) are simply ignored.
    if config["optimizer"] == "adam":
        optimizer = Adam(..., lr=config["lr"], betas=config["betas"])
    elif config["optimizer"] == "sgd":
        optimizer = SGD(..., lr=config["lr"], momentum=config["momentum"])
    # ...

tuner = Tuner(
    train_fn,
    param_space={
        "optimizer": tune.grid_search(["adam", "sgd"]),
        "lr": tune.uniform(...),
        "betas": tune.choice([[0.9, 0.99], [0.9, 0.999]]),
        "momentum": tune.uniform(...),
    },
)
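
One caveat with the snippet above: Tune still samples betas and momentum for every trial, even when the optimizer branch ignores them. If that bothers you, tune.sample_from lets a parameter depend on already-resolved config values. A sketch, assuming spec.config resolution as in the Tune docs examples (the ranges are illustrative):

import random

from ray import tune
from ray.tune import Tuner

param_space = {
    "optimizer": tune.grid_search(["adam", "sgd"]),
    "lr": tune.uniform(1e-4, 1e-1),
    # Only draw betas for Adam trials and momentum for SGD trials;
    # the other key resolves to None and is ignored by train_fn.
    "betas": tune.sample_from(
        lambda spec: random.choice([[0.9, 0.99], [0.9, 0.999]])
        if spec.config.optimizer == "adam"
        else None
    ),
    "momentum": tune.sample_from(
        lambda spec: random.uniform(0.0, 0.99)
        if spec.config.optimizer == "sgd"
        else None
    ),
}

tuner = Tuner(train_fn, param_space=param_space)

Still, for anything beyond a two-way branch, the Optuna define-by-run approach linked above tends to stay more readable.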