Conditional grid search for search space

Let's say there are two hyperparameters, a and b. I want to perform a search over the space [{"a": 1, "b": True}, {"a": 2, "b": True}, {"a": 2, "b": False}].

One way is to define the search space as {"a": grid_search([1, 2]), "b": grid_search([True, False])}. However, the combination a=1, b=False is not useful and wastes tuning time. Is it possible to search over only 3 trials in my case? Thanks!

Hi @Lin_Lan,

Try:
key: {"a": grid_search([1, 2]), "b": sample_from(lambda spec: spec.config.key.a == 1)}

You can find more information in the documentation here: custom-conditional-search-spaces
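One way to see why a conditional space cuts the search to exactly the useful combinations is to enumerate them directly. Below is a minimal plain-Python sketch (no Ray involved; the idea of wrapping the list of whole dicts with a single tune.grid_search entry and reading both values from that one key is an assumption on my part, not something from this thread):

```python
# Sketch (plain Python, no Ray): list only the three useful (a, b)
# combinations instead of expanding the full 2x2 grid.
# Assumption: in Ray Tune the analogous pattern would be
#   config = {"ab": tune.grid_search(pairs)}
# with the trainable reading config["ab"]["a"] and config["ab"]["b"].
pairs = [
    {"a": 1, "b": True},
    {"a": 2, "b": True},
    {"a": 2, "b": False},
]

def expand(pairs):
    """Return one trial config per explicitly listed combination."""
    return [dict(p) for p in pairs]

trials = expand(pairs)
print(len(trials))  # 3 trials, not the 4 a full grid would produce
```

The wasted a=1, b=False combination simply never appears in the list, so no trial is spent on it.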

Thanks @mannyv for your reply.

My real case is a little more complex. Currently, I use a for loop and call tune.run multiple times, as follows:

for xtype in ["a", "b", "c"]:
    config = {
        "param1": grid_search(....),
        "param2": grid_search(....),
        "param3": grid_search(....),
    }
    ...
    ...
    if xtype == "a":
        config["flag_1"] = grid_search([False])
        config["flag_2"] = grid_search([False])
    elif xtype == "b":
        config["flag_1"] = grid_search([True, False])
        config["flag_2"] = grid_search([True, False])
    elif xtype == "c":
        config["flag_1"] = grid_search([False])
        config["flag_2"] = grid_search([True, False])
    tune.run(config=config, ...)

Can we first merge the configurations generated for the different xtype values and then call tune.run only once?
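If a single flat trial list is the goal, the three branches above can be flattened ahead of time. Here is a plain-Python sketch (no Ray; the names FLAG_GRID and merged_flag_configs are mine, and feeding the resulting dicts through one tune.grid_search entry in a single tune.run call is an assumption, not confirmed Tune behavior):

```python
from itertools import product

# Flag values allowed for each xtype, taken from the if/elif branches above.
FLAG_GRID = {
    "a": {"flag_1": [False],       "flag_2": [False]},
    "b": {"flag_1": [True, False], "flag_2": [True, False]},
    "c": {"flag_1": [False],       "flag_2": [True, False]},
}

def merged_flag_configs():
    """Flatten the per-xtype grids into one list of combination dicts."""
    configs = []
    for xtype, grids in FLAG_GRID.items():
        for f1, f2 in product(grids["flag_1"], grids["flag_2"]):
            configs.append({"xtype": xtype, "flag_1": f1, "flag_2": f2})
    return configs

configs = merged_flag_configs()
print(len(configs))  # 1 + 4 + 2 = 7 combinations across all xtypes
```

Each dict in the merged list could then (hypothetically) become one value of a single grid_search key, alongside the param1/param2/param3 grids, so only one tune.run call is needed.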

Would it work for your case to adapt this example to create an experiment for each xtype, and then use Tune to run them all at once?

Each experiment could have a different config with grid_search and sample_from entries.
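To make that concrete, here is a hedged plain-Python sketch of building one spec per xtype. The {name: {"run": ..., "config": ...}} layout mirrors Tune's dict-of-experiments form, but treat the exact keys and the my_trainable name as illustrative assumptions rather than the definitive API:

```python
# Sketch: build one experiment spec per xtype so a single call can
# launch them all. Keys and the trainable name are placeholders.
def experiment_specs(xtypes=("a", "b", "c")):
    specs = {}
    for xtype in xtypes:
        specs[f"exp_{xtype}"] = {
            "run": "my_trainable",        # placeholder trainable name
            "config": {"xtype": xtype},   # per-xtype grid/sample entries go here
        }
    return specs

specs = experiment_specs()
print(sorted(specs))  # ['exp_a', 'exp_b', 'exp_c']
```

The per-xtype if/elif logic from the loop above would move into the "config" dict of each spec, so the branching happens once at spec-construction time instead of across multiple tune.run calls.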

Also, in your example above you do not need to wrap a single value in a grid_search; you can just set that key to the value directly. I sometimes do what you did if I want the value to show up in the trial name or the TensorBoard name.

Hi @mannyv, I think the approach above works in my case. I'd like to confirm whether the two experiments run sequentially or are merged.

For example, suppose the trials of exp_1 and exp_2 need 2 and 6 GPUs, respectively. On a server with 8 GPUs, will all the trials run in parallel, or will the trials of exp_2 start only after those of exp_1 finish?