Tune: Some hyperparameter formats are not well suited to search algorithms

How do you perform a hyperparameter search with RLlib?
One problem I face is the format of some config parameters.
For example, ‘fcnet_hiddens’ is an array with the size of each layer, but often we want the same size for every layer.
Maybe give the possibility to tune the number of layers and the size of each layer as independent parameters?
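For instance, something along these lines would already help. This is only a rough sketch using Tune's tune.sample_from (and as far as I know only the built-in variant generation can resolve it, not HyperOpt); the names layer_size and layer_nb are just made up for the example:

from ray import tune

search_space = {
    # Independent parameters we would actually like to search over.
    "layer_size": tune.choice([64, 128, 256]),
    "layer_nb": tune.choice([1, 2, 3]),
    "model": {
        # Derive the list that RLlib expects from the two scalars above.
        "fcnet_hiddens": tune.sample_from(
            lambda spec: [spec.config.layer_size] * spec.config.layer_nb
        ),
    },
}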

Also, the schedule configs are not suitable for some search algorithms.
HyperOpt struggles with the “lr_schedule” and “entropy_schedule” configs because they are matrices.
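As far as I understand, these schedules are lists of [timestep, value] pairs, so a flat search space cannot sample them directly. The best workaround I can imagine is to sample the endpoints as plain scalars and assemble the matrix from them, which again only works with the basic variant generation and not with HyperOpt (sketch; the 1e6 horizon is just an illustrative value):

from ray import tune

config = {
    # Sample the schedule endpoints as plain scalars...
    "lr_start": tune.loguniform(1e-5, 1e-3),
    "lr_end": tune.loguniform(1e-6, 1e-4),
    # ...and assemble the [timestep, value] matrix RLlib expects from them.
    "lr_schedule": tune.sample_from(
        lambda spec: [[0, spec.config.lr_start], [1000000, spec.config.lr_end]]
    ),
}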

What’s your take on it?


Thanks for the question. Let’s tag this as a Tune question (I have actually wondered about this problem myself :slight_smile: ).

Yes, it’s more correct to tag it as a tune question.

To overcome this, the only simple way I’ve found is to create config parameters for the layer size and the number of layers.
Then, in the training script, I transform them into the form RLlib expects and pop the custom keys from the config dictionary.
For the hyperparameter tuning itself I use WandB and create a sweep.
Here is some code to make it clearer:

from ray.rllib.agents.trainer import with_common_config

# Build the nested model config that RLlib expects from the two flat
# parameters "layer_size" and "layer_nb".
config["model"] = {
    "fcnet_activation": "tanh",
    "custom_model": "fc_masked_model",
    "fcnet_hiddens": [config["layer_size"] for _ in range(config["layer_nb"])],
}
config = with_common_config(config)

# Remove the helper keys before passing the config to the Trainer.
config.pop("layer_size", None)
config.pop("layer_nb", None)
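After that, the rest of the training script looks roughly like this (just a sketch: wandb.init() is assumed to have been called earlier so that the sweep parameters end up in config, and PPOTrainer with CartPole-v0 are only placeholders for my actual trainer and environment):

import wandb
from ray.rllib.agents.ppo import PPOTrainer

# The transformed config goes straight to an RLlib Trainer (outside of Tune),
# and the metric the sweep optimizes is logged back to WandB.
trainer = PPOTrainer(config=config, env="CartPole-v0")
for _ in range(100):
    result = trainer.train()
    wandb.log({"episode_reward_mean": result["episode_reward_mean"]})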

But it’s not possible to do this with Tune, as Tune runs the Policy directly :thinking:
