Adding custom exploration

Hi,

Is there a way to add a custom exploration without changing the RLlib code directly?

In the doc it is mentioned that it is possible to sub-class the built-in explorations to add custom behaviour. But I cannot find a way to register these like we can do it for custom models (i.e. ModelCatalog.register_custom_model("custom_model", CustomModel)).

Does that mean we can only use the custom exploration by adding it to ray.rllib.utils.exploration.__init__?

Thank you

Hi @gabmit ,

and welcome to the discussion board. The RLlib documentation describes this only briefly and without a direct example, but the idea is to subclass the Exploration class to create your customized exploration behavior. This class can then be used in the exploration configuration of your Trainer.
Take as an example the EpsilonGreedy exploration class.
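To make the idea concrete without pulling in RLlib, here is a minimal, self-contained sketch of the pattern: subclass an existing exploration class and override its action-sampling hook. All class and method names below are illustrative stand-ins; in RLlib itself you would subclass `ray.rllib.utils.exploration.Exploration` (or `EpsilonGreedy`), whose real `get_exploration_action` signature differs from this toy version.

```python
import random


class EpsilonGreedyBase:
    """Illustrative stand-in for a built-in epsilon-greedy exploration class."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon

    def get_exploration_action(self, greedy_action, action_space, explore=True):
        # With probability epsilon, pick a random action; otherwise act greedily.
        if explore and random.random() < self.epsilon:
            return random.choice(action_space)
        return greedy_action


class DecayingEpsilonGreedy(EpsilonGreedyBase):
    """Custom exploration: epsilon is annealed each time an action is sampled."""

    def __init__(self, epsilon=1.0, decay=0.99):
        super().__init__(epsilon)
        self.decay = decay

    def get_exploration_action(self, greedy_action, action_space, explore=True):
        action = super().get_exploration_action(greedy_action, action_space, explore)
        self.epsilon *= self.decay  # reduce exploration over time
        return action
```

The subclass only changes the behavior it cares about (here, annealing epsilon) and delegates the rest to the parent, which is exactly the pattern the docs suggest for the real `EpsilonGreedy`.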

Hi @Lars_Simon_Zehnder, thanks for your reply.

Actually, my question was more about how to register my custom exploration to use it. I understand how to write a new custom exploration class, but then how can I add that to the trainer?

You add your exploration class to the Trainer config by setting up an exploration_config:

"exploration_config": {
     "type": "MyCustomExplorationClass",
},

But that won’t work since the trainer doesn’t know where to look for my “MyCustomExplorationClass”

If the exploration class is not registered with RLlib, you must provide the class itself, as is shown in the MBMPO algorithm:

from mycustomexploration import MyCustomExplorationClass
"exploration_config": {
     "type": MyCustomExplorationClass,
},
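To illustrate why passing the class object works, here is a small, RLlib-free sketch of the dispatch pattern such config resolution typically uses: a string is looked up in a registry of built-ins, while a class object is used directly and therefore needs no registration. The function and registry names are hypothetical, not RLlib's actual internals.

```python
# Hypothetical registry of built-in exploration classes, keyed by string name.
BUILT_IN_EXPLORATIONS = {}


def register_exploration(name, cls):
    BUILT_IN_EXPLORATIONS[name] = cls


def resolve_exploration(type_spec, **kwargs):
    """Accept either a registered string name or a class object as "type"."""
    if isinstance(type_spec, str):
        # String: must already be registered, else this raises KeyError.
        cls = BUILT_IN_EXPLORATIONS[type_spec]
    else:
        # Class object: no registration needed at all.
        cls = type_spec
    return cls(**kwargs)


class MyCustomExplorationClass:
    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon


# Passing the class itself sidesteps the registry entirely:
expl = resolve_exploration(MyCustomExplorationClass, epsilon=0.5)
```

This mirrors the situation in the thread: the string `"MyCustomExplorationClass"` fails because it was never registered, while the imported class object resolves directly.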

Oh I wasn’t aware that it is possible to provide the class directly. That solves my problem, thank you so much!
