on_trainer_init from DefaultCallbacks is not called with RLlib and Tune

How severely does this issue affect your experience of using Ray?

  • Low: It annoys or frustrates me for a moment.

I have a callback ExcelWriterCallback(DefaultCallbacks) that inherits from DefaultCallbacks (imported via from ray.rllib.agents.callbacks import DefaultCallbacks).

I override on_trainer_init(...) so that I can create the Excel file once with some columns, and then in on_episode_end(...) append a new row of data to the Excel file each time an episode finishes.
However, on_trainer_init(...) never gets called. I run training with tune.run(SimpleQTrainer, ...).

The way I do it now is to use a LoggerCallback's log_trial_start(...) to initialize the Excel sheet and DefaultCallbacks to fill in the data, but this is not so clean. I need access to my custom gym environment for the data, which is why I use DefaultCallbacks in the first place.

Why does on_trainer_init(...) not get called, and is there a cleaner way to initialize an Excel sheet once per run?

Update: I use the MultiCallbacks functionality, and I discovered that as of Ray 1.12.1 MultiCallbacks does not forward on_trainer_init(...), but this seems to be fixed on the master branch: [RLlib] Ensure `MultiCallbacks` always implements all callback methods by XuehaiPan · Pull Request #24011 · ray-project/ray · GitHub
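Until that fix lands in a release, one possible workaround (a sketch with placeholder method bodies and a stub for the RLlib base class) is to merge the hooks into a single DefaultCallbacks subclass, so no hook is routed through the MultiCallbacks wrapper at all:

```python
class DefaultCallbacks:
    # Stub standing in for ray.rllib.agents.callbacks.DefaultCallbacks.
    pass


class MergedCallbacks(DefaultCallbacks):
    """One callback class combining the hooks that MultiCallbacks would
    otherwise dispatch to separate callback objects."""

    def on_trainer_init(self, *, trainer=None, **kwargs):
        self.initialized = True  # placeholder for the one-time sheet setup

    def on_episode_end(self, *, episode=None, **kwargs):
        self.last_episode = episode  # placeholder for the per-episode row append
```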
