How severely does this issue affect your experience of using Ray?
- Low: It annoys or frustrates me for a moment.
I have a callback class `ExcelWriterCallback(DefaultCallbacks)` that inherits from `DefaultCallbacks` (imported via `from ray.rllib.agents.callbacks import DefaultCallbacks`).
I override `on_trainer_init(...)` so that I can create the Excel file once with some column headers, and `on_episode_end(...)` so that a new row of data gets appended to the file after every episode. The problem: `on_trainer_init(...)` never gets called.
The way I do it now is to use `log_trial_start(...)` (a Tune logger callback hook) to initialize the Excel sheet and the `DefaultCallbacks` hooks to fill in the data, but this is not so clean. I also need access to my custom gym environment for the data, which is why I use the RLlib callbacks in the first place. Why does `on_trainer_init(...)` not get called, and is there a cleaner way to initialize an Excel sheet once per run?
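One thing worth checking: in newer Ray versions the `Trainer` class was renamed to `Algorithm`, and the corresponding hook is `on_algorithm_init(...)`; if your installed RLlib version no longer defines `on_trainer_init`, your override would simply never fire. Below is a minimal sketch of the pattern I am describing. It uses a stand-in base class and a CSV file instead of a real Excel sheet so it runs without Ray installed; the hook names and the episode dict are placeholders for whatever your RLlib version actually passes:

```python
import csv
import os
import tempfile


class DefaultCallbacks:
    """Stand-in for RLlib's DefaultCallbacks so the sketch is self-contained.
    In real code, inherit from ray.rllib.agents.callbacks.DefaultCallbacks
    (or ray.rllib.algorithms.callbacks.DefaultCallbacks on newer versions)."""

    def on_trainer_init(self, **kwargs):
        pass

    def on_episode_end(self, **kwargs):
        pass


class ExcelWriterCallback(DefaultCallbacks):
    """Create the sheet once per run, then append one row per episode."""

    def __init__(self, path):
        super().__init__()
        self.path = path

    def on_trainer_init(self, **kwargs):
        # Called once at startup: write the header row.
        with open(self.path, "w", newline="") as f:
            csv.writer(f).writerow(["episode_id", "reward"])

    def on_episode_end(self, *, episode=None, **kwargs):
        # Called after every episode: append one data row.
        with open(self.path, "a", newline="") as f:
            csv.writer(f).writerow([episode["id"], episode["reward"]])


# Simulate the calls RLlib would make during a run.
path = os.path.join(tempfile.mkdtemp(), "results.csv")
cb = ExcelWriterCallback(path)
cb.on_trainer_init()
cb.on_episode_end(episode={"id": 0, "reward": 1.5})
cb.on_episode_end(episode={"id": 1, "reward": 2.0})

with open(path) as f:
    rows = list(csv.reader(f))
print(rows)
```

In a real config the class would be passed as `config["callbacks"] = ExcelWriterCallback`, so constructor arguments like `path` would instead need to come from the config or a module-level constant.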