RLlib callback breaks with missing argument

Hi there,

I am trying to use the custom callback functionality, but I keep running into a strange error:

on_episode_step() missing 1 required keyword-only argument: 'policies'

Here’s my code:

import ray
import ray.rllib.agents.sac as sac
from ray.rllib.agents.callbacks import DefaultCallbacks


class MyCallbacks(DefaultCallbacks):
    # taken from the example at: https://github.com/ray-project/ray/blob/master/rllib/examples/custom_metrics_and_callbacks.py
    def on_episode_step(self, *, worker, base_env, policies,
                        episode, env_index, **kwargs):
        # Make sure this episode is ongoing.
        assert episode.length > 0, \
            "ERROR: `on_episode_step()` callback should not be called right " \
            "after env reset!"
        print('test')

ray.init()

config = sac.DEFAULT_CONFIG.copy()
config["callbacks"] = MyCallbacks

trainer = sac.SACTrainer(config=config, env="LunarLanderContinuous-v2")

for i in range(5):
    # Perform one iteration of training the policy with SAC.
    result = trainer.train()

It seems like the callback is not called with the right arguments (i.e. it is called without providing the policies argument), which is very strange to me…
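
For reference, here is a sketch of the kind of workaround I have in mind (unverified, and the class name TolerantCallbacks is just for illustration): if my installed Ray version simply does not pass policies to on_episode_step() yet, giving the argument a default value should at least avoid the TypeError:

from ray.rllib.agents.callbacks import DefaultCallbacks


class TolerantCallbacks(DefaultCallbacks):
    # Sketch of a possible workaround (unverified): give `policies` (and
    # `env_index`) default values so the callback also works if the installed
    # Ray version does not pass those arguments to `on_episode_step()`.
    def on_episode_step(self, *, worker, base_env, policies=None,
                        episode, env_index=None, **kwargs):
        # Same body as before; `policies` may be None on older Ray versions.
        assert episode.length > 0, \
            "ERROR: `on_episode_step()` callback should not be called right " \
            "after env reset!"
        print('test')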

Any help would be greatly appreciated.

Best Regards,


Hi @drlatvia

Can you provide the whole error output with the stack trace?