`policies` directory not present in saved checkpoint

How severely does this issue affect your experience of using Ray?

  • High: It blocks me from completing my task.

Hello,

After switching to the new API stack, the saved checkpoint directory no longer includes a `policies` folder. Do you have any insights into why this might be happening?

Currently, the checkpoints contain the following directories and files:

  • env_runner
  • learner_group
  • algorithm_state.pkl
  • class_and_ctor_args.pkl
  • rllib_checkpoint.json
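For reference, here is a minimal, self-contained sketch of how the checkpoint contents above can be inspected. It recreates the observed layout in a temporary directory purely for illustration; in practice you would point `ckpt` at your real checkpoint path (the path here is an assumption, not from my actual run):

```python
import tempfile
from pathlib import Path

# Recreate the observed checkpoint layout in a temp directory for
# illustration; replace `ckpt` with your actual checkpoint path.
ckpt = Path(tempfile.mkdtemp())
for d in ("env_runner", "learner_group"):
    (ckpt / d).mkdir()          # subdirectories seen in the checkpoint
for f in ("algorithm_state.pkl", "class_and_ctor_args.pkl",
          "rllib_checkpoint.json"):
    (ckpt / f).touch()          # files seen in the checkpoint

# List the entries, as I did on the real checkpoint directory.
entries = sorted(p.name for p in ckpt.iterdir())
print(entries)
# Note: no `policies` entry appears anywhere in this listing.
```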

Any help would be appreciated!