There is an issue on GitHub that asks how to get the env object from an RLlib trainer: [rllib] Getting the env object from trainer · Issue #8700 · ray-project/ray (github.com). The solution given there is to create the trainer with:
trainer = dqn.DQNTrainer(config=config, env="my_env")
and then read the environment from trainer.workers.local_worker().env. However, that solution no longer works in RLlib 1.13.0, since env is no longer an attribute of local_worker(). How can I retrieve attributes from the env in RLlib 1.13.0?