Getting the env object from the trainer in Ray 1.13.0

There is an issue on GitHub that asks about getting the env in RLlib: [rllib] Getting the env object from trainer · Issue #8700 · ray-project/ray (https://github.com/ray-project/ray/issues/8700). The solution given there is to call:

trainer.workers.local_worker().env

Where

trainer = dqn.DQNTrainer(config=config, env="my_env")

However, this solution no longer works in RLlib 1.13.0, since env is not an attribute of local_worker() anymore. How can I retrieve attributes of the env in RLlib 1.13.0?
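
For reference, here is a minimal, self-contained sketch of the kind of setup I mean. MyEnv, its some_attribute field, and the DQN config are placeholders for illustration, not my actual code:

```python
# Minimal sketch of the setup described above (Ray 1.13.0-era APIs).
import gym
import ray
from ray.rllib.agents import dqn
from ray.tune.registry import register_env


class MyEnv(gym.Env):
    """Hypothetical toy env, standing in for the real environment."""

    def __init__(self, env_config=None):
        self.action_space = gym.spaces.Discrete(2)
        self.observation_space = gym.spaces.Box(-1.0, 1.0, shape=(4,))
        self.some_attribute = 42  # attribute I would like to read back from the trainer

    def reset(self):
        return self.observation_space.sample()

    def step(self, action):
        return self.observation_space.sample(), 0.0, True, {}


ray.init()
register_env("my_env", lambda env_config: MyEnv(env_config))

config = dqn.DEFAULT_CONFIG.copy()
trainer = dqn.DQNTrainer(config=config, env="my_env")

# Worked on older Ray versions, but on 1.13.0 local_worker() no longer
# exposes an .env attribute (per the behavior described above), so this fails:
env = trainer.workers.local_worker().env
print(env.some_attribute)
```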