.get_env() in 2.0.0 for DQNTrainer

In previous versions of RLlib one could run the following code to access the environment in DQNTrainer classes:

from ray.rllib.agents.dqn import DQNTrainer

# create DQNTrainer instance
dqn_trainer = DQNTrainer(config=config_fine_tuned, env=select_env)

# get the environment instance
env = dqn_trainer.get_env()

# access the environment attribute using dot notation
attribute_value = env.attribute_name

However, in 2.0.0 I get the following error:

Traceback (most recent call last):
  File "C:\Users\Carlos-R.Perez\AppData\Roaming\Python\Python38\site-packages\IPython\core\interactiveshell.py", line 3433, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-7-6a7038882efe>", line 1, in <module>
AttributeError: 'DQN' object has no attribute 'get_env'

I am defining the agent as dqn.DQNTrainer(config=config_fine_tuned, env=select_env). What would be the equivalent of .get_env() in 2.0.0?

Hi @carlorop,

I cannot find a get_env() method in prior versions (e.g. ray-1.6.0) on the Trainable, Trainer, or DQNTrainer classes. Which version are you referring to?


In 2.0.0 the environments live on the rollout workers. Note that local_worker() returns a RolloutWorker, not the sub-environments themselves; its foreach_env() hands back one result per underlying sub-environment:

sub_envs = dqn_algo.workers.local_worker().foreach_env(lambda env: env)
for env_id, sub_env in enumerate(sub_envs):
    # do something