I cannot for the life of me figure out how to get a reference to my environment. Please help

To sum it up, I need a reference to one of my environments, but getting one seems impossible. I'm working with a multi-agent environment and have tried several things. The top recommendation I found online was trainer.workers.local_worker().env, but this always returns None. I've also tried registering environments, running both in and out of local mode, and trying to get the reference through the EnvContext passed to the env constructor. I'm very lost and any direction is much appreciated.

Hi @Randy-Hodges, you should set create_env_on_local_worker=True in the config.

config = config.rollouts(create_env_on_local_worker=True)
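Put together, a minimal sketch of the fix might look like the following (assuming RLlib's PPO as the algorithm and CartPole-v1 as the env, since the thread does not name either; adjust both for your setup):

```python
from ray.rllib.algorithms.ppo import PPOConfig

config = (
    PPOConfig()
    .environment("CartPole-v1")
    # Key line: force the local (driver-side) worker to create its own
    # env instance instead of leaving .env as None.
    .rollouts(create_env_on_local_worker=True)
)
algo = config.build()

# With the flag set, this should return an env instance rather than None:
env = algo.workers.local_worker().env
```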

Sweet, that helped a lot. For reference, how would I have found that out for myself?

Also, if I needed to get the env of a specific worker, how would I do that?

@Randy-Hodges, it requires some basic understanding of how RLlib uses Ray to distribute its samplers. It also depends on the algorithm: for most algorithms the flag is enabled by default, so the code you already had would have worked out of the box. I am curious which algorithm you are using here; maybe we can update the default values to be more intuitive.

Also, if I needed to get the env of a specific worker, how would I do that?

You cannot get a reference to the env instance of a specific remote worker, since the envs are attributes of Ray actors, which can only be reached through remote calls. The rollout workers use them internally to collect samples. If you need to debug, you can enable the flag I mentioned above and debug your env on the local worker. But keep in mind that, in general, you should not build the logic of your workloads on these implementation details, as they may not be forward compatible.
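The point about actors can be illustrated with a plain-Python mock (all class and method names below are hypothetical stand-ins, not RLlib's API): an actor's attributes live in another process, so the driver only holds a handle and must ship a function to the actor to learn anything about its env.

```python
# Hypothetical mock of the remote-actor pattern (not RLlib code).

class MockEnv:
    def __init__(self):
        self.episode_count = 0


class MockRemoteWorkerHandle:
    """Stands in for a Ray actor handle: the env lives 'inside' the
    worker process, so the driver cannot grab it as a plain attribute."""

    def __init__(self):
        self._env = MockEnv()  # only exists on the worker's side

    def apply_on_env(self, fn):
        # Analogous to a remote call: the function runs where the env
        # lives, and only the (serializable) result travels back.
        return fn(self._env)


handle = MockRemoteWorkerHandle()
# There is no `handle.env` to read directly. Instead, submit a
# function to the worker and receive a result:
count = handle.apply_on_env(lambda env: env.episode_count)
print(count)  # -> 0
```

This is also why the local-worker flag helps for debugging: the local worker runs in the driver process, so its env is an ordinary object you can inspect directly.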