[RLlib] Need help in connecting policy client to multi-agent environment

We are creating the trainer with `PolicyServerInput` and using `PolicyClient` to run the rollout workers. We want to create multiple policy types for our multi-agent environment (`ExternalEnv`), following the Unity3D serving example (unity3d_server.py and unity3d_client.py). Specifically, we are interested in the SoccerStrikersVsGoalie environment. Is there a way to create a policy client or worker that holds only one policy instance (either the Striker policy or the Goalie policy, not both)?
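For context, this is roughly the server-side policy mapping we have in mind, modeled on the Unity3D example. The agent-ID format (e.g. `"Striker?team=0_0"`) and the policy names are assumptions based on that example, not verified against our exact setup:

```python
# Sketch of a per-agent-type policy mapping, modeled on RLlib's Unity3D
# example. The agent-ID format and policy names are assumptions.

def policy_mapping_fn(agent_id):
    # Unity3D agent IDs typically embed the behavior name,
    # e.g. "Striker?team=0_0" or "Goalie?team=1_0".
    if agent_id.startswith("Striker"):
        return "Striker"
    return "Goalie"

# In the server's trainer config this would be wired up roughly as:
# config["multiagent"] = {
#     "policies": {"Striker": (None, striker_obs, striker_act, {}),
#                  "Goalie": (None, goalie_obs, goalie_act, {})},
#     "policy_mapping_fn": policy_mapping_fn,
# }
```

With this mapping the trainer holds both policies, but every agent is routed to exactly one of them; our question is how to mirror that split on the client side.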

We want to understand:

  1. How to connect to a specific policy of the trainer from the policy client/worker (where a worker is allowed to connect to exactly one policy and should receive only that policy's observations, actions, and rewards).
  2. Is there a way to restrict a worker to its own view of the environment?
  3. How can all workers step the environment independently from their respective clients/workers?
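To make point 2 concrete, this is the kind of filtering we imagine a single-policy client doing, so that it only ever sees the agents mapped to its own policy. `filter_obs_for_policy` is a hypothetical helper we wrote for illustration, not an RLlib API:

```python
# Hypothetical helper (not an RLlib API): restrict a multi-agent
# observation dict to the agents handled by a single policy.

def filter_obs_for_policy(obs_dict, policy_id, policy_mapping_fn):
    """Keep only the observations of agents mapped to `policy_id`."""
    return {
        agent_id: obs
        for agent_id, obs in obs_dict.items()
        if policy_mapping_fn(agent_id) == policy_id
    }

# Example: a Striker-only client would drop the Goalie entries.
obs = {
    "Striker?team=0_0": [0.1],
    "Striker?team=0_1": [0.2],
    "Goalie?team=1_0": [0.3],
}
striker_view = filter_obs_for_policy(
    obs, "Striker",
    lambda aid: "Striker" if aid.startswith("Striker") else "Goalie")
# striker_view now contains only the two Striker agents.
```

Is something like this the intended approach, or does RLlib already support a per-policy client connection out of the box?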