Cannot use two PPO agents

    import ray
    from ray.tune.registry import register_env
    from ray.rllib.agents import ppo

    register_env(paramfile['env'], _env)       # _env is my Unity env creator
    register_custom_model(paramfile['model'])  # my own model-registration helper
    ray.init(num_cpus=1, num_gpus=0)

    agent1 = ppo.PPOTrainer(config=paramfile, env=paramfile['env'])
    agent2 = ppo.PPOTrainer(config=paramfile, env=paramfile['env'])

I would like to create two PPO agents so that I can load two different trained networks with different structures, but I get this error:

*** mlagents_envs.exception.UnityWorkerInUseException: Couldn’t start socket communication because worker number 0 is still in use. You may need to manually close a previously opened environment or use a different worker number.
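For context, in the ML-Agents Python API each UnityEnvironment binds a communication port derived from its worker_id, so two environments created with the default worker_id of 0 collide. A minimal sketch of the collision, assuming the standard mlagents_envs API ("MyBuild" is a placeholder):

    from mlagents_envs.environment import UnityEnvironment

    # Each UnityEnvironment claims the port base_port + worker_id for its
    # socket; two environments sharing worker_id 0 fight over the same port.
    env_a = UnityEnvironment(file_name="MyBuild", worker_id=0)
    env_b = UnityEnvironment(file_name="MyBuild", worker_id=0)  # raises UnityWorkerInUseException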

Could you show me how to use a different worker number?

This seems like a detail of your environment rather than of RLlib itself.

Something you could do here is follow the RLlib examples on our GitHub and website to figure out how to pass parameters to your environment when it is created (search for `env_config`).
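As a rough sketch of that approach: RLlib passes the dict you put under `env_config` to the registered creator function, so you can route a distinct worker number to each trainer. Here `UnityGymEnv` and `base_config` are placeholders for your own wrapper and config, not real RLlib names:

    import ray
    from ray.tune.registry import register_env
    from ray.rllib.agents import ppo

    # UnityGymEnv is a placeholder for your own gym wrapper around the
    # Unity environment; it should forward worker_id to UnityEnvironment.
    def _env(env_config):
        return UnityGymEnv(worker_id=env_config.get("worker_id", 0))

    register_env("unity_env", _env)
    ray.init(num_cpus=1, num_gpus=0)

    # Give each trainer its own worker number via env_config.
    config1 = {**base_config, "env_config": {"worker_id": 0}}
    config2 = {**base_config, "env_config": {"worker_id": 1}}

    agent1 = ppo.PPOTrainer(config=config1, env="unity_env")
    agent2 = ppo.PPOTrainer(config=config2, env="unity_env")

If you run more than one rollout worker per trainer, the EnvContext that RLlib hands to the creator also carries a worker_index attribute, which you can fold into the worker number so every copy of the environment gets a unique port.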