How severely does this issue affect your experience of using Ray?
Medium: It contributes significant difficulty to completing my task, but I can work around it.
Hi! I am currently working on a project with the Gazebo simulator and want to use RLlib to handle the reinforcement learning part.
I was looking into external environments and how I could create a wrapper for Gazebo. However, the CartPole external-environment example mentioned in the documentation (Environments — Ray 2.2.0) does not work and throws an error.
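For context, this is roughly the client-side loop from the example that I want to adapt for Gazebo (a sketch following cartpole_client.py; the server address is a placeholder, and the CartPole env is a stand-in for my eventual Gazebo wrapper):

```python
import gym

from ray.rllib.env.policy_client import PolicyClient

# Stand-in for my Gazebo wrapper; the example uses plain CartPole.
env = gym.make("CartPole-v0")

# "local" inference mode embeds a rollout worker in the client, which is
# where the crash below happens.
client = PolicyClient("http://localhost:9900", inference_mode="local")

obs = env.reset()
episode_id = client.start_episode(training_enabled=True)

while True:
    action = client.get_action(episode_id, obs)
    obs, reward, done, info = env.step(action)
    client.log_returns(episode_id, reward, info=info)
    if done:
        client.end_episode(episode_id, obs)
        obs = env.reset()
        episode_id = client.start_episode(training_enabled=True)
```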
I can start the cartpole_server.py just fine, but the cartpole_client.py throws the following error:
Env checking isn't implemented for RemoteBaseEnvs, ExternalMultiAgentEnv, ExternalEnvs or environments that are Ray actors.
Traceback (most recent call last):
  File "/home/ydenker/Repositories/pushing-robot/RLlib/cartpole_example_client.py", line 81, in <module>
    client = PolicyClient(
  File "/home/ydenker/.local/lib/python3.10/site-packages/ray/rllib/env/policy_client.py", line 79, in __init__
    self._setup_local_rollout_worker(update_interval)
  File "/home/ydenker/.local/lib/python3.10/site-packages/ray/rllib/env/policy_client.py", line 261, in _setup_local_rollout_worker
    (self.rollout_worker, self.inference_thread) = _create_embedded_rollout_worker(
  File "/home/ydenker/.local/lib/python3.10/site-packages/ray/rllib/env/policy_client.py", line 405, in _create_embedded_rollout_worker
    rollout_worker = RolloutWorker(**kwargs)
  File "/home/ydenker/.local/lib/python3.10/site-packages/ray/rllib/evaluation/rollout_worker.py", line 826, in __init__
    self.sampler = SyncSampler(
  File "/home/ydenker/.local/lib/python3.10/site-packages/ray/rllib/evaluation/sampler.py", line 246, in __init__
    self._env_runner_obj = EnvRunnerV2(
  File "/home/ydenker/.local/lib/python3.10/site-packages/ray/rllib/evaluation/env_runner_v2.py", line 236, in __init__
    raise ValueError(
ValueError: Policies using the new Connector API do not support ExternalEnv.
Is there a quick fix for this? If I can't get the example to work, how am I supposed to create my own working adapter class? Thanks for the help in advance.
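For reference, the adapter class I have in mind would look roughly like this (just a sketch against the ExternalEnv interface; GazeboExternalEnv and the two _gazebo_* helpers are placeholders for my actual simulator bindings, and the spaces are made up):

```python
import numpy as np
from gym.spaces import Box, Discrete

from ray.rllib.env.external_env import ExternalEnv


class GazeboExternalEnv(ExternalEnv):
    """Sketch of a Gazebo adapter: run() drives the episode loop."""

    def __init__(self):
        # Placeholder spaces; my real robot's obs/action spaces go here.
        super().__init__(
            action_space=Discrete(2),
            observation_space=Box(-1.0, 1.0, shape=(4,), dtype=np.float32),
        )

    def _gazebo_reset(self):
        # Placeholder: reset the Gazebo world, read initial sensors.
        return np.zeros(4, dtype=np.float32)

    def _gazebo_step(self, action):
        # Placeholder: apply the action in Gazebo, read back
        # (obs, reward, done).
        return np.zeros(4, dtype=np.float32), 0.0, True

    def run(self):
        while True:
            episode_id = self.start_episode()
            obs = self._gazebo_reset()
            done = False
            while not done:
                action = self.get_action(episode_id, obs)
                obs, reward, done = self._gazebo_step(action)
                self.log_returns(episode_id, reward)
            self.end_episode(episode_id, obs)
```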
Here are some more details about what I am using:
Ray version: 3.0.0.dev0 (commit: a830694359a272703aace4ac6f7e5f98b1d5d4a9)
OS: Ubuntu 22.04.1 LTS
Python version: Python 3.10.6
I have not edited the cartpole_client.py file in any way. It is an exact copy of the example script from the Ray repository.
@Lars_Simon_Zehnder I uninstalled and reinstalled using your command, and the error persists.
I don't really know what I am doing differently, if I am doing anything differently at all.
The error seems to revolve around "env checking":
'Env checking isn't implemented for RemoteBaseEnvs, ExternalMultiAgentEnv, ExternalEnvs or environments that are Ray actors.'
What exactly is env checking, and how can it be unsupported here if the example works on your end?
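The only related knob I could find is disable_env_checking, so I assume the checker is meant to be switched off like this; I am not sure whether that even applies to external envs (PPOConfig here is just a stand-in for whatever algorithm the server runs):

```python
from ray.rllib.algorithms.ppo import PPOConfig

# Assumption: this silences the env checker. Unclear whether it matters
# for ExternalEnv, where checking is reportedly not implemented anyway.
config = PPOConfig().environment(disable_env_checking=True)
```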
The resulting error hints at the new Connector API not supporting what I'm planning to do:
'ValueError: Policies using the new Connector API do not support ExternalEnv.'
Can I maybe circumvent the issue by using the old Connector API, assuming there is such a thing?
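From skimming the config reference, the only connector-related switch I found is enable_connectors, so I imagine "using the old API" would mean something like this on the server side; is that the intended escape hatch?

```python
from ray.rllib.algorithms.ppo import PPOConfig

# Assumption: turning connectors off makes the policy fall back to the
# pre-connector preprocessing path, which ExternalEnv presumably supports.
config = PPOConfig().rollouts(enable_connectors=False)
```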
Or is the error not related to these at all, and I'm just missing something crucial?
I'm not sure how to proceed from here. Any help is much appreciated.