ModuleNotFoundError for custom env import when num_workers > 0

Hi all,

In the config of my PPOTrainer I had set "num_workers": 0 so far, but when I change it to num_workers > 0, it raises a ModuleNotFoundError: No module named … for the import of my custom env. Is this the same issue as the one mentioned here?

However, I'm not familiar with all the effects of num_workers > 0. I'm working on a single machine (a desktop computer) and don't know whether num_workers > 0 even works in this case, or whether it is only a meaningful setting for remote workers running on multiple machines?
Some enlightening thoughts would be much appreciated.

The actor died because of an error raised in its creation task, ray::RolloutWorker.__init__() (pid=16960, ip=
File "python\ray\_raylet.pyx", line 460, in ray._raylet.execute_task
File "python\ray\_raylet.pyx", line 481, in ray._raylet.execute_task
File "python\ray\_raylet.pyx", line 351, in ray._raylet.raise_if_dependency_failed
ray.exceptions.RaySystemError: System error: No module named 'galvcon'
traceback: Traceback (most recent call last):
File "C:\Users\user\AppData\Local\Programs\Python\Python38\lib\site-packages\ray\", line 248, in deserialize_objects
obj = self._deserialize_object(data, metadata, object_ref)
File "C:\Users\user\AppData\Local\Programs\Python\Python38\lib\site-packages\ray\", line 190, in _deserialize_object
return self._deserialize_msgpack_data(data, metadata_fields)
File "C:\Users\user\AppData\Local\Programs\Python\Python38\lib\site-packages\ray\", line 168, in _deserialize_msgpack_data
python_objects = self._deserialize_pickle5_data(pickle5_data)
File "C:\Users\user\AppData\Local\Programs\Python\Python38\lib\site-packages\ray\", line 158, in _deserialize_pickle5_data
obj = pickle.loads(in_band)
ModuleNotFoundError: No module named 'galvcon'

Hi @klausk55,

Yes, you can run with more than one worker even on a single machine.

Each worker runs in a separate process. What you are likely seeing is that the worker processes do not have the same path or environment variables set as the driver and therefore cannot find your module.
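To illustrate the mechanism (this is a stand-alone sketch, not Ray itself: the module name galvcon is borrowed from the traceback, and the temp directory stands in for your project path): a fresh Python process can only import a module that is on that process's own search path, e.g. via PYTHONPATH.

```python
import os
import shutil
import subprocess
import sys
import tempfile

# Create a tiny throwaway module to stand in for the custom env module.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "galvcon.py"), "w") as f:
    f.write("VALUE = 42\n")

code = "import galvcon; print(galvcon.VALUE)"

# A fresh interpreter that knows nothing about tmp fails,
# just like the remote worker processes do.
fail = subprocess.run([sys.executable, "-c", code],
                      capture_output=True, text=True)
print("import without PYTHONPATH failed:", fail.returncode != 0)

# Putting the directory on PYTHONPATH makes the module visible
# to the child process.
env = dict(os.environ, PYTHONPATH=tmp)
ok = subprocess.run([sys.executable, "-c", code],
                    capture_output=True, text=True, env=env)
print("import with PYTHONPATH prints:", ok.stdout.strip())

shutil.rmtree(tmp)
```

The same logic applies to the spawned rollout workers: they need to find the module on their own path, independently of where the driver script happens to run from.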

Try asking in the core category.


I have now reassigned my question to the hopefully appropriate category.

How to fix what @mannyv identified as the problem?

You are mentioning a custom env, so this is likely related. Can you give an example of how you create this env?

@TanjaBayer I have a file where I import the custom RL env, i.e. from my_module.sub_folder.sub_sub_folder.my_env import MyEnv. In that file I create an RL Trainer which is given this custom env. When I have "num_workers": 0 in the Trainer config, everything works fine since there is only a “local worker”, but with "num_workers" > 0 (i.e. “local worker” + “remote workers”) the error occurs.
As a first workaround I copied the folder of my_module into the rllib folder and changed the import to from ray.rllib.my_module.sub_folder.sub_sub_folder.my_env import MyEnv

Is this what you wanted to know?
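For anyone hitting this later: instead of copying the module into the rllib install directory, the more common fix is to make the module importable for every worker process. A sketch with hypothetical paths and script names (adapt them to your project layout):

```shell
# Option 1: install the project that contains my_module as an editable
# package, so every Python process (including Ray workers) can import it.
# /path/to/project is hypothetical; it needs a setup.py or pyproject.toml.
pip install -e /path/to/project

# Option 2: export PYTHONPATH before launching training; the spawned
# worker processes inherit the driver's environment.
export PYTHONPATH="/path/to/project:$PYTHONPATH"
python train_ppo.py  # hypothetical training script
```

Either way, the import path stays from my_module.sub_folder.sub_sub_folder.my_env import MyEnv, with no changes inside the ray/rllib installation.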