Using a py_modules URI to set up dependencies per task fails for packages that include *.so files, such as numpy/pillow

We’re using runtime_env to set up dependencies per task via py_modules, with URIs pointing to remote zip files.
We do this for all of our required packages and their third-party dependencies.
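For context, here is a minimal sketch of the setup described above. The S3 URLs are hypothetical placeholders, not our real bucket paths:

```python
# Hypothetical remote zip locations -- replace with your own.
PILLOW_ZIP = "s3://my-bucket/deps/Pillow.zip"
PILLOW_LIBS_ZIP = "s3://my-bucket/deps/Pillow.libs.zip"

# One runtime_env shared by every task that needs these packages.
runtime_env = {"py_modules": [PILLOW_ZIP, PILLOW_LIBS_ZIP]}

# The env is then attached per task, e.g.:
#   @ray.remote(runtime_env=runtime_env)
#   def load_image(path):
#       from PIL import Image
#       return Image.open(path).size
```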

We encountered the following issue when trying to pass third-party packages such as ‘numpy’ or ‘pillow’ via URI, since these packages include directories with *.so files:

ImportError: cannot open shared object file: No such file or directory

Installing ‘pillow’ generates two directories, Pillow and Pillow.libs. We pass both directories via URI (each in a separate zip), and we can see that both were unpacked to /tmp/ray/session_latest/runtime_resources.

We install Pillow and create the zip files in a container with the exact same Ubuntu version as the Ray container, so it should be compatible with respect to glibc.

From our Docker container:

root@ray-algo-plugin-worker1-647bdb68bd-48vmh:/tmp/ray/session_2023-11-05_14-28-31_240819_7/runtime_resources/py_modules_files/s3_dependencies_Pillow_libs# ls


root@ray-algo-plugin-worker1-647bdb68bd-48vmh:/tmp/ray/session_2023-11-05_14-28-31_240819_7/runtime_resources/py_modules_files/s3_dependencies_Pillow_libs# cd Pillow.libs/

root@ray-algo-plugin-worker1-647bdb68bd-48vmh:/tmp/ray/session_2023-11-05_14-28-31_240819_7/runtime_resources/py_modules_files/s3_dependencies_Pillow_libs/Pillow.libs# ls

But we still get the import error:

ImportError: cannot open shared object file: No such file or directory

  1. We cannot use ‘pip’ to install third-party packages.
  2. We’re looking for a solution per actor/task (not per job).

We’d appreciate your help with this issue.

Hi @Ruthy, these look like system-level shared libraries, and runtime_env doesn’t set up LD_LIBRARY_PATH for you, so the *.so files can’t be found at import time. You might want to conda-install the libraries, or simply put the system libraries into your image.
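If you want to experiment before changing your image, here is an untested sketch of a possible workaround: point LD_LIBRARY_PATH at the unpacked Pillow.libs directory via runtime_env’s env_vars field. The path below is hypothetical and depends on where Ray actually unpacks your zip:

```python
import os

# Hypothetical unpack location -- check the actual path under your
# session's runtime_resources directory before relying on it.
libs_dir = ("/tmp/ray/session_latest/runtime_resources/"
            "py_modules_files/s3_dependencies_Pillow_libs/Pillow.libs")

runtime_env = {
    "env_vars": {
        # Prepend so the bundled .so files are searched first.
        "LD_LIBRARY_PATH": libs_dir + ":" + os.environ.get("LD_LIBRARY_PATH", ""),
    }
}

# Attach per task/actor, e.g.:
#   @ray.remote(runtime_env=runtime_env)
#   def f(): ...
```

Note that env_vars are applied when the worker process starts, so whether this helps depends on when the dynamic loader resolves the library search path in your setup.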

Feel free to create a ticket to request this feature.