py_executable in runtime_env ignored when starting Ray actor (ModuleNotFoundError)

Severity: High - Completely blocks me

Environment:

  • Ray version: 2.48.0

  • Python version: 3.12.11

  • OS: Ubuntu 22.04, no Docker

  • Infra: Ray autoscaler with AWS EC2


What happened

I have a Ray actor defined in a package app_commons. When I try to instantiate it with py_executable set in its runtime_env, it fails with:

ModuleNotFoundError: No module named 'app_commons'

Here’s the code:

from app_commons.features.queue_actor import QueueActor

# The worker should start under `uv run`, which installs app-commons on the fly,
# but actor creation fails with the ModuleNotFoundError above.
actor = QueueActor.options(
    runtime_env={
        "py_executable": "uv run --with app-commons==0.1.10 --",
    },
    name=f"{feature}_queue_actor",
    num_cpus=0.1,
).remote(feature)

However, I launch other workloads in the same cluster the same way via Ray Serve, and they work fine.

For example:

- name: ai_ocr_service
  route_prefix: /ai_ocr_service
  import_path: micro_services.ocr_service:app
  runtime_env:
    py_executable: "uv run --with micro-services==0.0.20 --with ai-ocr==1.3.9 --"
  deployments:
    - name: OCRService
      max_ongoing_requests: 10
      autoscaling_config:
        target_ongoing_requests: 1

This works without any dependency issues.


My expectation

The actor should start without dependency issues, because I am explicitly installing the dependency: the package app-commons, which provides the app_commons module. It seems the py_executable I pass to the actor is ignored entirely: if I replace it with a garbage string like lkjlkjio, the only error I get is still the same ModuleNotFoundError, with no additional error about an invalid command.

Question: Is py_executable in runtime_env actually supported for Ray actors the same way it is for Ray Serve YAML deployments? If not, what is the correct way to dynamically install a Python package when creating an actor?
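For reference, the closest documented alternative I can see is the pip field of runtime_env, which has Ray install the listed packages into an isolated environment for the worker. A minimal sketch (version pin copied from above; I'd still like to know whether py_executable itself is supposed to work here):

```python
# Possible fallback: a pip-based runtime_env instead of py_executable.
# Ray installs the listed packages into an isolated environment for the
# actor's worker process. This would be passed the same way as before,
# i.e. QueueActor.options(runtime_env=runtime_env, ...).remote(feature)
runtime_env = {"pip": ["app-commons==0.1.10"]}
```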

Thank you

@raphael if this is still a problem, could you please file a GitHub issue on the Ray repo with a reproduction script?