Hi,
If I have multiple jobs, each with conflicting dependencies, how do I run them in a single Ray cluster?
Regards,
Raghu
We don’t support any hard dependency isolation, but there is a way to set environment variables for all workers used in the same job (this way, you can specify the conda env for each job; it is not publicly available though). Is this type of support sufficient for your use case?
Thanks @sangcho - that would be a good starting point for us. Can you please provide a pointer on how to use it?
We currently only have this experimental API for actors. You can use
Actor.options(override_environment_variables={"KEY": "VALUE"}).remote()
to override environment variables for an actor and all the tasks/actors it subsequently launches.
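The propagation part relies on ordinary process-environment inheritance: variables set when a worker process is spawned are inherited by every process it launches. Here is a plain-Python analogy (no Ray required); the variable name MODEL_VERSION is just an illustration, not part of the API.

```python
# Analogy for the per-actor override: a process spawned with an extra
# env var passes it on to its own children, the way an actor's override
# reaches the tasks/actors it subsequently launches.
import os
import subprocess
import sys

# like Actor.options(override_environment_variables={"MODEL_VERSION": "v2"})
env = dict(os.environ, MODEL_VERSION="v2")

child = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['MODEL_VERSION'])"],
    capture_output=True, text=True, env=env,
)
inherited = child.stdout.strip()
print(inherited)  # v2
```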
It is also possible to set the environment variables per job using
from ray.job_config import JobConfig
ray.init(job_config=JobConfig(worker_env={"EnvKey": "EnvVar"}))
This will set the environment variables for all workers used for the job (note that workers are not shared among jobs).
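Since workers are not shared among jobs, each job's drivers and workers see only their own env dict, which is what makes per-job conda envs possible. A minimal plain-Python sketch of that idea, with two "job" drivers launched under different env dicts (the JOB_CONDA_ENV key and the env names are hypothetical):

```python
# Sketch of the per-job pattern: each job is launched with its own env
# dict, analogous to JobConfig(worker_env=...), and its workers read the
# variable to decide which dependency set / conda env to use.
import os
import subprocess
import sys

worker = "import os; print('using conda env:', os.environ['JOB_CONDA_ENV'])"

reports = []
for job_env in ("deps-v1", "deps-v2"):  # two jobs with conflicting deps
    env = dict(os.environ, JOB_CONDA_ENV=job_env)
    out = subprocess.run(
        [sys.executable, "-c", worker],
        capture_output=True, text=True, env=env,
    )
    reports.append(out.stdout.strip())

for line in reports:
    print(line)
```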
@raghukiran It would be really awesome if you could share your experience with this API. We can probably consider creating higher-level APIs (as Ray Serve already did with Simon’s per-actor env variables).