Ray Serve deployment on a static Ray cluster

  • None: Just asking a question out of curiosity

I have created my own Docker images whose entrypoints run
ray start --head (for the head node) and the equivalent ray start command for the Ray worker nodes.
I deploy the cluster statically, following the same concept as (Advanced) Deploying a static Ray cluster without KubeRay — Ray 2.40.0.
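
Concretely, the entrypoints look roughly like this (addresses, ports, and extra flags are placeholders for my setup, not taken from the guide):

```bash
# head node container entrypoint
ray start --head --port=6379 --dashboard-host=0.0.0.0 --block

# worker node container entrypoint (pointing at the head node)
ray start --address=<head-node-ip>:6379 --block
```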

Until now I have been using Ray Core capabilities through Ray Client (Ray Client — Ray 2.40.0) and everything is working.
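
For reference, this is roughly how I use the cluster today (the address is a placeholder):

```python
# connect to the static cluster through the Ray Client server on the head node
import ray

ray.init("ray://<head-node-ip>:10001")

@ray.remote
def square(x: int) -> int:
    return x * x

# simple sanity check that tasks actually run on the cluster
print(ray.get([square.remote(i) for i in range(4)]))
```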

Now I want to try out Ray Serve.
I am wondering how I can run my Serve deployments in this constellation (my own images + a static Ray cluster).
If I missed something in the documentation, I would appreciate it if someone could point it out to me.
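
To make it concrete, the kind of thing I want to run is a minimal Serve app like this (names and values are placeholders), which I would then push to the existing cluster, presumably with serve deploy:

```python
# app.py - minimal Serve application I'd like to run on the static cluster
from ray import serve
from starlette.requests import Request


@serve.deployment(num_replicas=2)
class Hello:
    async def __call__(self, request: Request) -> str:
        return "hello from the static cluster"


# the bound application that a Serve config / serve deploy would target
app = Hello.bind()
```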

I am pretty sure I need to follow Deploy on VM — Ray 2.40.0.
I was wondering a few things:

  1. Why is there no --blocking argument for serve deploy, like there is for serve run?
  2. Is serve run effectively ray start --head + serve deploy under the hood?
  3. If I run ray start --head and serve deploy, and only after that add more VMs/containers to my cluster, am I correct in assuming that the deployment will also distribute onto the late joiners? (See the sketch after this list.)
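
To make question 3 concrete, this is the kind of deployment I have in mind: more replicas than the head node alone can host, where I would expect the remaining replicas to start once extra workers join (resource numbers are made up for illustration):

```python
# sketch for question 3: replicas that can't all fit on the head node,
# so some should only be placed once late-joining workers appear
from ray import serve


@serve.deployment(num_replicas=4, ray_actor_options={"num_cpus": 2})
class Echo:
    async def __call__(self, request) -> str:
        return "ok"


app = Echo.bind()
```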