Newbie question: local head node with remote nodes

I am rather confused about this library. I have been using it happily for a few months to run workloads locally, but now I would like to use some CPUs I have just rented in the cloud.

I would like to know if I can use my local machine as the head node of my cluster, and the remote machines as the worker nodes. Something like this in my YAML:

```yaml
provider:
    type: local
    head_ip: 127.0.0.1
    external_head_ip: 135.181.0.0
    worker_ips: [135.181.0.0]
```

I have previously had the cluster configured entirely remotely, without the local `head_ip` or the `worker_ips` set, and that worked, but now I want my local machine to act as the head node.

Is this a thing? This is how it works in my imagination :rofl: my code queries a local database, packages up the data for processing using ray.put(), then sends it to the remote VM using mymethod.remote().

Is this wildly off the mark? Can I use my local machine together with remote machines like this? Where would I find examples of this pattern?

I have the same question. For my use case I also want to use my local machine as the head node and remote machines as worker nodes. Is it possible?