Specifying Multiprocessing Context in Ray Worker

Can we start a Ray remote worker with a spawn context?

The module I am working with is launched via multiprocessing like this (following Multiprocessing best practices — PyTorch 1.10.1 documentation):

import torch.multiprocessing as mp

context = mp.get_context('spawn')  # NOTE: without the spawn context this fails, because the worker uses CUDA!
self.ps = context.Process(target=worker, args=(...))

I wonder whether there is an equivalent of this in Ray. There is a similar issue, but I cannot figure out the right way to do this.
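Roughly, what I would like to do looks like the sketch below. The names Trainer and worker are placeholders for my own code, not part of Ray's API, and I have simplified the worker to a stub:

import ray
import torch.multiprocessing as mp

def worker():
    # Placeholder for my actual worker function, which uses CUDA.
    pass

@ray.remote(num_gpus=1)  # the real actor needs a GPU since the child process uses CUDA
class Trainer:
    def start(self):
        # Inside the actor I would still like to use the spawn context,
        # for the same CUDA reason as above.
        context = mp.get_context('spawn')
        self.ps = context.Process(target=worker, args=())
        self.ps.start()
        self.ps.join()

if __name__ == '__main__':
    ray.init()
    trainer = Trainer.remote()
    ray.get(trainer.start.remote())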

I also tried calling mp.set_start_method('spawn') directly at the beginning of main(); however, that does not work.
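Concretely, that attempt looked roughly like this (a minimal sketch; the body of main() is simplified):

import torch.multiprocessing as mp
import ray

def main():
    # Set the start method globally before ray.init();
    # this did not resolve the issue for the Ray worker.
    mp.set_start_method('spawn')
    ray.init()

if __name__ == '__main__':
    main()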