I am curious whether Ray Train supports dynamic (elastic) scaling — for example, the equivalent of torchrun --nnodes=1:4:
torchrun (Elastic Launch) — PyTorch 2.5 documentation
I see that ScalingConfig can take a Domain or a dict instead of just an int value, but the documentation does not explain how to use them: ray.train.ScalingConfig — Ray 2.41.0
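For reference, this is the kind of elastic launch I mean. The flags below are from the torchrun documentation; the worker count (--nproc_per_node=8), rendezvous endpoint variable, and script name are just placeholders for illustration:

```shell
# Elastic launch with torchrun: tolerate anywhere from 1 to 4 nodes,
# re-running rendezvous when nodes join or leave.
torchrun \
    --nnodes=1:4 \
    --nproc_per_node=8 \
    --rdzv_backend=c10d \
    --rdzv_endpoint=$HOST_NODE_ADDR \
    train.py
```

Is there a way to express the same "min:max" range of workers in Ray Train's ScalingConfig, or is num_workers effectively fixed for the lifetime of the run?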