How severely does this issue affect your experience of using Ray?
- Medium: It causes significant difficulty in completing my task, but I can work around it.
I have a train.py file that contains ray.init(), tuner.fit(), and the rest of my training code.
Now I want to launch train.py from two Linux shells at the same time.
Basically, I want the two runs to be completely separate: saving to different Ray log directories and so on. A simplified version of the script is shown below.
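Roughly, my train.py looks like this (simplified; train_fn and the search space are placeholders for my actual training code):

```python
import ray
from ray import tune


def train_fn(config):
    # Placeholder for the real training step; returning a dict reports it
    # as the trial's final result.
    return {"loss": (config["x"] - 1.0) ** 2}


if __name__ == "__main__":
    ray.init()
    tuner = tune.Tuner(
        train_fn,
        param_space={"x": tune.uniform(-2.0, 2.0)},
    )
    tuner.fit()
```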
But I found that:
- The first training run gets blocked.
- The Ray logs seem to end up in the same folder: storage_path/[training_function_name_time].
Is there anything I can do to run two Ray jobs in parallel on the same machine?
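For concreteness, this is the kind of separation I'm hoping for (a sketch I have not verified; run_tag, the ~/ray_tmp and ~/ray_results paths, and the use of _temp_dir plus RunConfig(name=..., storage_path=...) are my guesses at the relevant knobs):

```python
import os
import sys

import ray
from ray import tune
from ray.train import RunConfig  # on older Ray versions: from ray.air import RunConfig


def train_fn(config):
    # Placeholder trainable; returning a dict reports it as the final result.
    return {"loss": (config["x"] - 1.0) ** 2}


if __name__ == "__main__":
    # A per-invocation tag, e.g. `python train.py run_a` and `python train.py run_b`.
    run_tag = sys.argv[1] if len(sys.argv) > 1 else "run_default"

    # Give each local Ray instance its own session/log directory (hypothetical path).
    ray.init(_temp_dir=os.path.expanduser(f"~/ray_tmp/{run_tag}"))

    tuner = tune.Tuner(
        train_fn,
        param_space={"x": tune.uniform(-2.0, 2.0)},
        run_config=RunConfig(
            name=run_tag,  # distinct experiment folder per invocation
            storage_path=os.path.expanduser(f"~/ray_results/{run_tag}"),
        ),
    )
    tuner.fit()
```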
Best,
Sean