How severely does this issue affect your experience of using Ray?
- Low: It annoys or frustrates me for a moment.
I’m super happy with the stability of Dask-on-Ray on KubeRay, but at the moment I think the only way to control the amount of parallelism is by tweaking the number of Dask array chunks (e.g., one chunk == one task?).
Is there another way? If I have a (delayed) 1 PB Dask array, the chunk size has to get quite large (exceeding a single worker’s main memory) to make sure not too many tasks run in parallel.
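
For context, here is a minimal sketch of the trade-off I mean, assuming Dask-on-Ray via `ray.util.dask.enable_dask_on_ray()`; the array shape and chunk sizes are purely illustrative numbers:

```python
import dask.array as da
import ray
from ray.util.dask import enable_dask_on_ray

ray.init()
enable_dask_on_ray()  # route Dask .compute() calls to the Ray scheduler

# Illustrative ~1 PiB float64 array: 2**24 x 2**23 elements * 8 bytes = 2**50 bytes.
shape = (2**24, 2**23)

# Fewer chunks -> fewer Ray tasks -> less parallelism,
# but each chunk (one task's working set) gets huge.
arr = da.ones(shape, chunks=(2**20, 2**20))

print(arr.numblocks)                     # (16, 8) -> only ~128 tasks in the graph
print(arr.blocks[0, 0].nbytes / 2**40)   # ~8 TiB per chunk, far beyond one worker's RAM
```

The array is lazy, so the sketch runs without allocating anything; it just shows that keeping the task count low forces each chunk well past a single worker’s memory.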