Spread trials evenly with fractional GPU resources

How severely does this issue affect your experience of using Ray?

  • Medium: It causes significant difficulty in completing my task, but I can work around it.

I’m using `tune.run` with `resources_per_trial=dict(gpu=0.2)` on a cluster with 2 GPUs. Unfortunately, Ray packs the first five trials onto the same GPU. I would like to spread the trials evenly across both GPUs. I’m not requesting a larger GPU fraction per trial because I don’t know in advance how many trials I’ll launch. Is it possible to do this?
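
For context, here is a minimal sketch of my setup (the trainable body and the grid-search config are placeholders for the real ones, which are more involved):

```python
import ray
from ray import tune


def trainable(config):
    # Placeholder for the real training loop; the actual trainable uses the GPU.
    tune.report(score=config["x"])


ray.init(address="auto")  # connect to the existing 2-GPU cluster

# Each trial requests 0.2 of a GPU, so up to 10 trials can run concurrently.
# In practice, the first five trials all land on the same GPU instead of
# being spread across both.
tune.run(
    trainable,
    config={"x": tune.grid_search(list(range(10)))},
    resources_per_trial=dict(gpu=0.2),
)
```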