Different resource amounts for different trials

Currently we can set resources_per_trial to specify the same required resources for every trial. Is it possible to set different resource requirements for each trial individually?

Hmm, possibly – what are you trying to achieve?

We are running different models in parallel, one that uses a GPU and one that doesn’t. It would be great to set different resource limits for different trials so we don’t spend a GPU on non-GPU models.

I wonder if you can just use a custom resource function that uses the tune.sample_from operator:

resources_per_trial=tune.sample_from(lambda spec: {"gpu": 1} if spec.config.use_gpu else None)


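As a sanity check on the lambda above, here is a minimal stand-in that needs no Ray installed; the SimpleNamespace objects are hypothetical substitutes for the spec object Tune passes to sample_from (where spec.config holds the trial’s resolved hyperparameters):

```python
from types import SimpleNamespace

# Same resource function as above: request a GPU only when the
# trial's config says the model needs one.
resource_fn = lambda spec: {"gpu": 1} if spec.config.use_gpu else None

# Hypothetical stand-ins for the spec Tune would pass in per trial.
gpu_trial = SimpleNamespace(config=SimpleNamespace(use_gpu=True))
cpu_trial = SimpleNamespace(config=SimpleNamespace(use_gpu=False))

print(resource_fn(gpu_trial))  # {'gpu': 1} -> this trial gets a GPU
print(resource_fn(cpu_trial))  # None -> falls back to default (CPU-only) resources
```

The key design point is that the function sees the trial’s own config, so the resource request can branch on any hyperparameter.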

Thanks Richard, will this work for HyperOptSearch?

Hmm, not sure actually. If it doesn’t, it’d be great to post a feature request 🙂

This works great for me! I didn’t realize you could use sample_from with resources_per_trial. How does this work under the hood? Is this noted in the documentation anywhere? I wasn’t able to find it.

Haha… it’s a bit of a weird use case. We just treat part of the extended Tune run config as a “hyperparameter space” and do resolution across the entire thing.
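A rough sketch of that resolution idea (this is an illustration, not Ray’s actual implementation; the resolve helper and its names are made up): walk the run config, and wherever a value is a sample_from-style callable, call it with the trial’s spec so it resolves like any other hyperparameter:

```python
from types import SimpleNamespace

def resolve(run_config, trial_config):
    """Toy resolver: treat every callable in the run config as a
    sample_from-style entry and evaluate it against the trial spec."""
    spec = SimpleNamespace(config=SimpleNamespace(**trial_config))
    resolved = {}
    for key, value in run_config.items():
        # Plain values pass through; callables are resolved per trial.
        resolved[key] = value(spec) if callable(value) else value
    return resolved

run_config = {
    "num_samples": 4,
    "resources_per_trial": lambda spec: {"gpu": 1} if spec.config.use_gpu else None,
}

print(resolve(run_config, {"use_gpu": True}))
# {'num_samples': 4, 'resources_per_trial': {'gpu': 1}}
```

This is why resources_per_trial can be sampled even though it isn’t a model hyperparameter: the resolver doesn’t distinguish between the two.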

We should update the docs accordingly; would you mind opening an issue on GitHub so we can track it?

Makes sense! Just opened one here: [tune] Clarify documentation around using different resource requirements across trials · Issue #17088 · ray-project/ray · GitHub