Multiple trials on each GPU

Hi,
Is it possible to specify the number of trials to run concurrently on each GPU? My trials only use a small portion of each GPU, so I'd like to run multiple trials per GPU. Is that possible?

Yeah! tune.run(resources_per_trial={"gpu": 0.2}, ...)
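A minimal sketch of what that looks like with the tune.run API; the trainable function and search space here are just placeholders:

```python
from ray import tune

def trainable(config):
    # Ray sets CUDA_VISIBLE_DEVICES so each trial only sees its assigned GPU.
    # With "gpu": 0.2, up to 5 trials share one GPU at a time. Note that Tune
    # does not enforce GPU memory isolation, so each trial must actually fit.
    score = config["lr"] * 2  # placeholder training logic
    tune.report(score=score)

tune.run(
    trainable,
    resources_per_trial={"cpu": 1, "gpu": 0.2},  # allows 5 concurrent trials per GPU
    config={"lr": tune.grid_search([0.001, 0.01, 0.1])},
)
```

The fractional value is purely a scheduling hint: Tune packs trials onto a GPU until the fractions sum to 1, and it's up to you to make sure the combined memory usage fits.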