StatusCode.RESOURCE_EXHAUSTED

You are not supposed to supply `config` to `tune.with_parameters`; Tune passes `config` into the trainable at runtime, and `with_parameters` is only for binding the extra arguments.

Simply changing it to `tune.with_parameters(func, train_data=xxx, test_data=yyy)` should do.
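For intuition: `tune.with_parameters` works much like `functools.partial` (the Ray docs describe it that way), except the bound objects go into the Ray object store so large datasets are not serialized into each trial's `config`. A minimal sketch of the call pattern, using `functools.partial` as a stand-in and placeholder names (`train_func`, `train_data`, `test_data` are illustrative, not from a real project):

```python
from functools import partial

def train_func(config, train_data=None, test_data=None):
    # `config` holds the tunable hyperparameters; the datasets are
    # pre-bound and never travel through the trial config.
    lr = config["lr"]
    return lr * len(train_data)

# tune.with_parameters(train_func, train_data=..., test_data=...) binds
# the keyword arguments up front, much like functools.partial does here:
bound = partial(train_func, train_data=[1, 2, 3], test_data=[4, 5])

# Tune would then call the bound function with only `config`:
print(bound({"lr": 2.0}))  # prints 6.0
```

The practical difference is that `tune.with_parameters` ships the bound objects through the object store once, rather than copying them into every trial.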

It works, thank you! Actually, I just found out that `tune.with_parameters` can also be used inside `TorchTrainer`:

trainer = TorchTrainer(
    tune.with_parameters(train_func, params1=args1, params2=args2, ...),
    scaling_config=ScalingConfig(num_workers=x, use_gpu=True),
)

This works!
And it is useful even just for that purpose, i.e. when your train_func has extra parameters to be passed in. At first, I thought tune.with_parameters only worked for hyperparameter tuning.

I didn't see the above use case in the Ray examples; if it's not there, adding it would be a good reference for future readers.
