Confused about max_iters parameter in TuneGridSearchCV

I am confused about the max_iters parameter in ray.tune.sklearn.TuneGridSearchCV.

Assume I have, e.g., an sklearn.linear_model.ElasticNet trainable. In that trainable I define the maximum number of iterations the optimizer is allowed to run:

from sklearn.linear_model import ElasticNet

ELN = ElasticNet(max_iter=1000)

Now, when I want to tune this trainable via ray.tune.sklearn.TuneGridSearchCV, I can set max_iters, which, according to the documentation, "Indicates the maximum number of epochs to run for each hyperparameter configuration sampled." E.g., assuming some pre-defined parameter grid (param):

ray.tune.sklearn.TuneGridSearchCV(estimator=ELN, param_grid=param, max_iters=100)

Now my question, formulated as an example: let's say I set max_iter=1000 and max_iters=100 as above. Does my optimizer run for 100 iterations with the scheduler checking after every iteration whether to stop, or does it run for 1000 iterations with the scheduler checking every 10 iterations whether to stop the trial?

I want to tune an ElasticNet object such that training stops if the score does not improve for 8 iterations, and I am currently unsure how to do that.
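To make the goal concrete, here is a minimal, hand-rolled sketch of the "stop after 8 iterations without improvement" behaviour using only scikit-learn, with warm_start=True so each fit() call resumes from the previous coefficients. All names here (patience, stall, the chunk size of 10) are illustrative choices, not tune-sklearn API:

```python
import warnings

import numpy as np
from sklearn.datasets import make_regression
from sklearn.exceptions import ConvergenceWarning
from sklearn.linear_model import ElasticNet
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

warnings.filterwarnings("ignore", category=ConvergenceWarning)

X, y = make_regression(n_samples=200, n_features=20, noise=0.5, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

# Run the solver in chunks of 10 iterations; warm_start=True makes each
# fit() call continue from the coefficients of the previous call.
eln = ElasticNet(alpha=0.1, max_iter=10, warm_start=True)

best_score, patience, stall = -np.inf, 8, 0
for step in range(100):  # at most 100 chunks, i.e. up to 1000 solver iterations
    eln.fit(X_tr, y_tr)
    score = r2_score(y_val, eln.predict(X_val))
    if score > best_score + 1e-6:
        best_score, stall = score, 0
    else:
        stall += 1  # no improvement this chunk
    if stall >= patience:  # 8 consecutive chunks without improvement -> stop
        break

print(f"stopped after {step + 1} chunks, best validation R^2 = {best_score:.3f}")
```

In tune-sklearn itself, the analogous knob is the early stopping / stopper machinery operating on the per-fit results, as discussed in the reply below.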

Hey @F_S,

My understanding is that, under the hood, ELN.max_iter will be set to 10 (max_iter / max_iters) and tune-sklearn will call fit() 100 (max_iters) times with warm_start=True.

The result of each call to fit() can be processed by a Tune Stopper, so if you want more granularity you may need to increase max_iters.
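Roughly, that mechanism can be illustrated with plain scikit-learn (tune-sklearn's internals may differ in detail; this is just a sketch of the chunked warm-start idea): a single fit with max_iter=1000 should land at essentially the same solution as 100 warm-started fits of up to 10 iterations each, and the chunked version exposes 100 intermediate points at which a score could be reported to Tune.

```python
import warnings

import numpy as np
from sklearn.datasets import make_regression
from sklearn.exceptions import ConvergenceWarning
from sklearn.linear_model import ElasticNet

warnings.filterwarnings("ignore", category=ConvergenceWarning)

X, y = make_regression(n_samples=200, n_features=20, noise=0.5, random_state=0)

# One-shot fit: up to 1000 solver iterations in a single call.
one_shot = ElasticNet(alpha=0.1, max_iter=1000).fit(X, y)

# Chunked fit: 100 calls of up to 10 iterations each, resuming every time.
chunked = ElasticNet(alpha=0.1, max_iter=10, warm_start=True)
for _ in range(100):
    chunked.fit(X, y)  # an intermediate score could be reported to Tune here

print(f"max coefficient difference: "
      f"{np.max(np.abs(one_shot.coef_ - chunked.coef_)):.4f}")
```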
