[raytune] Use Repeater with BasicVariantGenerator

We are using Ray 2.1 and are seeing errors with Repeater. Code to reproduce is below:

>>> from ray.tune.search.basic_variant import BasicVariantGenerator
>>> from ray.tune.search import Repeater
>>> algo=BasicVariantGenerator()
>>> algorithm = Repeater(algo, repeat=1)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/mtian/IdeaProjects/tf2-trainer_trunk/build/tf2-trainer-component-training/environments/development-venv/lib/python3.7/site-packages/ray/tune/search/repeater.py", line 127, in __init__
    metric=self.searcher.metric, mode=self.searcher.mode
AttributeError: 'BasicVariantGenerator' object has no attribute 'mode'

I think the root cause is that BasicVariantGenerator is a SearchAlgorithm, but Repeater needs a Searcher. Is there a Searcher class for the default random search algorithm (BasicVariantGenerator)?

This doesn’t seem to be supported right now. Can you open an issue on GitHub?

As a workaround, you can use e.g. OptunaSearch or HyperOptSearch with random samplers to achieve a similar outcome to the BasicVariantGenerator (though tune.grid_search is not supported with those).
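For reference, a minimal sketch of that first workaround, assuming Optuna is installed; the objective function, the "score" metric, the "lr" parameter, and the repeat/sample counts are placeholder values, not from the original question:

from optuna.samplers import RandomSampler
from ray import tune
from ray.tune.search import Repeater
from ray.tune.search.optuna import OptunaSearch

def objective(config):
    # Placeholder trainable; returning a dict reports the final metrics.
    return {"score": config["lr"]}

# Random sampling via Optuna, with each suggested config repeated 3 times.
searcher = OptunaSearch(sampler=RandomSampler(), metric="score", mode="max")
search_alg = Repeater(searcher, repeat=3)

tuner = tune.Tuner(
    objective,
    param_space={"lr": tune.uniform(1e-4, 1e-1)},
    tune_config=tune.TuneConfig(
        search_alg=search_alg,
        num_samples=9,  # total trials = 3 sampled configs x 3 repeats
    ),
)
results = tuner.fit()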

Another workaround is to pass constant_grid_search=True to the BasicVariantGenerator and a "repeat": tune.grid_search(range(num_repeats)) parameter to the config/param space. This will sample the random values first and then iterate through all grid search parameters, keeping the random values constant.
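A sketch of this second workaround, using the same placeholder objective; num_repeats and num_samples are arbitrary example values:

from ray import tune
from ray.tune.search.basic_variant import BasicVariantGenerator

def objective(config):
    # Placeholder trainable; config["repeat"] only drives repetition.
    return {"score": config["lr"]}

num_repeats = 3

param_space = {
    "lr": tune.uniform(1e-4, 1e-1),                        # random value, held constant across the grid
    "repeat": tune.grid_search(list(range(num_repeats))),  # grid axis that produces the repeats
}

tuner = tune.Tuner(
    objective,
    param_space=param_space,
    tune_config=tune.TuneConfig(
        search_alg=BasicVariantGenerator(constant_grid_search=True),
        num_samples=5,  # 5 random draws x 3 repeats = 15 trials
    ),
)
results = tuner.fit()

Because constant_grid_search=True samples the random values before expanding the grid, each group of "repeat" trials shares the same lr.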

If you have more grid search parameters, this may not be exactly what you’re looking for - let us know!

Is there a GitHub issue for this? If not, I can create one. I also have this problem.

No, there is no issue created for it.

There is now an issue for this: [Tune] Allow use of tune.search.Repeater with BasicVariantGenerator to allow K-fold CrossValidation · Issue #33677 · ray-project/ray · GitHub
