Is there a way to run the same hyperparameter configuration multiple times?

I’m doing a simple hyperparameter random search where I want to try, say, 5 random hyperparameter configurations and run 5 trials for each configuration. Is there a way to do that easily?

try:

config = {"repeat": tune.grid_search(list(range(10)))}

Is “repeat” a special key or something?

Nope, it's just a dummy variable. Sorry about that!

Ah, I see. Setting it to a grid search keeps the other settings the same for each value of the dummy variable. Gotcha.

This doesn’t appear to be working; it just multiplies the total number of trials by “repeat”.

Hmm, weird – it should multiply the trial count, but the same configurations should also be rerun. What about trying a Repeater?
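The idea is that a Repeater wraps another Searcher and re-issues each suggested configuration `repeat` times, then averages the reported metric across the repeats. Roughly, as a pure-Python sketch of that idea (not Ray’s actual implementation – `RepeatingSearch` and the list-valued space here are made up for illustration):

```python
import random

class RepeatingSearch:
    """Toy repeat-wrapper: each freshly sampled config is suggested `repeat` times."""

    def __init__(self, space, repeat):
        self.space = space      # dict of name -> list of candidate values
        self.repeat = repeat
        self._pending = []      # copies of the current config still to hand out

    def suggest(self):
        if not self._pending:
            # Sample one fresh configuration, then queue it `repeat` times.
            config = {k: random.choice(v) for k, v in self.space.items()}
            self._pending = [dict(config) for _ in range(self.repeat)]
        return self._pending.pop()

space = {"lr": [0.1, 0.01, 0.001], "batch_size": [32, 64]}
search = RepeatingSearch(space, repeat=5)
trials = [search.suggest() for _ in range(10)]
# First 5 trials share one config, the next 5 share another.
assert all(t == trials[0] for t in trials[:5])
assert all(t == trials[5] for t in trials[5:])
```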

Yeah, a Repeater seems to be what I’m looking for. Minor gripe: I wish it defaulted to random search when no Searcher is specified. BasicVariantGenerator doesn’t subclass Searcher, so I can’t easily use it with random search.

OK that’s a good point. It’s odd that the random search / tune.grid_search doesn’t work for you though. If you have a chance, could you let me know what is output instead?

The problem is this line in the constructor, since it doesn’t have a mode or metric attribute:

super(Repeater, self).__init__(
    metric=self.searcher.metric, mode=self.searcher.mode)

I made this simple random searcher to get around it:

from typing import Dict, Optional

from ray.tune.sample import Domain
from ray.tune.suggest import Searcher


class RandomSearcher(Searcher):
    def __init__(
        self,
        space: Dict[str, Domain],
        metric: Optional[str] = None,
        mode: Optional[str] = None,
    ):
        super().__init__(metric, mode)
        self._space = space
        self._space = space

    def on_trial_complete(self, trial_id: str, result: Optional[Dict] = None, error: bool = False):
        pass

    def suggest(self, trial_id: str) -> Optional[Dict]:
        setting = {}
        for k, v in self._space.items():
            setting[k] = v.sample()
        return setting

    def save(self, checkpoint_path: str):
        pass

    def restore(self, checkpoint_path: str):
        pass

    def get_state(self) -> Dict:
        return {}

    def set_state(self, state: Dict):
        pass
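For anyone copy-pasting this: `suggest` just draws one independent sample per Domain, so you can sanity-check it without Ray at all. Quick sketch using a stand-in for Domain (anything with a `.sample()` method works; `UniformDomain` here is made up, the real one lives in `ray.tune.sample`):

```python
import random

class UniformDomain:
    """Stand-in for ray.tune.sample.Domain: only a .sample() method is needed."""

    def __init__(self, low, high):
        self.low, self.high = low, high

    def sample(self):
        return random.uniform(self.low, self.high)

space = {"lr": UniformDomain(1e-4, 1e-1), "momentum": UniformDomain(0.8, 0.99)}
# Mirrors RandomSearcher.suggest: one independent draw per hyperparameter.
setting = {k: v.sample() for k, v in space.items()}
assert set(setting) == {"lr", "momentum"}
```

With the real classes you’d then wrap it as `Repeater(RandomSearcher(space), repeat=5)` and pass that as `search_alg` to `tune.run`.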

Have you had a chance to look at this?

No, I haven’t. Alternatively, you could also use BasicVariantGenerator(points_to_evaluate=...) to manually inject the variables that you want to use.
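For example, to get 5 trials per configuration that way, you could pre-sample the configurations yourself and just repeat each one in the list. A sketch (the lr/batch_size space is made up; the tune.run call is left commented since it needs a trainable):

```python
import random

# Pre-sample 5 random configurations, then repeat each one 5 times.
configs = [
    {"lr": 10 ** random.uniform(-4, -1), "batch_size": random.choice([32, 64, 128])}
    for _ in range(5)
]
points_to_evaluate = [dict(c) for c in configs for _ in range(5)]
assert len(points_to_evaluate) == 25

# Then roughly:
# tune.run(
#     trainable,
#     search_alg=BasicVariantGenerator(points_to_evaluate=points_to_evaluate),
#     num_samples=len(points_to_evaluate),
# )
```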

The RandomSearcher class I made seems to be working just fine for me. I just wanted to ping you again because I didn’t know if you wanted me to submit an issue/feature request.

OK got it! Thanks. I already created this issue here: [tune] Support resolving grid search variables before random samples · Issue #15126 · ray-project/ray · GitHub
