Can the Ray cluster be restarted during a Ray Tune run?

I’m trying to use Ray Tune to tune the hyperparameters of a heavy job. I want the Ray cluster to be restarted whenever a trial is early-stopped, or to somehow bring the cluster into a clean state before trying another parameter set.

Any ideas?

Does ray.shutdown() work for you?

I’m not sure how to use it internally in Ray Tune.


tuner1 = Tuner(...)
result1 = tuner1.fit()
ray.shutdown()
tuner2 = Tuner(...)
result2 = tuner2.fit()

Does something like this work for you?
I don’t think you should call it internally, from inside a Ray Tune trial.

Oh, actually, what I really want is a clean state during each trial of the fit. My application is quite complex; I don’t think each trial can be stopped early and hand over a clean Ray cluster to the next trial.

Could you elaborate more? Some pseudo code would be helpful.

Example copied from Tune: Scalable Hyperparameter Tuning — Ray 2.2.0

from ray import tune

# 1. Define an objective function.
def objective(config):
    score = config["a"] ** 2 + config["b"]
    return {"score": score}

# 2. Define a search space.
search_space = {
    "a": tune.grid_search([0.001, 0.01, 0.1, 1.0]),
    "b": tune.choice([1, 2, 3]),
}

# 3. Start a Tune run and print the best result.
tuner = tune.Tuner(objective, param_space=search_space)
results = tuner.fit()
# What I need: while the parameter space is being explored,
# I want to ensure Ray is in a clean state between each trial (parameter combination).
# E.g., after trying {"a": 0.001, "b": 1}, shut down Ray, then restart it
# for the next round of parameter testing.

print(results.get_best_result(metric="score", mode="min").config)

I guess my question is: what kind of “non-clean state” are you observing with each trial?

I haven’t tried it with Ray Tune yet. But with my current job, whenever a run ends, the object store doesn’t appear to be empty until I shut down/restart the Ray cluster. I’ll try it with Tune to see whether my concern is valid.