Hello,
I am tuning hyperparameters with ray.tune and the AsyncHyperBandScheduler.
Each trial is a training loop that ends with an evaluation on the test set.
When the scheduler terminates a trial early, the trial is stopped and the test evaluation is skipped.
Is there a way to catch the trial termination and force the evaluation on the test set?
Here is a minimal example where I try to catch the termination with a try/except block; it does not work. If the termination were caught properly, this code would print “>>> test results computed before termination.” in the console.
How can I properly catch trial termination, so that I can run the test()
function even when the trial is terminated early?
Thank you for your help!
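For context, the fallback I have considered is to compute the test metric at every tune.report call, so that the most recent report is already complete if the trial is killed afterwards. Here is a ray-free sketch of that idea (train_and_eval and report are my own names, not Tune API):

```python
def train_and_eval(alpha, max_t, report):
    """Report train and test metrics together every iteration, so the most
    recent report is complete even if later iterations never run."""
    for t in range(max_t):
        accuracy = t * alpha
        test_accuracy = alpha * max_t  # stand-in for a real test-set pass
        report({'accuracy': accuracy, 'test_accuracy': test_accuracy})

logged = []
train_and_eval(0.5, 3, logged.append)
print(logged[-1])  # -> {'accuracy': 1.0, 'test_accuracy': 1.5}
```

This evaluates on the test set at every iteration, though, which is far too expensive in my real setup; that is why I would prefer to catch the termination instead. The minimal failing example follows.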
import time

from ray import tune
from ray.tune.schedulers import AsyncHyperBandScheduler

MAX_T = 50


def experiment(config, *, checkpoint_dir=None, **kwargs):
    alpha = config['alpha']
    try:
        train(alpha, MAX_T)
        test(alpha, MAX_T)
    except Exception as ex:
        # force the run of the test function,
        # even when the trial is terminated by the scheduler
        test(alpha, MAX_T, interrupted=True)
        raise ex


def test(alpha, max_t, interrupted=False):
    test_accuracy = alpha * max_t
    test_results = {'alpha': alpha, 'accuracy': test_accuracy}
    if interrupted:
        print(">>> test results computed before termination.")
    return test_results


def train(alpha, max_t):
    for t in range(max_t):
        accuracy = t * alpha
        tune.report(accuracy=accuracy)
        time.sleep(1e-4)


scheduler = AsyncHyperBandScheduler(
    time_attr='training_iteration',
    max_t=MAX_T + 1,
    grace_period=1,
)

runner = tune.run(
    experiment,
    mode='max',
    metric='accuracy',
    config={"alpha": tune.uniform(0, 1)},
    scheduler=scheduler,
    resources_per_trial={'gpu': 0, 'cpu': 1},
    num_samples=100,
    fail_fast=True,
    verbose=0,
)

print("Best hyperparameters found were:\n", runner.best_config)
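One detail that may matter (this is my assumption about how Tune stops a function trainable, not something I have verified): if the stop is delivered by raising SystemExit, or another BaseException subclass, inside the worker thread, then except Exception never fires, because SystemExit inherits from BaseException directly, not from Exception. A plain-Python sketch of that pitfall:

```python
def cleanup_on_stop():
    """Show that `except Exception` misses SystemExit,
    while `except BaseException` catches it."""
    try:
        raise SystemExit("trial stopped")
    except Exception:
        return "caught by Exception"
    except BaseException:
        return "caught by BaseException"

print(cleanup_on_stop())  # -> caught by BaseException
```

If that is what happens, widening my handler to except BaseException (and re-raising) might help, but if the actor is simply killed, no except clause would run at all, which is why I am asking here.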