Ensure Ray Tune runs a function even if the trial is early stopped

Hi! Ray Tune is really easy to use and extremely powerful. I have one question: how can I make Ray Tune run a function even if the trial is early stopped while training a DL model?
I am using PyTorch Lightning and followed the tutorial on this page. My code looks something like this:


def train_and_test(config):
    ...
    trainer.callbacks.append(
        TuneReportCallback(
            {
                "loss": "valid_loss_epoch",
                "mean_accuracy": "valid_acc_epoch"
            },
            on="validation_end"
        )
    )
    trainer.fit(model, dataloader)
    trainer.test()


def tune_model(config):  # renamed from `tune` so it doesn't shadow the ray.tune module used below
    hyperparams_to_tune = {
        "lr": tune.loguniform(1e-5, 1e-3)
    }

    scheduler = ASHAScheduler(...)
    reporter = CLIReporter(
        parameter_columns=list(hyperparams_to_tune.keys()),
        metric_columns=["loss", "mean_accuracy", "training_iteration"]
    )

    analysis = tune.run(
        run_or_experiment=tune.with_parameters(
            train_and_test,
        ),
        metric="loss",
        mode="min",
        config=hyperparams_to_tune,
        num_samples=num_samples,
        scheduler=scheduler,
        progress_reporter=reporter,
    )

Currently, since ASHA supports early stopping, I believe that if Ray Tune decides a trial is not promising, it will stop that trial during training (inside trainer.fit), so trainer.test() never gets a chance to run. Is there a way to guarantee that Ray Tune runs this test for every trial, even when the trial is early stopped?

Thanks a lot!

Hmm, maybe you can try wrapping it in a try/finally block, registering an atexit handler, or using a with context manager?

All of these should work and will run a final routine when the trial is stopped. See: ray/test_trial_scheduler.py at 757866ec01985b6c3477c6a7215b75f630b1c9af · ray-project/ray · GitHub