Early stopping rules for ASHAScheduler

How severely does this issue affect your experience of using Ray?

  • None: Just asking a question out of curiosity

I’m using Ray Tune with PyTorch Lightning and am a little confused about how the early stopping rules combine. I’m using both ASHAScheduler from Ray and EarlyStopping from PyTorch Lightning.

If I set the max_t parameter of ASHAScheduler very high, does training of the best model continue

  • until Lightning stops the training?
  • until ASHAScheduler stops the training?
  • until ASHAScheduler or Lightning stops the training (whichever comes first)?
  • until max_t?

I tried reading this guide and docs for ASHAScheduler but couldn’t find answers.

Hey @EliasR, training will be terminated if any of the conditions are hit!
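In other words, the effective stopping point is whichever rule fires first. Here's a minimal pure-Python sketch of how the rules combine; the function name and the iteration numbers are purely illustrative, not part of the Ray or Lightning APIs:

```python
# Sketch: how three independent stopping rules combine.
# Names and numbers are illustrative, not Ray/Lightning APIs.

def effective_stop_iteration(asha_stop, lightning_stop, max_t):
    """Training ends at the first rule that fires.

    asha_stop:      iteration at which ASHA would cut the trial
                    (None if ASHA never cuts it, e.g. the best trial)
    lightning_stop: iteration at which Lightning's EarlyStopping fires
                    (None if the monitored metric keeps improving)
    max_t:          the hard cap passed to ASHAScheduler
    """
    candidates = [it for it in (asha_stop, lightning_stop, max_t) if it is not None]
    return min(candidates)

# The best trial is never cut by ASHA, so with a very high max_t
# it runs until Lightning's EarlyStopping fires:
print(effective_stop_iteration(None, 37, 10_000))  # -> 37
```

So with a very high max_t, the best trial in practice runs until Lightning's EarlyStopping triggers (since ASHA doesn't cut the top-performing trial).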

Thanks for the quick answer. Is there a way to report why the trials were terminated?

Unfortunately, we currently don’t report the reason for trial termination; maybe you can infer it from the metrics (e.g. the final iteration number)?
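One heuristic along those lines: compare each trial's final reported iteration against max_t and against ASHA's rung boundaries (which, for the standard successive-halving schedule, fall at grace_period * reduction_factor**k). A rough, best-effort sketch; the function and its inputs are my own illustration, not a Ray API:

```python
def infer_stop_reason(final_iteration, max_t, asha_milestones):
    """Best-effort guess at why a trial stopped.

    final_iteration: the last training_iteration the trial reported
    max_t:           the max_t passed to ASHAScheduler
    asha_milestones: the rung boundaries ASHA stops/promotes at,
                     e.g. {grace_period * reduction_factor**k, ...}
    """
    if final_iteration >= max_t:
        return "max_t"
    if final_iteration in asha_milestones:
        return "asha (stopped at a rung)"
    return "lightning EarlyStopping (or trial error)"

# Example rungs for grace_period=1, reduction_factor=4, max_t=100:
milestones = {1, 4, 16, 64}
print(infer_stop_reason(100, 100, milestones))  # -> max_t
print(infer_stop_reason(16, 100, milestones))   # -> asha (stopped at a rung)
print(infer_stop_reason(23, 100, milestones))   # -> lightning EarlyStopping (or trial error)
```

It's ambiguous when Lightning happens to stop exactly at a rung boundary, but for most trials this narrows the cause down well enough.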

I’ll try to figure something out. Thank you