Reporting custom validation and training metrics

I am new to Ray Tune. I was able to run a Keras neural network with grid search using Ray Tune, but I am unable to report custom metrics. I would like to report training and validation AUC values in addition to accuracy, and to use AUC as a stopping criterion, since Tune is overfitting when I use accuracy alone. How do I first calculate and then report training and validation AUC?

cc @kai Can you address the question?

Hi @gagasth, can you share the code you’re currently running? Especially the trainable function would be interesting.

Generally you report custom metrics via tune.report() in the trainable function.

Hi @kai, thanks for the response. I am looking to stop if (testAUC - trainAUC > 0.1). I have attached a screenshot of my code. Not shown in the screenshot are the dataloader function and the libraries I have imported.

List of libraries
from ray.tune.schedulers import AsyncHyperBandScheduler
from ray.tune.integration.keras import TuneReportCallback

@kai and @sangcho : Just checking to see if you had any questions or solutions for me? Thank you.

Hi @gagasth, I’ve actually looked into this. The problem here seems to be that Keras callbacks either report the training results or the test results, but not both at the same time. Thus the TuneReportCallback might not work for your use case.

You need to get the training and test evaluation scores together. You could potentially use a stateful Keras callback for that.

Basically you’ll need to have train_auc and test_auc available together, and then do something like tune.report(train_auc=train_auc, test_auc=test_auc, auc_diff=test_auc - train_auc).

For early stopping, you’ll then have to implement a custom Stopper for that specific early stopping mechanism. Alternatively, you can just stop training within the trainable function, e.g. by setting self.model.stop_training = True in the callback.

Ok, thank you. I am new to this, and an example would be helpful. Would you happen to have an example using a stateful Keras callback?