HParam TensorBoard logging with PyTorch?

On the TensorBoard page it states:

“If using TF2, Tune also automatically generates TensorBoard HParams output, as shown below:”

Is it possible to get this to work when using PyTorch (specifically PyTorch Lightning)? I’ve tried self.save_hyperparameters() and managed to save the hparams, but I haven’t found a way to pass the metrics properly. I’ve tried the PyTorch Lightning call

self.logger.experiment.add_hparams(self.hparams_dict, metrics_dict)

but this doesn’t work inside callbacks, so it’s unclear to me how to pass the metrics properly.
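For reference, here is a minimal sketch of the pattern I’m after, using torch.utils.tensorboard directly rather than the Lightning logger. The hyperparameter names and metric values are just placeholders; the point is that add_hparams takes both dicts in one call so TensorBoard’s HPARAMS tab can associate them with the run:

```python
# Minimal sketch (assumes only torch is installed): log hparams together
# with final metric values so they show up in TensorBoard's HPARAMS tab.
import os
import tempfile

from torch.utils.tensorboard import SummaryWriter

log_dir = tempfile.mkdtemp()

hparams = {"lr": 1e-3, "batch_size": 32}  # placeholder hyperparameters
metrics = {"hparam/val_loss": 0.42}       # placeholder final metrics

writer = SummaryWriter(log_dir=log_dir)
# add_hparams writes the hparam config and the metric values together,
# which is what links them in the HPARAMS dashboard.
writer.add_hparams(hparams, metrics)
writer.close()

# add_hparams also creates a timestamped run subdirectory with its own
# event file alongside the top-level one.
print(sorted(os.listdir(log_dir)))
```

Doing this manually at the end of training works, but it sidesteps the Lightning logger, which is why I was hoping the integration would handle it.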

I’m wondering how to do this as well. Everything in the docs suggests it should happen automatically, at least for the metrics you have Ray Tune observe in the TuneReportCallback instance. Any help would be greatly appreciated; I’m not sure I can really use Ray Tune until this is resolved.


I think I found a bug in TBXLoggerCallback that is causing this. Please see the issue I opened: