Ray Tune: logging hyperparameters to TensorBoard (solved)

I just started using Ray Tune and got my first experiment up and running in PyTorch in around 3-4 hours. I would like to see the hyperparameters for each Ray Tune run in TensorBoard. I know that TensorFlow has HParams, which supports this automatically, but can something similar be done for PyTorch without modifying my code extensively?

I am thinking that since Ray Tune internally knows which trial used which hyperparameters, it should not be too difficult to expose those parameters in the TensorBoard logs. Or is it?


My bad, I see the hparams are actually there!