[RLlib] Visualise custom environment

If your model input is pixel data, this may be of use: ray/logger.py at 0452a3a435e023eada85f670e70ffef02ceb5943 · ray-project/ray · GitHub
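Skimming that logger code, it looks like the TBX logger treats a 5-dimensional numpy array as a video, so the value you log would probably need a shape along the lines of (batch, time, channels, height, width). That axis order is my assumption based on how tensorboardX's add_video works, so double-check it. A throwaway example of such an array:

import numpy as np

# One clip of 16 RGB frames at 64x64; a 5-D uint8 array like this should be
# picked up by the TensorBoard logger as a video rather than as a scalar.
dummy_video = np.random.randint(0, 255, size=(1, 16, 3, 64, 64), dtype=np.uint8)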

It looks like there is TensorBoard support for videos in Tune, so you could visualize the environment alongside your other custom data. Using RLlib callbacks, one could do something like:

from ray.rllib.agents.callbacks import DefaultCallbacks

class TBVideo(DefaultCallbacks):
    def on_train_result(self, *, trainer, result, **kwargs) -> None:
        # "my_model.my_input_images" is just a placeholder for wherever you keep the
        # frames; detach and convert to a numpy array before handing it to Tune.
        result["custom_metrics"]["my_video"] = my_model.my_input_images.detach().numpy()
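You would then pass the callback class in the trainer config when kicking off training, roughly like this (just a sketch; "PPO", the env, and the stopping criterion are placeholders for your own setup):

from ray import tune

tune.run(
    "PPO",
    config={
        "env": "CartPole-v0",      # replace with your custom (pixel) environment
        "callbacks": TBVideo,      # RLlib instantiates the callback class itself
    },
    stop={"training_iteration": 10},
)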

I have not tested this. Perhaps @sven1977 could say if this is a good idea or not.