How can Ray Train worker processes store same-named values in a shared container?

I have a question. When using Ray Train for multi-process training, each worker process produces a value such as test_loss. I want to collect the value from each process and store it in a container for later use, instead of just printing it out. Is there any way to make such a container shared among the Ray Train workers?
This would be similar to using multiprocessing, where multiple processes can put data into a shared queue created with queue = Queue().
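For reference, here is the multiprocessing pattern I have in mind (a minimal sketch; the worker function and the loss values are made up):

```python
import multiprocessing as mp

def worker(rank, queue):
    # Pretend each process computed its own test_loss.
    test_loss = 0.1 * rank
    queue.put((rank, test_loss))

if __name__ == "__main__":
    queue = mp.Queue()  # shared container all processes can put into
    procs = [mp.Process(target=worker, args=(r, queue)) for r in range(4)]
    for p in procs:
        p.start()
    # Drain the queue before joining to avoid blocking on large payloads.
    results = [queue.get() for _ in procs]
    for p in procs:
        p.join()
    print(results)  # e.g. [(0, 0.0), (1, 0.1), ...]
```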

Hey @YeahNew, you should be able to use the Ray Train callbacks for this: Ray Train User Guide — Ray v1.8.0.

Inside the training function you can report any data via train.report(...), and then create a custom callback to aggregate the data reported from each worker.
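Something like this should work (a minimal sketch against the Ray 1.8 Train API; the two-worker torch setup and the fake loss values are just placeholders for your own training code):

```python
from ray import train
from ray.train import Trainer
from ray.train.callbacks import TrainingCallback

def train_func():
    # Placeholder training loop: report a fake test_loss each epoch.
    for epoch in range(3):
        test_loss = 1.0 / (epoch + 1)
        train.report(test_loss=test_loss)

class LossCollectorCallback(TrainingCallback):
    """Collects the values reported by every worker into one container."""

    def __init__(self):
        self.losses = []

    def handle_result(self, results, **info):
        # `results` is a list with one dict per worker for each report call.
        self.losses.append([r["test_loss"] for r in results])

callback = LossCollectorCallback()
trainer = Trainer(backend="torch", num_workers=2)  # requires torch installed
trainer.start()
trainer.run(train_func, callbacks=[callback])
trainer.shutdown()

# callback.losses now holds every worker's test_loss for later use.
print(callback.losses)
```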

Would this work for your use case?