I think you need to do something like the following so that Tune takes care of saving your checkpoints in a distributed setup:
import xgboost as xgb
from ray.tune.integration.xgboost import TuneReportCheckpointCallback

xgb.train(
    config,
    train_set,
    evals=[(test_set, "eval")],
    verbose_eval=False,
    # Reports eval metrics to Tune and saves a checkpoint each iteration.
    callbacks=[TuneReportCheckpointCallback(filename="model.xgb")],
)
Notice the callbacks argument here.
See Tuning XGBoost parameters — Ray 1.13.0
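For context, here's a minimal end-to-end sketch of how that trainable plugs into tune.run, following the pattern in the linked tutorial. The breast-cancer dataset and the search space are just illustrative; substitute your own data loading and parameters.

import xgboost as xgb
from ray import tune
from ray.tune.integration.xgboost import TuneReportCheckpointCallback
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

def train_breast_cancer(config):
    # Illustrative dataset; replace with your own data pipeline.
    data, labels = load_breast_cancer(return_X_y=True)
    train_x, test_x, train_y, test_y = train_test_split(
        data, labels, test_size=0.25
    )
    train_set = xgb.DMatrix(train_x, label=train_y)
    test_set = xgb.DMatrix(test_x, label=test_y)
    xgb.train(
        config,
        train_set,
        evals=[(test_set, "eval")],
        verbose_eval=False,
        # Metrics are reported under the "eval" prefix, e.g. "eval-logloss".
        callbacks=[TuneReportCheckpointCallback(filename="model.xgb")],
    )

config = {
    "objective": "binary:logistic",
    "eval_metric": ["logloss", "error"],
    "max_depth": tune.randint(1, 9),
    "eta": tune.loguniform(1e-4, 1e-1),
}
analysis = tune.run(
    train_breast_cancer,
    config=config,
    metric="eval-logloss",
    mode="min",
    num_samples=4,
)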
As a side note, we are migrating away from this API in favor of Ray AIR. Please also take a look here: Hyperparameter tuning with XGBoostTrainer — Ray 3.0.0.dev0
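For reference, here's a rough sketch of what the AIR equivalent looks like, assuming the Ray 2.x-style XGBoostTrainer/Tuner API described in that doc. The dataset, label column, and search space are again illustrative:

import ray
from ray import tune
from ray.air.config import ScalingConfig
from ray.train.xgboost import XGBoostTrainer
from ray.tune import Tuner, TuneConfig

from sklearn.datasets import load_breast_cancer

# Illustrative data: a pandas frame with a "target" label column.
df = load_breast_cancer(as_frame=True).frame
train_ds = ray.data.from_pandas(df)

trainer = XGBoostTrainer(
    scaling_config=ScalingConfig(num_workers=2),
    label_column="target",
    params={"objective": "binary:logistic", "eval_metric": ["logloss"]},
    datasets={"train": train_ds},
)

# The Tuner handles metric reporting and checkpointing for AIR trainers,
# so no manual callback wiring is needed.
tuner = Tuner(
    trainer,
    param_space={"params": {"max_depth": tune.randint(2, 8)}},
    tune_config=TuneConfig(metric="train-logloss", mode="min", num_samples=4),
)
results = tuner.fit()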