Tuner not returning best checkpoint

High - impacting my task and I am blocked

I am currently creating a LightGBM trainer for multiclass classification and want to tune it with the Tuner class.

Once I call tuner.fit(), I get the result grid and then try to access the best result's checkpoint. However, that checkpoint is None. Is there a specific flag I need to set in order to retrieve the checkpoint?
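Roughly, this is the pattern I am following (a trimmed-down sketch on a toy dataset, not my actual script):

```python
import ray
from ray import tune
from ray.train import ScalingConfig
from ray.train.lightgbm import LightGBMTrainer
from sklearn.datasets import load_iris

# Small multiclass dataset just to illustrate the pattern.
df = load_iris(as_frame=True).frame
df.columns = ["f0", "f1", "f2", "f3", "target"]
train_ds = ray.data.from_pandas(df)
valid_ds = ray.data.from_pandas(df)

trainer = LightGBMTrainer(
    label_column="target",
    params={"objective": "multiclass", "num_class": 3, "metric": "multi_logloss"},
    datasets={"train": train_ds, "valid": valid_ds},
    scaling_config=ScalingConfig(num_workers=2),
)

tuner = tune.Tuner(
    trainer,
    param_space={"params": {"num_leaves": tune.randint(20, 60)}},
    tune_config=tune.TuneConfig(
        num_samples=2,
        metric="valid-multi_logloss",  # I think the keys are <dataset key>-<lightgbm metric>
        mode="min",
    ),
)

result_grid = tuner.fit()
best_result = result_grid.get_best_result()
print(best_result.checkpoint)  # -> None
```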

Moved. This is a Tune question.

Can you share your script?

Hi @Akarsh_Bhagavath,

This section of the Train user guide on XGBoost/LightGBM might help you out: XGBoost & LightGBM User Guide for Ray Train — Ray 3.0.0.dev0. Checkpointing may not be enabled for you at the moment.
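For reference, below is a rough sketch of what explicitly enabling checkpointing could look like. The checkpoint_frequency value, dataset keys, and column names are just placeholders on my end, and on older Ray versions these config classes are imported from ray.air instead of ray.train:

```python
from ray.train import CheckpointConfig, RunConfig, ScalingConfig
from ray.train.lightgbm import LightGBMTrainer

trainer = LightGBMTrainer(
    label_column="target",
    params={"objective": "multiclass", "num_class": 3, "metric": "multi_logloss"},
    datasets={"train": train_ds, "valid": valid_ds},  # same Ray Datasets as in your snippet
    scaling_config=ScalingConfig(num_workers=2),
    # Explicitly ask the trainer to write checkpoints.
    run_config=RunConfig(
        checkpoint_config=CheckpointConfig(
            checkpoint_frequency=10,  # save every 10 boosting rounds
            checkpoint_at_end=True,   # always keep the final model
        )
    ),
)

# The Tuner setup and tuner.fit() stay the same; afterwards
# result_grid.get_best_result().checkpoint should be a Checkpoint rather than None.
```

If you also need the trained booster afterwards, LightGBMTrainer.get_model(best_result.checkpoint) should give it to you on recent Ray versions.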

Let me know if that solves the issue!