Save and reuse checkpoints in Ray 2.0

Hi @suraj-gade,

One thing to check for checkpoint.to_directory("folder_path"): is the folder path specified as an absolute path? By default, Tune changes the working directory to the trial directory (or to the worker directory under the trial directory when using Train), so a relative path will land under the experiment log directory (e.g. ~/ray_results/experiment-name/trial-dir/).
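
As a minimal sketch of saving to an absolute path instead (the dummy checkpoint contents and the ~/my_checkpoints/run_1 folder below are placeholders for illustration):

```python
import os
from ray.air import Checkpoint

# Placeholder checkpoint; in practice this comes from your training loop.
checkpoint = Checkpoint.from_dict({"model_weights": [1, 2, 3]})

# Use an absolute path so the files don't end up under the trial's working directory.
save_dir = os.path.abspath(os.path.expanduser("~/my_checkpoints/run_1"))
checkpoint.to_directory(save_dir)

# Later, possibly from another script, reload the checkpoint from that path.
restored = Checkpoint.from_directory(save_dir)
print(restored.to_dict())
```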

Another way to access the checkpoints from another script is to get them from the experiment results:

from ray.tune import Tuner

results = Tuner.restore(path).get_results()
results[0].best_checkpoints  # [(checkpoint, metrics), ...]
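
For example, a rough sketch of pulling the best trial's checkpoints back out (the experiment path and the "loss" metric are assumptions; substitute the names from your own setup):

```python
import os
from ray.tune import Tuner

# Assumed experiment directory; point this at your own ~/ray_results run.
path = os.path.expanduser("~/ray_results/my_experiment")

results = Tuner.restore(path).get_results()

# Pick the best trial by a metric you reported to Tune ("loss" is assumed here).
best_result = results.get_best_result(metric="loss", mode="min")

# best_checkpoints holds the (Checkpoint, metrics) pairs kept for that trial.
for checkpoint, metrics in best_result.best_checkpoints:
    print(metrics.get("loss"), checkpoint)

# Materialize the trial's latest checkpoint to a folder for reuse elsewhere.
local_dir = best_result.checkpoint.to_directory("/tmp/best_checkpoint")
print(f"Checkpoint written to {local_dir}")
```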

See the Analyzing Tune Experiment Results guide in the Ray documentation for more info.