Restoring from checkpoint

I am trying to restore a checkpoint using the restore functionality, with the code below:

def restore_agent(
    self,
    checkpoint_path: str = '',
    restore_search: bool = False,
    resume_unfinished: bool = True,
    resume_errored: bool = False,
    restart_errored: bool = False,
):
    # if restore_search:
    #     self.search_alg = self.search_alg.restore_from_dir(self.local_dir)
    if checkpoint_path == '':
        checkpoint_path = self.results.get_best_result().checkpoint._local_path

    restored_agent = tune.Tuner.restore(
        checkpoint_path,
        restart_errored=restart_errored,
        resume_unfinished=resume_unfinished,
        resume_errored=resume_errored,
    )

    self.results = restored_agent.fit()

    self.search_alg.save_to_dir(self.local_dir)
    return self.results

Then I did

res = drl_agent.restore_agent(checkpoint_path=best_result.checkpoint._local_path)

But I am getting an error that the correct path was not passed. What is the issue here? The docs are not clear on what to pass as the checkpoint path.
Also, restoring the scheduler and search space is confusing: what does it actually do, and what are the best practices if we run multiple Tune runs?
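For context, my mental model of the on-disk layout is below (directory and trial names are purely illustrative); part of my confusion is which level of this tree `Tuner.restore` expects, versus what an individual trial checkpoint is:

```python
from pathlib import Path

# Illustrative (assumed) layout of a Tune results directory:
#   ray_results/my_experiment/                         <- the experiment directory
#   ray_results/my_experiment/<trial>/checkpoint_000225/
#                                                      <- one trial's checkpoint
experiment_dir = Path("ray_results") / "my_experiment"
trial_checkpoint = experiment_dir / "PPO_MyEnv_00000" / "checkpoint_000225"

# a trial checkpoint sits two levels below the experiment directory
print(trial_checkpoint.parent.parent == experiment_dir)  # True
```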

I never tried it the way you are doing it in your script, but when I load from a checkpoint for serving I do it like this:

# DQN
# note: the import location of get_algorithm_class has moved between Ray versions
from ray.rllib.algorithms.registry import get_algorithm_class

algo_cls = get_algorithm_class("DQN")
algo = algo_cls(config=config)
checkpoint_path = "/path_to_folder/checkpoint_000225/rllib_checkpoint.json"
algo.restore(checkpoint_path)

# PPO
algo_cls = get_algorithm_class("PPO")
algo = algo_cls(config=config)
checkpoint_path = "/path_to_folder/checkpoint_000225/checkpoint-225"
algo.restore(checkpoint_path)
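Once restored, serving is basically a loop of `compute_single_action` calls. To make the loop shape runnable here without Ray, the algorithm and env below are stand-in stubs (the real ones come from `algo.restore(...)` and your env config):

```python
class StubAlgo:
    """Stand-in for a restored RLlib Algorithm (the real one comes from algo.restore)."""
    def compute_single_action(self, obs):
        # RLlib Algorithms expose compute_single_action for single-obs inference
        return 0  # fixed action, for illustration only

class StubEnv:
    """Stand-in for a Gym-style env; the real env comes from the training config."""
    def __init__(self):
        self.steps = 0
    def reset(self):
        self.steps = 0
        return 0.0
    def step(self, action):
        self.steps += 1
        done = self.steps >= 3          # episode ends after 3 steps
        return 0.0, 1.0, done, {}       # obs, reward, done, info

algo, env = StubAlgo(), StubEnv()
obs = env.reset()
done, total_reward = False, 0.0
while not done:
    action = algo.compute_single_action(obs)
    obs, reward, done, info = env.step(action)
    total_reward += reward
# total_reward is 3.0 after the three-step stub episode
```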

Hi @Blubberblub @amogkam
But here I am doing hyperparameter optimization, so the config comes out of the search and I can't just pass a fixed default config.