TuneError: ('Trials did not complete'...)

Hi, I am working with RLlib on a custom env. I trained a DQN agent with the train method (i.e., agent.train()) and it went well. Now I would like to train on the same env with ray.tune.run, but I get the following error:

TuneError: ('Trials did not complete', [DQN_nesting_bad7c_00000, DQN_nesting_bad7c_00001, DQN_nesting_bad7c_00002, DQN_nesting_bad7c_00003])

nesting is the name of my custom env. Would you please let me know what the issue is? It works with the train method, so why does it not work with tune?

Thanks

2021-07-05 19:21:07,501	ERROR trial_runner.py:793 -- Trial DQN_nesting_5ff37_00000: Error processing event.
Traceback (most recent call last):
  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\tune\trial_runner.py", line 726, in _process_trial
    result = self.trial_executor.fetch_result(trial)
  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\tune\ray_trial_executor.py", line 489, in fetch_result
    result = ray.get(trial_future[0], timeout=DEFAULT_GET_TIMEOUT)
  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\worker.py", line 1452, in get
    raise value.as_instanceof_cause()
ray.exceptions.RayTaskError(ModuleNotFoundError): ray::DQN.train() (pid=6656, ip=192.168.213.146)
  File "python\ray\_raylet.pyx", line 443, in ray._raylet.execute_task
  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\worker.py", line 186, in reraise_actor_init_error
    raise self.actor_init_error
  File "python\ray\_raylet.pyx", line 477, in ray._raylet.execute_task
  File "python\ray\_raylet.pyx", line 481, in ray._raylet.execute_task
  File "python\ray\_raylet.pyx", line 482, in ray._raylet.execute_task
  File "python\ray\_raylet.pyx", line 436, in ray._raylet.execute_task.function_executor
  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\function_manager.py", line 553, in actor_method_executor
    return method(actor, *args, **kwargs)
  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\rllib\agents\trainer_template.py", line 106, in __init__
    Trainer.__init__(self, config, env, logger_creator)
  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\rllib\agents\trainer.py", line 477, in __init__
    super().__init__(config, logger_creator)
  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\tune\trainable.py", line 249, in __init__
    self.setup(copy.deepcopy(self.config))
  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\rllib\agents\trainer.py", line 562, in setup
    self.env_creator = _global_registry.get(ENV_CREATOR, env)
  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\tune\registry.py", line 140, in get
    return pickle.loads(value)
ModuleNotFoundError: No module named 'rl_pipeline'
Traceback (most recent call last):

  File "D:\ETHZ\housing_design\rl_pipeline\agent_rllib\train_nesting_tetris.py", line 163, in <module>
    tuner_analysis = tuner(env_config)

  File "D:\ETHZ\housing_design\rl_pipeline\agent_rllib\train_nesting_tetris.py", line 118, in tuner
    tuner_analysis = ray.tune.run(dqn.DQNTrainer,

  File "C:\Users\Reza\Anaconda3\envs\ray\lib\site-packages\ray\tune\tune.py", line 434, in run
    raise TuneError("Trials did not complete", incomplete_trials)

TuneError: ('Trials did not complete', [DQN_nesting_5ff37_00000])

Hi @deepgravity ,

I am not very experienced with RLlib so far, but the output you posted probably does not give most people enough to go on. Maybe you could run tune with a config that sets log_level to DEBUG?
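Something like this, for example (a minimal sketch; log_level is a standard RLlib config key, and the env name is taken from your output):

```python
# Hypothetical config sketch: raising log_level from the default (WARN)
# to DEBUG makes RLlib print much more detail during env and worker setup.
config = {
    "env": "nesting",      # your registered custom env name
    "log_level": "DEBUG",  # verbose output to help locate the failure
}

# Then pass it to tune as before, e.g.:
# ray.tune.run(dqn.DQNTrainer, config=config)
```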

Best regards

Hi, thanks for your reply. I just added more info to the first message.

Hi @deepgravity,

This issue indicates that rl_pipeline is not on your PYTHONPATH. Pickle is trying to reconstruct the Python object, but it cannot find its definition. Are you running on multiple nodes? If so, have you made sure the files exist in the same location on every node and that the location is on your PYTHONPATH?
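The important detail is that tune spawns the trainer in a separate Ray worker process, which must be able to import rl_pipeline on its own. A minimal sketch (assuming rl_pipeline lives under D:\ETHZ\housing_design, as your traceback suggests) is to export the path before calling ray.init(), so workers inherit it:

```python
import os
import sys

# Hypothetical path: the directory that CONTAINS the rl_pipeline package.
project_root = r"D:\ETHZ\housing_design"

# Make the package importable in this driver process...
if project_root not in sys.path:
    sys.path.insert(0, project_root)

# ...and export it so that Ray worker processes inherit it.
os.environ["PYTHONPATH"] = (
    project_root + os.pathsep + os.environ.get("PYTHONPATH", "")
)

# ray.init() must be called AFTER setting the variable,
# otherwise the workers start without it.
```

Modifying sys.path alone is not enough, because that change only affects the driver process, not the worker processes that tune launches.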

Thanks for your reply. Yes, I noticed that error and already added rl_pipeline to my PYTHONPATH. The thing is, the train method has no problem with it, but the tune method gives me this error. So I do not know what else to do, since I already added the path.

I do not use multiple nodes.