I want to run the PPO example on a GPU. Here is my training code:
import ray
from ray.rllib.agents.ppo import PPOTrainer

ray.init()
trainer = PPOTrainer(config={
    "env": "CartPole-v0",   # classic control environment
    "framework": "torch",   # use the PyTorch backend
    "num_gpus": 1,          # GPUs allocated to the trainer process
    "num_workers": 4,       # parallel rollout workers
})
trainer.train()
However, it fails with an error like:
/ray/rllib/policy/torch_policy.py", line 155, in __init__
self.device = self.devices[0]
IndexError: list index out of range
I installed PyTorch with GPU support, and torch.cuda.is_available() returns True.
It seems like Ray cannot find the GPU devices even though PyTorch can.
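For reference, here is a minimal sketch of how I am comparing what PyTorch and Ray each see (ray.cluster_resources() reports the resources Ray detected at startup, so the exact output depends on the machine):

import ray
import torch

ray.init()

# What PyTorch sees:
print(torch.cuda.is_available())   # True in my setup
print(torch.cuda.device_count())

# What Ray sees: if autodetection worked, this dict should contain a "GPU" key
print(ray.cluster_resources())

In my case the "GPU" key appears to be missing from ray.cluster_resources(), which would explain why the policy ends up with an empty device list.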
Is there any solution to this?
Thanks!