Ray
Failed to register a custom
RLlib

mannyv · March 15, 2023, 11:23am · #4
Hi @jiangzhangze,

This was a bug. It should be fixed in Ray 2.3 and nightly.