Custom environment registration error

Hi everyone,
I'm asking how to register a custom environment. I think I'm pretty much following the official documentation, but I'm running into trouble, and I'm not sure what I did wrong when registering the custom environment.
Someone suggested trying Ray 2.3.0, but the error is still the same.

Here is the code:

import gymnasium as gym
import numpy as np
from gymnasium.spaces import Discrete, Box
from ray.rllib.env.env_context import EnvContext
from ray.rllib.utils import check_env
from ray.tune.registry import register_env

# Test env
class SimpleCorridor(gym.Env):
"""Example of a custom env in which you have to walk down a corridor.
You can configure the length of the corridor via the env config."""

    def __init__(self, config: EnvContext):
        self.end_pos = 10
        self.cur_pos = 0
        self.action_space = Discrete(2)
        self.observation_space = Box(0.0, self.end_pos, shape=(1,), dtype=np.float32)
        # Set the seed. This is only used for the final (reach goal) reward.
        if isinstance(config, EnvContext):
            self.reset(seed=config["seed"] + config.worker_index + config.num_workers)
        else:
            self.reset(seed=config["seed"])

    def reset(self, *, seed=None, options=None):
        # random.seed(seed)
        self.cur_pos = 0
        return [self.cur_pos], {}

    def step(self, action):
        if action == 0 and self.cur_pos > 0:
            self.cur_pos -= 1
        elif action == 1:
            self.cur_pos += 1
        done = truncated = self.cur_pos >= self.end_pos
        # Give a constant reward of 1 at each step.
        return (
            [self.cur_pos],
            1,
            done,
            truncated,
            {},
        )

# Register the custom environment class
def env_creator(env_config):
    return SimpleCorridor(env_config)

register_env('corridor', env_creator)
check_env('corridor')

This code produces the error below:

/usr/local/lib/python3.10/dist-packages/ray/rllib/utils/pre_checks/env.py in check_env(env)
     90     except Exception:
     91         actual_error = traceback.format_exc()
---> 92         raise ValueError(
     93             f"{actual_error}\n"
     94             "The above error has been found in your environment! "

ValueError: Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/ray/rllib/utils/pre_checks/env.py", line 69, in check_env
raise ValueError(ValueError: Env must be of one of the following supported types: BaseEnv, gymnasium.Env, gym.Env, MultiAgentEnv, VectorEnv, RemoteBaseEnv, ExternalMultiAgentEnv, ExternalEnv, but instead is of type <class 'str'>.

The above error has been found in your environment! We've added a module for checking your custom environments. It may cause your experiment to fail if your environment is not set up correctly. You can disable this behavior via calling 
`config.environment(disable_env_checking=True)`. You can run the environment checking module standalone by calling ray.rllib.utils.check_env([your env]).

Hi @Yong,

check_env does not accept a string; you have to pass it a fully instantiated environment object. If the instantiated environment passes the check, then the registered string will work with AIR.
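
For example, here is a minimal sketch of what that looks like (the {"seed": 0} env_config and the PPO usage are just illustrative assumptions; any Ray 2.x AlgorithmConfig would work the same way):

from ray.rllib.utils import check_env
from ray.rllib.algorithms.ppo import PPOConfig

# Check a concrete instance of the env, not the registered name string.
# Passing a plain dict hits the `else` branch of your __init__.
env = SimpleCorridor({"seed": 0})
check_env(env)

# The registered name is what you pass to an algorithm config, e.g. PPO:
config = PPOConfig().environment(env="corridor", env_config={"seed": 0})
algo = config.build()

When RLlib builds the algorithm, it wraps env_config in an EnvContext before calling your registered creator, so the worker_index/num_workers branch of your __init__ is the one taken on the rollout workers.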