I was browsing through the examples in
unity3d_env_local.py and noticed that the GridWorld environment (which supports optional discrete action masking) is missing. Is it possible to take the ML-Agents action mask and pass it through to RLlib's Unity3DEnv class?
If so, are there any examples of this available? I would like to use Ray Tune's hyperparameter tuning on my own project, but my project relies on action masking.