RLlib integration with ML-Agents

How severely does this issue affect your experience of using Ray?

  • High: It blocks me from completing my task.

I am trying to train agents in a Unity 3D environment. I tried to gather the obs_space and action_space for this environment.


This is the code that I implemented in Python.

Unfortunately, this does not work and produces the error message: `'tuple' object has no attribute 'dtype'`.
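This error typically means the observation/action spaces were passed to RLlib as plain Python tuples (e.g. shapes or bound pairs) rather than as `gym.spaces` objects, which carry the `.dtype` attribute RLlib reads. A minimal sketch of defining proper spaces, assuming a hypothetical Unity env with an 8-dim vector observation and 2 continuous actions (the sizes are placeholders, not taken from my actual env):

```python
import numpy as np
import gym

# A plain tuple has no .dtype, which is exactly what the error complains about:
assert not hasattr((8,), "dtype")

# RLlib expects gym.spaces objects instead. Hypothetical sizes:
# 8-dim continuous observation, 2 continuous actions in [-1, 1].
obs_space = gym.spaces.Box(low=-np.inf, high=np.inf, shape=(8,), dtype=np.float32)
action_space = gym.spaces.Box(low=-1.0, high=1.0, shape=(2,), dtype=np.float32)

# Unlike a tuple, a Box carries dtype and shape, so RLlib can consume it.
print(obs_space.dtype)     # float32
print(action_space.shape)  # (2,)
```

If the observations from ML-Agents arrive as a tuple of several arrays (e.g. vector obs plus visual obs), `gym.spaces.Tuple` or `gym.spaces.Dict` wrapping individual `Box` spaces would be the corresponding construction.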

Does anyone know how to debug this so I can run training with RLlib?