RLlib integration with ML-Agents

How severely does this issue affect your experience of using Ray?

  • High: It blocks me from completing my task.

I am trying to train agents in a Unity 3D environment. I tried to gather the obs_space and action_space for this environment.


This is the code that I implemented in Python.

Unfortunately, this does not work and produces an error message (Error: 'tuple' object has no attribute 'dtype').
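This error typically means a plain Python tuple was passed where RLlib expects a `gym.spaces.Space` object (which carries a `dtype`). A minimal sketch of the difference, using illustrative bounds rather than the original code:

```python
# A plain tuple of bounds has no .dtype attribute -- the likely source of
# the "'tuple' object has no attribute 'dtype'" error.
obs_space_wrong = (-1.0, 1.0)  # hypothetical (low, high) tuple
print(hasattr(obs_space_wrong, "dtype"))  # prints False

# RLlib expects a gym.spaces.Space; a Box, for example, does carry a dtype:
#   import gym
#   import numpy as np
#   obs_space = gym.spaces.Box(low=-1.0, high=1.0, shape=(8,), dtype=np.float32)
#   obs_space.dtype  # a numpy dtype such as float32
```

If your obs_space or action_space is built as a tuple, wrapping the bounds in the appropriate `gym.spaces` class may resolve the error.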

Does anyone know how to debug this so I can run training with RLlib?

Can you paste a complete reproduction script?
Obviously a tuple space has no dtype, but where does the error occur?

When debugging Ray RLlib, the usual starting point is to call ray.init(local_mode=True) and then set a breakpoint where the error occurs, so you can inspect the variables around that location and see what's wrong.