1. Severity of the issue: (select one)
None: I’m just curious or want clarification.
2. Environment:
- Ray version: 2.44.0
- Python version: 3.12
- OS: Ubuntu
- Cloud/Infrastructure: N/A
- Other libs/tools (if relevant): N/A
3. What happened vs. what you expected:
I found that the attention-related configuration options have been removed from `DefaultModelConfig`, so I am wondering whether it would be possible to provide an example of using an attention mechanism in RLlib under the new API stack.
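For reference, below is a rough sketch of the workaround I currently use: falling back to the old API stack, whose model config dict still exposes the GTrXL attention keys. This is only a sketch under my own assumptions; I have not verified that every key below still behaves the same in 2.44.

```python
from ray.rllib.algorithms.ppo import PPOConfig

config = (
    PPOConfig()
    # Fall back to the old API stack, whose MODEL_DEFAULTS still
    # contains the attention-related keys.
    .api_stack(
        enable_rl_module_and_learner=False,
        enable_env_runner_and_connector_v2=False,
    )
    .environment("CartPole-v1")
    .training(
        model={
            # Enable the GTrXL attention net (old-stack keys; assumed
            # to still be honored in 2.44).
            "use_attention": True,
            "attention_num_transformer_units": 1,
            "attention_dim": 64,
            "attention_num_heads": 2,
            "attention_head_dim": 32,
            "attention_memory_inference": 50,
            "attention_memory_training": 50,
        }
    )
)
algo = config.build()
result = algo.train()
```

An equivalent example on the new API stack (e.g. a custom `RLModule` wrapping an attention encoder) is what I am hoping could be added to the docs.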
Many thanks for considering my request.