'use_lstm' wrapping in older and newer Ray versions

How severe does this issue affect your experience of using Ray?

  • High: It blocks me to complete my task.

Context: Trying to find out if use_lstm=True works the same way in Ray 1.1 and Ray 1.11 for custom models.

I want to see if use_lstm=True wraps an LSTM over my custom model in an old Ray version (1.1).

To be more precise, I want to see if there is an LSTM added to this FCNet or not:

from ray import tune
from ray.rllib.models.torch.fcnet import FullyConnectedNetwork
from ray.rllib.models.catalog import ModelCatalog

# Register the built-in FCNet as a "custom" model.
ModelCatalog.register_custom_model("FCNet", FullyConnectedNetwork)

tune.run(
    "PPO",
    config={
        "env": "CartPole-v0",
        "model": {
            "use_lstm": True,
            "custom_model": "FCNet",
            "custom_model_config": {},
        },
        "framework": "torch",
    },
)

I expected to find this LSTM wrapping happening inside the LSTMWrapper class in ray/recurrent_net.py at releases/1.1.0 · ray-project/ray · GitHub, but neither debugging, printing, nor asserting inside that code has any effect during training, as if it's not being used at all.
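
A minimal sketch of how this could be checked from outside the library (assuming Ray 1.1's ray.rllib.agents.ppo.PPOTrainer and that the torch policy exposes its model as policy.model):

import ray
from ray.rllib.agents.ppo import PPOTrainer
from ray.rllib.models.catalog import ModelCatalog
from ray.rllib.models.torch.fcnet import FullyConnectedNetwork

ModelCatalog.register_custom_model("FCNet", FullyConnectedNetwork)

ray.init()
trainer = PPOTrainer(config={
    "env": "CartPole-v0",
    "model": {
        "use_lstm": True,
        "custom_model": "FCNet",
        "custom_model_config": {},
    },
    "framework": "torch",
})

model = trainer.get_policy().model
# If the LSTM auto-wrapping were applied, LSTMWrapper should show up in the MRO.
print(type(model))
print([c.__name__ for c in type(model).__mro__])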

Removing the custom_model keys from the config shows the expected result for "use_lstm": True: the LSTMWrapper from recurrent_net.py is used to wrap the default model:

from ray import tune
# from ray.rllib.models.torch.fcnet import FullyConnectedNetwork
# from ray.rllib.models.catalog import ModelCatalog

# ModelCatalog.register_custom_model("FCNet", FullyConnectedNetwork)

tune.run(
    "PPO",
    config={
        "env": "CartPole-v0",
        "model": {
            "use_lstm": True,
            # "custom_model": "FCNet",
            # "custom_model_config": {},
        },
        "framework": "torch",
    },
)

So I suspect that no LSTM wrapping is applied on top of the FullyConnectedNetwork, because it is registered as a custom model.

Is this correct? I believe newer versions of Ray (like 1.11) wrap an LSTM around any model, custom or not, when "use_lstm": True is set.
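
For comparison, in a newer Ray version the same kind of check can be made explicit against the wrapper class (assuming LSTMWrapper still lives in ray.rllib.models.torch.recurrent_net there):

from ray.rllib.models.torch.recurrent_net import LSTMWrapper

model = trainer.get_policy().model  # trainer built as in the snippet above
# My expectation: True in Ray 1.11 even with a custom model,
# but apparently False in Ray 1.1 when custom_model is set.
print(isinstance(model, LSTMWrapper))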

Edit: Probably the magic is happening here: ray/catalog.py at c8f8a8e51005cb3aab64a9e98f6e709b4976c03b · ray-project/ray · GitHub, where model_interface==None is received for a custom model, but model_interface==<class 'ray.rllib.models.torch.recurrent_net.LSTMWrapper'> for a default model.
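
If I read that right, the wrapping itself is just dynamic subclassing of the model class with the wrapper interface, so the custom-model path skips it whenever model_interface is None. A simplified sketch of that idea (not the actual RLlib code; the function name is mine):

def wrap_if_needed(model_cls, model_interface):
    """Illustration of the wrapping step as I understand it from catalog.py."""
    if model_interface is None:
        # Custom-model path in Ray 1.1: the class is returned untouched,
        # so no LSTM ends up on top of it.
        return model_cls

    # Default-model path: the model class is combined with the interface
    # (e.g. LSTMWrapper), which overrides forward() / get_initial_state().
    class Wrapped(model_interface, model_cls):
        pass

    Wrapped.__name__ = model_cls.__name__
    return Wrapped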