I’d like to use a sparse matrix as a recurrent state. These are organised as a tensor of indices and a tensor of values, which greatly improves both memory efficiency and computational efficiency (no multiplying a bunch of zeros), but the tensors change size as new entries are added to the matrix. It seems that `rllib`/`numpy` does not like serialising arrays of different shapes into rollouts.
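For concreteness, here is a minimal NumPy sketch of the state layout I mean, along with the fixed-capacity padding workaround I could fall back on (`pad_state` is just an illustrative helper, not anything from RLlib):

```python
import numpy as np

# Sparse state in COO form: an (nnz, 2) array of coordinates and an
# (nnz,) array of values. nnz grows as entries are added.
indices = np.array([[0, 1], [2, 3]])  # row/col coordinates
values = np.array([1.0, 2.0])

# Adding one entry changes both shapes from nnz=2 to nnz=3, which is
# what fixed-shape rollout serialisation cannot handle.
indices = np.vstack([indices, [[4, 0]]])
values = np.append(values, 3.0)
assert indices.shape == (3, 2) and values.shape == (3,)

# Possible workaround: pad to a fixed capacity so every step of the
# rollout has a static shape; a boolean mask marks the real entries.
MAX_NNZ = 8

def pad_state(indices, values, max_nnz=MAX_NNZ):
    nnz = values.shape[0]
    idx = np.zeros((max_nnz, 2), dtype=indices.dtype)
    val = np.zeros(max_nnz, dtype=values.dtype)
    mask = np.zeros(max_nnz, dtype=bool)
    idx[:nnz] = indices
    val[:nnz] = values
    mask[:nnz] = True
    return idx, val, mask

idx, val, mask = pad_state(indices, values)
```

Padding works but defeats much of the point, since memory is again bounded by the dense capacity rather than the number of non-zero entries.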

I was wondering if anyone is aware of a way to use dynamically-shaped recurrent states with `TorchModelV2`.