PyTorch export problems

How severely does this issue affect your experience of using Ray?

  • High: It blocks me from completing my task.

I’m having trouble exporting a trained PyTorch model with RLlib 2.6.3 (Python 3.9.19).

The script I’m using is:

import torch

import ray
from ray.rllib.policy.policy import Policy
from ray.tune.registry import register_env

# HLInfoInspEnv is my custom environment, imported from my own module.

if __name__ == "__main__":
    ray.init()

    # Register the custom env
    register_env("HLEnv", lambda env_config: HLInfoInspEnv(env_config))

    checkpoint_path = "./Policies/NO_LSTM_expr_20240404_100511/DQN_HLInfoInspEnv_1de3c_00000_0_2024-04-04_10-05-12/checkpoint_000002/policies/policy_0"
    policy_0 = Policy.from_checkpoint(checkpoint_path)

    # Extract and inspect the underlying torch model
    print("model summary")
    trch_model = policy_0.model
    print(trch_model)

    # print("trch_model state dict")
    # print(trch_model.state_dict())

    print("saving model")
    # policy_0.export_model(export_dir='./onnx_policy_export', onnx=1)
    torch.save(policy_0.model, './tf_policy_export/my_tf_policy.pt')

But I’m getting an error message like:
Traceback (most recent call last):
  File "C:\Users\henry.lei\Documents\Projects\Inspect-A-M4-RL\HL-Inspection-Env-Old\extract_model.py", line 44, in <module>
    torch.save(policy_0.model, './tf_policy_export/my_tf_policy.pt')
  File "C:\Users\henry.lei\Miniconda3\envs\m4-insp-ray-2.6.3\lib\site-packages\torch\serialization.py", line 629, in save
    _save(obj, opened_zipfile, pickle_module, pickle_protocol, _disable_byteorder_record)
  File "C:\Users\henry.lei\Miniconda3\envs\m4-insp-ray-2.6.3\lib\site-packages\torch\serialization.py", line 841, in _save
    pickler.dump(obj)
_pickle.PicklingError: Can't pickle <class 'ray.rllib.models.catalog.FullyConnectedNetwork_as_DQNTorchModel'>: attribute lookup FullyConnectedNetwork_as_DQNTorchModel on ray.rllib.models.catalog failed

Any ideas?
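
For reference, here is the kind of state_dict-only save I think I could fall back to. This is just a sketch (the output path is a placeholder, and I haven't verified the round trip); the idea is that torch.save would then only pickle the weight tensors, not the dynamically generated FullyConnectedNetwork_as_DQNTorchModel class:

import torch
from ray.rllib.policy.policy import Policy

policy_0 = Policy.from_checkpoint(checkpoint_path)  # same checkpoint path as in the script above

# Save only the weights, so pickle never has to look up the generated model class.
torch.save(policy_0.model.state_dict(), "./torch_policy_export/my_policy_state_dict.pt")

# To use the weights later, rebuild the policy from the same checkpoint and
# load the state dict back into its model:
# restored = Policy.from_checkpoint(checkpoint_path)
# restored.model.load_state_dict(torch.load("./torch_policy_export/my_policy_state_dict.pt"))

But ideally I'd like a supported way to get either a standalone .pt file or an ONNX export out of this checkpoint.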