How to restore part of the weights from a trained model into another model before training?

Hello all! As the title says, I'm essentially trying to do some transfer learning: moving model weights from one trained model into another before training. The catch is that the two models have different input/output layers, so I only want to transfer the weights of the layers in between, which means I haven't been able to use the regular "restore" option to restore from a checkpoint. I'd also like everything else in the new run to start as if it were a brand-new training, except that those specific layers are initialized to the pre-trained weights I've carried over. What would be the best way to accomplish this?
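
Concretely, I'm imagining something like the sketch below: build a trainer from the old checkpoint, build a fresh trainer for the new model, and copy over only the shared middle layers via `get_weights()`/`set_weights()`. The env names, checkpoint path, configs, and the layer-name filter (`"fc_1"`/`"fc_2"`) are all placeholders for illustration, not my real setup:

```python
import ray
from ray.rllib.agents.ppo import PPOTrainer

ray.init()

old_config = {"framework": "tf"}  # placeholder: the original training config
new_config = {"framework": "tf"}  # placeholder: config for the new model/env

# Trainer built with the *old* config/env, restored from the trained checkpoint.
old_trainer = PPOTrainer(config=old_config, env="CartPole-v0")   # stand-in for the old env
old_trainer.restore("/path/to/old/checkpoint/checkpoint-100")    # placeholder path

# Fresh trainer with the *new* config/env; everything starts from scratch.
new_trainer = PPOTrainer(config=new_config, env="CartPole-v1")   # stand-in for the new env

old_weights = old_trainer.get_policy().get_weights()
new_weights = new_trainer.get_policy().get_weights()

# Copy only the shared middle layers; the new input/output layers keep
# their fresh initialization. "fc_1"/"fc_2" are placeholder layer names.
shared = {k: v for k, v in old_weights.items()
          if "fc_1" in k or "fc_2" in k}
new_weights.update(shared)
new_trainer.get_policy().set_weights(new_weights)

# Push the spliced weights out to the rollout workers as well.
new_trainer.workers.foreach_worker(
    lambda w: w.set_weights({"default_policy": new_weights}))
```

That gets the weights where I want them, but it means juggling two trainers by hand outside of the normal restore/Tune flow, which is why I've been experimenting with the Trainer-subclass ideas below.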

I've been playing around with two ideas so far, but I'm not sure I'm on the right track. My main attempt has been to create a new Trainer class that inherits from `PPOTrainer` and override its `_restore` method to splice out the weights I don't want transferred. However, when I enable the "restore" option with a checkpoint from my trained model, that override seems to be ignored entirely (if I set a breakpoint inside it, it never gets hit). I've also tried overriding `load_checkpoint` instead and doing the policy modifications there, and that does seem to work. Would that be the right approach here?
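
For reference, here's roughly the kind of `load_checkpoint` override I have in mind. This is just a sketch, not my exact code: the layer names (`"fc_1"`/`"fc_2"`) are placeholders, and the way I dig into the pickled checkpoint (the `"worker"`/`"state"`/`"default_policy"` keys) is an assumption about the checkpoint layout that probably varies across Ray versions, so it would need to be checked in a debugger first:

```python
import pickle

from ray.rllib.agents.ppo import PPOTrainer

SHARED_LAYER_TAGS = ("fc_1", "fc_2")  # placeholder names of the shared layers


class TransferPPOTrainer(PPOTrainer):
    """PPOTrainer that only restores the shared middle-layer weights."""

    def load_checkpoint(self, checkpoint_path: str) -> None:
        # Unpickle the checkpoint ourselves instead of calling super(),
        # so nothing else (optimizer state, counters, filters) is restored.
        with open(checkpoint_path, "rb") as f:
            extra_data = pickle.load(f)

        # NOTE: this nesting is an assumption about how the Trainer and its
        # local RolloutWorker pickle their state; depending on the Ray version,
        # the per-policy entry may be the weights dict itself or wrap it under
        # a "weights" key. Inspect extra_data once to confirm.
        worker_state = pickle.loads(extra_data["worker"])
        old_policy_state = worker_state["state"]["default_policy"]
        old_weights = old_policy_state.get("weights", old_policy_state)

        # Keep only the weights of the shared middle layers.
        shared = {k: v for k, v in old_weights.items()
                  if any(tag in k for tag in SHARED_LAYER_TAGS)}

        # Apply them on top of the freshly initialized policy and push the
        # result out to the rollout workers.
        policy = self.get_policy()
        new_weights = policy.get_weights()
        new_weights.update(shared)
        policy.set_weights(new_weights)
        self.workers.foreach_worker(
            lambda w: w.set_weights({"default_policy": new_weights}))
```

Is something along these lines the intended way to do this, or is there a cleaner hook for partially initializing a policy from another model's checkpoint?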