Partial freeze and partial train

Hi @yiwc,

If you are using torch, you can write a function that freezes the layers of interest by setting requires_grad=False on those layers' parameters.
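
As a minimal sketch (the helper name and the toy model are just for illustration), that could look like:

```python
import torch.nn as nn

def freeze_layers(model: nn.Module, prefixes):
    """Set requires_grad=False on every parameter whose name starts with one of the prefixes."""
    for name, param in model.named_parameters():
        if any(name.startswith(p) for p in prefixes):
            param.requires_grad = False

# Toy example: freeze the first Linear of a small two-layer model.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 10))
freeze_layers(model, ["0."])  # parameter names here are "0.weight", "0.bias", "2.weight", ...
print([n for n, p in model.named_parameters() if p.requires_grad])  # only the last Linear stays trainable
```

When you build the optimizer you can also pass only the trainable parameters, e.g. filter(lambda p: p.requires_grad, model.parameters()).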

If you are using TF/Keras, you can set layer.trainable = False on the layers you want to freeze.
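
Something along these lines should work (layer names and the tiny model below are only placeholders):

```python
import tensorflow as tf

def freeze_layers(model, layer_names):
    """Mark the named layers as non-trainable so their weights stop being updated."""
    for layer in model.layers:
        if layer.name in layer_names:
            layer.trainable = False

# Toy example: freeze the first dense layer of a small model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(16,)),
    tf.keras.layers.Dense(64, activation="relu", name="dense_frozen"),
    tf.keras.layers.Dense(10, name="head"),
])
freeze_layers(model, {"dense_frozen"})
# Re-compile after changing trainable flags so the change takes effect.
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```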

With either framework, you could then create a new trainer that applies this function, similar to the method described in this post:
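
As a rough illustration (not the exact approach from the linked post), freezing most of a model before handing it to the Hugging Face Trainer could look like this; the model name and the "classifier" prefix are only placeholders, so inspect model.named_parameters() to find the right names for your model:

```python
from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# Freeze everything except the classification head.
for name, param in model.named_parameters():
    if not name.startswith("classifier"):
        param.requires_grad = False

training_args = TrainingArguments(output_dir="partial-freeze-out")
trainer = Trainer(model=model, args=training_args)  # add train_dataset/eval_dataset as usual
# trainer.train()
```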