Great question, @raoul-khour-ts! There is no good reason for these defaults to differ between tf and torch. We should fix this. …
@sven1977 If there is a reason to change the initializer, maybe the better option would be to make the Xavier/Glorot initializer the default?
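For reference, a minimal sketch of what switching a torch fully connected layer to Xavier/Glorot init could look like (the layer sizes here are placeholders, not RLlib's actual defaults):

```python
import torch.nn as nn

# Hypothetical hidden layer; sizes are illustrative only.
layer = nn.Linear(256, 256)

# Glorot/Xavier uniform for the weights, zeros for the bias.
nn.init.xavier_uniform_(layer.weight)
nn.init.zeros_(layer.bias)
```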
Also a good point (for CNNs, that's what we already use). The thing is, I don't want to change the default without at least running some benchmarks again.
I did change torch's vf layer to normc init with std=0.01, just like tf's fcnet does. The risk of that should be relatively low:
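For anyone following along, here is a rough sketch of what that torch-side change amounts to (not the actual PR code; the layer sizes are placeholders and the scaling axis roughly mirrors RLlib's torch normc helper):

```python
import torch
import torch.nn as nn

def normc_init_(tensor: torch.Tensor, std: float = 0.01) -> None:
    # "normc"-style init: sample from N(0, 1), then rescale so each
    # output unit's weight vector has L2 norm equal to `std`.
    with torch.no_grad():
        tensor.normal_(0.0, 1.0)
        tensor.mul_(std / torch.sqrt(tensor.pow(2).sum(dim=1, keepdim=True)))

# Hypothetical value-function output head, initialized with std=0.01.
vf_out = nn.Linear(256, 1)
normc_init_(vf_out.weight, std=0.01)
nn.init.zeros_(vf_out.bias)
```

The small std keeps the initial value predictions close to zero, which is why the change should carry little risk.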