How do you freeze some layers of a network in PyTorch and train only the rest?
Thanks for the clarification! As a follow-up to my point regarding the speedup: I am not observing a speedup when I freeze the initial 80% of the network. I expected training to be faster, since it only has to update 20% of the parameters, and lighter, since it only needs to store gradients and optimizer state for the unfrozen layers.
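For reference, here is a minimal sketch of the usual freezing pattern (the model and the 80/20 layer split are hypothetical, just for illustration): turn off gradient tracking on the parameters you want frozen, and hand the optimizer only the remaining trainable ones.

```python
import torch
import torch.nn as nn

# Hypothetical sequential model used only for illustration.
model = nn.Sequential(
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 10),
)

# Freeze the first ~80% of the layers by disabling gradient tracking.
layers = list(model.children())
n_frozen = int(0.8 * len(layers))
for layer in layers[:n_frozen]:
    for param in layer.parameters():
        param.requires_grad = False

# Give the optimizer only the trainable parameters, so frozen ones are
# never updated and never accumulate optimizer state (e.g. momentum).
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=0.01
)
```

As for the missing speedup: freezing only shortens the backward pass, since autograd stops once nothing upstream requires gradients. The forward pass still runs through every layer at full cost, so if forward time dominates, the overall step time barely changes. The savings are mostly in gradient buffers and optimizer state for the frozen parameters.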