We might now be inclined to think that setting all our weights to zero will achieve maximum symmetry. However, this is actually a very bad idea, and our model will never learn anything. On the forward pass, every neuron in a layer produces exactly the same output, so during backpropagation every neuron receives exactly the same gradient and all the weights update in the same way. The neurons therefore remain identical copies of one another; the symmetry is never broken, the model can never learn an informative set of features, so don't initialize like this.
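To see the symmetry problem concretely, here is a minimal NumPy sketch (the layer sizes, data, activation, and learning rate are illustrative assumptions, not taken from the book). With all-zero weights, every column of the hidden weight matrix receives the same gradient, so even after several training steps the hidden neurons are still identical copies of one another:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 4 samples, 3 features, 1 regression target (made up for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# All-zero initialization for a network with 5 hidden neurons.
W1, b1 = np.zeros((3, 5)), np.zeros(5)
W2, b2 = np.zeros((5, 1)), np.zeros(1)
lr = 0.1

for step in range(3):
    # Forward pass: every hidden neuron computes the same value for a given sample.
    h = sigmoid(X @ W1 + b1)
    y_hat = h @ W2 + b2

    # Backward pass for mean squared error.
    d_out = 2.0 * (y_hat - y) / len(X)
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    dz = (d_out @ W2.T) * h * (1.0 - h)   # sigmoid derivative
    dW1, db1 = X.T @ dz, dz.sum(axis=0)

    # Each column of W1 corresponds to one hidden neuron; all columns get
    # the same gradient, so the neurons stay identical after the update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

    print(f"step {step}: hidden neurons still identical? "
          f"{np.allclose(W1, W1[:, [0]])}")
```

Running this prints `True` at every step: the network effectively behaves as if it had a single hidden neuron, which is why symmetry must be broken with a random (for example, small Gaussian or uniform) initialization instead.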