Why is zero initialization not a good weight initialization process?
Answer Posted / Nitin Verma
Zero initialization of weights is a poor choice because it fails to break symmetry between neurons. If every weight starts at zero, all neurons in a layer compute the same output and, during backpropagation, receive identical gradients. They therefore update identically at every step and remain copies of one another, so the layer effectively behaves like a single neuron no matter how wide it is. This wastes the network's capacity and can stall or severely slow learning (in deeper networks the gradients flowing backward through zero weights can also vanish entirely). In contrast, schemes like Xavier (Glorot) initialization and He initialization draw weights randomly, with the variance scaled to the number of inputs per neuron, which breaks the symmetry and keeps activation and gradient magnitudes stable across layers.
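A minimal NumPy sketch of the symmetry problem, using an assumed toy setup: a 2-input, 3-hidden-unit, 1-output network with sigmoid activations, trained for a few gradient-descent steps on a single example. Because all weights start at zero, every hidden unit receives the same gradient at every step, so the hidden weight columns stay exactly equal throughout training.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One training example and target (assumed values, for illustration only).
x = np.array([[0.5, -1.2]])   # shape (1, 2)
y = np.array([[1.0]])         # shape (1, 1)

# Zero initialization: every hidden unit starts identical.
W1 = np.zeros((2, 3))         # input  -> hidden
W2 = np.zeros((3, 1))         # hidden -> output

lr = 0.1
for _ in range(10):
    # Forward pass.
    h = sigmoid(x @ W1)       # hidden activations, shape (1, 3)
    out = sigmoid(h @ W2)     # network output, shape (1, 1)

    # Backward pass for squared error 0.5 * (out - y)**2.
    d_out = (out - y) * out * (1 - out)   # gradient at output
    d_h = (d_out @ W2.T) * h * (1 - h)    # gradient at hidden layer

    W2 -= lr * (h.T @ d_out)
    W1 -= lr * (x.T @ d_h)

# Symmetry was never broken: all hidden units are still identical clones.
print(np.allclose(W1[:, 0], W1[:, 1]) and np.allclose(W1[:, 1], W1[:, 2]))
```

Replacing the zero arrays with, say, `rng.normal(0, np.sqrt(2.0 / fan_in), size)` (a He-style draw) gives each hidden unit a different starting point, so the gradients differ and the units can learn distinct features.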