Answer Posted / Dushyant Prakash
Vanishing gradients occur in deep neural networks when the gradients become very small as they are propagated backward through the layers, so the earliest layers receive almost no error signal and learn very slowly, if at all; training can stall or fail to converge. Exploding gradients are the opposite problem: the gradients grow exponentially large during backpropagation, causing numerical instability (overflowing weights, NaN losses) and divergent training.
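A minimal sketch of both effects, assuming PyTorch is available (the depth, width, activations, and initialization scales below are illustrative choices, not part of the original answer): a deep stack of sigmoid layers shrinks the gradient reaching the first layer geometrically, while the same stack of ReLU layers with over-large weights blows it up.

import torch
import torch.nn as nn

def first_layer_grad_norm(activation, init_std, depth=10, width=64):
    """Build a deep stack of Linear (+ activation) layers, backpropagate a
    dummy loss, and return the gradient norm at the earliest layer."""
    layers = []
    for _ in range(depth):
        linear = nn.Linear(width, width)
        nn.init.normal_(linear.weight, std=init_std)  # init scale drives the effect
        nn.init.zeros_(linear.bias)
        layers.append(linear)
        layers.append(activation())
    net = nn.Sequential(*layers)

    x = torch.randn(8, width)
    net(x).sum().backward()  # dummy loss; we only care about gradient magnitudes
    return net[0].weight.grad.norm().item()

# Sigmoid derivatives are at most 0.25, so with small weights each layer
# multiplies the gradient by a factor well below 1: the vanishing case.
print("vanishing:", first_layer_grad_norm(nn.Sigmoid, init_std=0.1))

# ReLU does not squash derivatives, so over-large weights multiply the
# gradient by a factor above 1 per layer: the exploding case.
print("exploding:", first_layer_grad_norm(nn.ReLU, init_std=1.0))

Running this prints a first-layer gradient norm orders of magnitude below 1 in the first call and orders of magnitude above 1 in the second, which is exactly the vanishing/exploding behavior described above.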