Answer Posted / Ankita Dubey
Dropout is a regularization technique in neural networks that helps prevent overfitting: during training it randomly disables a fraction of neurons (set by the dropout rate), forcing the network to learn redundant, robust features instead of relying on any single unit. At inference time all neurons are active. Batch Normalization, on the other hand, normalizes each layer's inputs across the mini-batch during training, which makes learning more stable and faster. It also allows higher learning rates and can help improve generalization.
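A minimal sketch of both layers, assuming PyTorch (the layer sizes and dropout rate here are illustrative, not from the original answer). Note how each layer's behavior changes between training and evaluation mode:

import torch
import torch.nn as nn

# Dropout zeroes random activations during training; BatchNorm1d
# normalizes each feature across the mini-batch.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),   # normalize across the batch dimension
    nn.ReLU(),
    nn.Dropout(p=0.5),     # drop 50% of activations, train time only
    nn.Linear(256, 10),
)

x = torch.randn(32, 784)   # a mini-batch of 32 examples

model.train()              # Dropout active; BatchNorm uses batch statistics
train_out = model(x)

model.eval()               # Dropout disabled; BatchNorm uses running statistics
eval_out = model(x)

In eval mode, Dropout becomes a no-op and BatchNorm switches from per-batch statistics to the running mean and variance accumulated during training, which is why forgetting model.eval() at inference is a common bug.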