Answer Posted / Kalpna Rani
Adagrad (Adaptive Gradient Algorithm) is an optimization algorithm used to update the weights of a neural network during training. For each parameter it keeps a running sum of that parameter's squared gradients and divides the learning rate by the square root of this sum, so parameters that are updated frequently get smaller steps while rarely-updated parameters get larger ones. This per-parameter adaptation makes it well suited to sparse data and reduces the need to hand-tune the learning rate.
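A minimal sketch of the update rule described above, assuming NumPy and a toy objective f(w) = w² (the function name `adagrad_update` and the hyperparameter values are illustrative, not from any particular library):

```python
import numpy as np

def adagrad_update(w, grad, cache, lr=0.5, eps=1e-8):
    """One Adagrad step: accumulate squared gradients, then scale the step.

    cache holds the running sum of squared gradients per parameter;
    eps avoids division by zero before any gradient has accumulated.
    """
    cache = cache + grad ** 2                    # per-parameter history
    w = w - lr * grad / (np.sqrt(cache) + eps)   # adaptive per-parameter step
    return w, cache

# usage: minimize f(w) = w^2, whose gradient is 2w
w = np.array([5.0])
cache = np.zeros_like(w)
for _ in range(500):
    grad = 2 * w
    w, cache = adagrad_update(w, grad, cache)

print(w)  # w shrinks toward the minimum at 0
```

Because the cache only grows, the effective learning rate decays over training; this is the behavior later optimizers such as RMSProp and Adam modify by using a decaying average instead of a raw sum.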