Answer Posted / Gulshan Jahan
Gradient Descent is an optimization algorithm used in machine learning to find the minimum of a function by iteratively moving in the direction of steepest descent. It adjusts a model's parameters so that the cost (error) function is minimized. The update rule is: w = w - α * gradient(w), where w represents the weights or parameters of the model, α is the learning rate that determines the step size, and gradient(w) is the derivative of the cost function with respect to the weights.
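The update rule above can be sketched in a few lines of Python. This is a minimal illustrative example (the function f(w) = (w - 3)^2 and all names here are chosen for demonstration, not taken from the answer): its gradient is 2 * (w - 3), so the iterates should converge to the minimum at w = 3.

```python
def gradient_descent(grad, w0, alpha=0.1, steps=100):
    """Repeatedly apply the update rule w = w - alpha * grad(w)."""
    w = w0
    for _ in range(steps):
        w = w - alpha * grad(w)  # step in the direction of steepest descent
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is f'(w) = 2 * (w - 3).
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_min, 4))  # converges toward 3.0
```

Note that the learning rate α matters: too large a value can overshoot the minimum and diverge, while too small a value makes convergence slow.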