What is the difference between Batch Gradient Descent and Stochastic Gradient Descent?
Answer Posted / Shiv Shkti Singh
Batch Gradient Descent computes the gradient of the loss function over the entire training dataset before each parameter update, while Stochastic Gradient Descent updates the parameters using the gradient from a single training example at a time. Each SGD update is far cheaper to compute, but because it is based on one example it is noisy: SGD usually needs more updates overall, and its loss curve fluctuates rather than decreasing smoothly. That noise can help escape shallow local minima, but if the learning rate is too large SGD may oscillate around the optimum instead of converging to it. Batch GD gives exact gradients and smooth convergence at a much higher cost per step, which is why mini-batch variants are the common compromise in practice.
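A minimal sketch of the two update rules, on a hypothetical one-parameter problem (fitting y = 3x with squared error; the data, learning rate, and epoch counts are illustrative assumptions, not from any particular library):

```python
import numpy as np

# Toy dataset (assumption for illustration): y = 3x exactly, single weight w.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 3.0 * X

def batch_gd(w=0.0, lr=0.5, epochs=50):
    # Batch GD: one update per epoch, gradient averaged over ALL examples.
    for _ in range(epochs):
        grad = np.mean(2 * (w * X - y) * X)  # d/dw mean((wx - y)^2)
        w -= lr * grad
    return w

def sgd(w=0.0, lr=0.5, epochs=50):
    # SGD: one update per EXAMPLE, using only that example's gradient.
    for _ in range(epochs):
        for i in rng.permutation(len(X)):  # shuffle each epoch
            grad = 2 * (w * X[i] - y[i]) * X[i]
            w -= lr * grad
    return w

print(batch_gd(), sgd())  # both approach the true weight w = 3
```

Note the cost asymmetry: batch GD does 50 updates, each touching all 100 examples; SGD does 5,000 cheap single-example updates in the same number of epochs. On this noiseless toy problem both converge; with noisy data the SGD trajectory would jitter around w = 3 unless the learning rate is decayed.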