
Explain the following variants of gradient descent: stochastic, batch, and mini-batch.

Answer Posted / Neelabh Shukla

1. Stochastic Gradient Descent (SGD) is a variant of gradient descent where, instead of computing gradients over the entire dataset, the gradient is computed for a single example at each iteration. This makes SGD computationally cheap per update but prone to noisy updates due to the high variance of single-example gradients.

2. Batch Gradient Descent (BGD) computes gradients over the entire training set in one go. This eliminates the noise seen in SGD, but each update is slower and requires more memory, since the whole dataset must be processed before a single parameter update.

3. Mini-Batch Gradient Descent (MBGD) is a compromise between SGD and BGD. Gradients are computed over a small batch of examples (a mini-batch) rather than the entire dataset or a single example. This reduces the gradient noise while still keeping updates computationally efficient.
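The three variants can be sketched with a single training loop whose batch size is the only difference. This is a minimal illustration on a toy linear-regression problem; the dataset, learning rate, and epoch count are all illustrative assumptions, not part of the original answer.

```python
import numpy as np

# Toy linear-regression data (illustrative assumption, not from the answer).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=200)

def gradient(w, Xb, yb):
    """Gradient of the mean-squared error over the batch (Xb, yb)."""
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(yb)

def train(batch_size, lr=0.1, epochs=50):
    """batch_size == len(X) -> batch GD; 1 -> SGD; in between -> mini-batch."""
    w = np.zeros(3)
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)              # shuffle once per epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            w -= lr * gradient(w, X[b], y[b])
    return w

w_batch = train(batch_size=len(X))  # one low-noise update per epoch (BGD)
w_sgd   = train(batch_size=1)       # n noisy updates per epoch (SGD)
w_mini  = train(batch_size=32)      # the usual compromise (MBGD)
```

All three runs converge toward the same weights; what differs is how many updates occur per epoch and how noisy each update is, which is exactly the trade-off described above.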


