Stochastic Gradient Descent (SGD)
Stochastic gradient descent (SGD) is an optimization algorithm used in machine learning to minimize a model's loss function by updating its parameters iteratively. At each step it randomly selects a single training example (or, in the common mini-batch variant, a small subset of the training data) to compute an estimate of the gradient of the loss with respect to the model parameters, and then updates the parameters in the direction of the negative gradient.
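The update loop described above can be sketched in a few lines of NumPy. This is a minimal illustrative example (the data, learning rate, and batch size are all made up): fitting a line y = w·x + b by minimizing mean squared error with mini-batch gradient steps.

```python
import numpy as np

# Synthetic data for the sketch: y = 3x + 1 plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 3.0 * X[:, 0] + 1.0 + 0.1 * rng.normal(size=200)

w, b = 0.0, 0.0            # model parameters, initialized at zero
lr, batch_size = 0.1, 16   # hyperparameters chosen for this toy problem

for epoch in range(50):
    idx = rng.permutation(len(X))              # shuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]  # random mini-batch
        xb, yb = X[batch, 0], y[batch]
        err = (w * xb + b) - yb
        # Gradient of mean squared error over the mini-batch
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)
        # Step in the direction of the negative gradient
        w -= lr * grad_w
        b -= lr * grad_b

print(w, b)  # should approach the true values w = 3.0, b = 1.0
```

Because each update uses only a mini-batch rather than the full dataset, the gradient is a noisy estimate, which is exactly the "stochastic" part of the name.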
- “Deep Learning” on Coursera
- Convolutional Neural Networks
- Stochastic Gradient Descent, Clearly Explained!!!
- Stochastic gradient descent explained | Stochastic gradient descent vs Gradient descent|Mini batch
- ML | Stochastic Gradient Descent (SGD)
- Stochastic Gradient Descent — Clearly Explained !!
- “Deep Learning” by Ian Goodfellow, Yoshua Bengio, and Aaron Courville