Mini-batch gradient descent
Mini-batch gradient descent is a popular optimization algorithm in machine learning. Instead of updating the model's parameters using gradients computed on the entire dataset (the full batch), it computes gradients on small subsets of the data called mini-batches. Each update is therefore much cheaper than a full-batch step, which typically makes training faster and more computationally efficient.
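As a concrete illustration, here is a minimal NumPy sketch of the procedure, assuming a simple least-squares (linear regression) objective; the function name `minibatch_gd` and all hyperparameter defaults are illustrative choices, not from the original text:

```python
import numpy as np

def minibatch_gd(X, y, lr=0.01, batch_size=32, epochs=100, seed=0):
    """Fit linear-regression weights with mini-batch gradient descent.

    Illustrative sketch: objective, defaults, and names are assumptions.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # Shuffle once per epoch so each mini-batch is a fresh random subset.
        order = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            # Gradient of the mean squared error on this mini-batch only,
            # rather than on the entire dataset.
            grad = 2.0 / len(idx) * Xb.T @ (Xb @ w - yb)
            w -= lr * grad
    return w

# Toy usage: recover known weights from noisy synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=1000)
print(minibatch_gd(X, y))  # approximately [ 2. -1.  0.5]
```

Setting `batch_size=1` would recover stochastic gradient descent, while `batch_size=n` would recover full-batch gradient descent; mini-batches sit between the two extremes.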
Similar Concepts
- accelerated gradient descent methods
- batch gradient descent
- conjugate gradient descent
- constraints in gradient descent optimization
- convergence of gradient descent
- empirical risk minimization in gradient descent
- gradient descent for linear regression
- gradient descent for neural networks
- hybrid optimization algorithms combining gradient descent
- non-convex optimization using gradient descent
- online gradient descent
- parallel and distributed gradient descent
- proximal gradient descent
- stochastic gradient descent
- variants of gradient descent algorithms