Mini-batch gradient descent

Mini-batch gradient descent is a popular optimization algorithm in machine learning. Instead of updating the model's parameters with gradients computed on the entire dataset (the full batch), it computes gradients on small subsets of the data called mini-batches and updates the parameters after each one. Because each update touches only a fraction of the data, training is faster per update and more computationally efficient.
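As a minimal sketch of the idea, the following uses mini-batch gradient descent to fit a linear regression model on synthetic data. The data, learning rate, and batch size are all illustrative choices, not prescribed values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = X @ w_true + small noise (illustrative setup).
n_samples, n_features = 1000, 3
X = rng.normal(size=(n_samples, n_features))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=n_samples)

w = np.zeros(n_features)  # parameters to learn
lr = 0.1                  # learning rate
batch_size = 32
n_epochs = 20

for epoch in range(n_epochs):
    # Shuffle once per epoch so mini-batches differ across epochs.
    perm = rng.permutation(n_samples)
    for start in range(0, n_samples, batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # Gradient of mean squared error computed on the mini-batch only.
        grad = 2.0 / len(idx) * Xb.T @ (Xb @ w - yb)
        w -= lr * grad

print(w)  # should be close to w_true
```

The key contrast with full-batch gradient descent is inside the inner loop: each parameter update uses the gradient of only `batch_size` examples, so many updates happen per pass over the data.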
