Gradient Descent Demystified

Gradient Descent
Gradient descent is an optimization algorithm that minimizes a cost function by iteratively finding the optimal parameters for a given set of inputs. The parameters are the coefficients (b, m) in linear regression or the weights (w) in a neural network. The goal during training is to minimize this cost.

Error
The error can be defined as the following:
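The original error formula is not reproduced here; a common choice for linear regression is the mean squared error, E = (1/N) Σ (y_i − (m·x_i + b))². Assuming that cost, the update rule can be sketched as follows (the function name, learning rate, and sample data are illustrative, not from the original post):

```python
import numpy as np

def gradient_descent(x, y, lr=0.05, epochs=2000):
    """Fit y ≈ m*x + b by minimizing mean squared error (assumed cost)."""
    m, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        error = (m * x + b) - y          # residual for each sample
        # Partial derivatives of (1/n) * sum(error**2) w.r.t. m and b
        dm = (2.0 / n) * np.dot(error, x)
        db = (2.0 / n) * error.sum()
        # Step against the gradient, scaled by the learning rate
        m -= lr * dm
        b -= lr * db
    return m, b

# Toy data generated from y = 2x + 1; the fitted (m, b) should approach (2, 1)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
m, b = gradient_descent(x, y)
```

Each iteration moves (m, b) a small step in the direction that most reduces the cost; the learning rate controls the step size, and too large a value makes the updates diverge.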
