Towards Data Science

The Machine Learning “Advent Calendar” Bonus 2: Gradient Descent Variants in Excel


Gradient descent variants—including Momentum, RMSProp, and Adam—all converge toward the same minimum on a well-behaved objective, but they traverse different paths to get there. Each method addresses a limitation of its predecessor, improving convergence speed, stability, or adaptivity without changing the destination. The optimization goal stays the same; only the update mechanism becomes more sophisticated.
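To make "same destination, different update mechanism" concrete, here is a minimal Python sketch (the article itself works in Excel; this is an illustrative translation, not its spreadsheet) that minimizes the toy objective f(x) = (x − 3)² with each update rule. The function names, learning rates, and the quadratic itself are assumptions chosen for illustration; the update formulas are the standard ones for each method.

```python
import math

def optimize(update, steps=500):
    """Minimize f(x) = (x - 3)^2 from x = 0 with a given update rule."""
    x, state = 0.0, {}          # state holds the optimizer's running averages
    for t in range(1, steps + 1):
        grad = 2 * (x - 3)      # gradient of (x - 3)^2
        x = update(x, grad, state, t)
    return x

def sgd(x, g, s, t, lr=0.1):
    # Plain gradient descent: step against the gradient.
    return x - lr * g

def momentum(x, g, s, t, lr=0.1, beta=0.9):
    # Accumulate a velocity so past gradients keep pushing in their direction.
    s["v"] = beta * s.get("v", 0.0) + g
    return x - lr * s["v"]

def rmsprop(x, g, s, t, lr=0.1, beta=0.9, eps=1e-8):
    # Scale the step by a running average of squared gradients.
    s["s"] = beta * s.get("s", 0.0) + (1 - beta) * g * g
    return x - lr * g / (math.sqrt(s["s"]) + eps)

def adam(x, g, s, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Combine Momentum's first moment with RMSProp's second moment,
    # plus bias correction for the zero-initialized averages.
    s["m"] = b1 * s.get("m", 0.0) + (1 - b1) * g
    s["v"] = b2 * s.get("v", 0.0) + (1 - b2) * g * g
    m_hat = s["m"] / (1 - b1 ** t)
    v_hat = s["v"] / (1 - b2 ** t)
    return x - lr * m_hat / (math.sqrt(v_hat) + eps)

for name, rule in [("SGD", sgd), ("Momentum", momentum),
                   ("RMSProp", rmsprop), ("Adam", adam)]:
    print(f"{name:8s} -> x = {optimize(rule):.4f}")
```

All four runs end up near the minimizer x = 3, but the trajectories differ: SGD shrinks the error geometrically, Momentum spirals in with damped oscillations, and the adaptive methods take nearly constant-size, sign-like steps until they settle into a small oscillation around the optimum.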