Sunday 7 December 2014

Gradient Ascent vs. Stochastic Gradient Ascent

Optimization algorithms are used to find the best-fit parameters of a model. That "best-fit" idea is where the name regression comes from.


Gradient Ascent: based on the idea that if we want to find the maximum of a function, the best direction to move is the direction of the gradient.
It uses the whole dataset for every single update.
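As a minimal sketch of what "use the whole dataset on each update" looks like, here is batch gradient ascent for logistic regression (the function names and the learning-rate/iteration defaults are my own choices, not from any particular library):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_ascent(X, y, alpha=0.01, iterations=500):
    """Batch gradient ascent for logistic regression:
    every weight update touches the ENTIRE dataset."""
    m, n = X.shape
    w = np.zeros(n)
    for _ in range(iterations):
        preds = sigmoid(X @ w)     # predictions for all m samples
        error = y - preds          # error over the whole dataset
        w += alpha * (X.T @ error) # step up the log-likelihood gradient
    return w

# Tiny example: first column is a bias term, second is the feature.
X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, -2.0], [1.0, -3.0]])
y = np.array([1.0, 1.0, 0.0, 0.0])
w = gradient_ascent(X, y)
```

Note that each of the 500 iterations does a full pass over X, which is exactly the cost that stochastic gradient ascent avoids.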

Stochastic Gradient Ascent: an online learning algorithm.
We can incrementally update the classifier as new data comes in, rather than reprocessing the whole dataset at once.
Because each update touches only one sample, it uses far fewer computing resources per step.
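A sketch of the stochastic variant, under the same assumed logistic-regression setup: the weights are updated one sample at a time, which is what makes the incremental (online) usage possible.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stoch_gradient_ascent(X, y, alpha=0.01, epochs=50, seed=0):
    """Stochastic gradient ascent: one cheap update per sample,
    so new data can be folded in without revisiting old data."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    w = np.zeros(n)
    for _ in range(epochs):
        for i in rng.permutation(m):   # visit samples in random order
            pred = sigmoid(X[i] @ w)   # single-sample prediction
            error = y[i] - pred
            w += alpha * error * X[i]  # update from this one sample only
    return w

# Same toy data as before: bias column plus one feature.
X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, -2.0], [1.0, -3.0]])
y = np.array([1.0, 1.0, 0.0, 0.0])
w = stoch_gradient_ascent(X, y)
```

The inner loop is the whole difference: a per-sample update instead of a full pass over X, so a brand-new sample `(x_new, y_new)` can be incorporated with a single extra update.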
