In optimization, a gradient method is an algorithm to solve problems of the form
\min_{x \in \mathbb{R}^n} f(x)
with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
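As a minimal sketch of the idea, the following gradient descent loop minimizes a simple quadratic by repeatedly stepping in the direction of the negative gradient. The function names, step size, and test objective are illustrative choices, not part of the original text:

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Minimize f over R^n given a function grad(x) returning its gradient.

    At each iteration the search direction is -grad(x), the direction of
    steepest descent at the current point.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is (nearly) zero
            break
        x = x - step * g
    return x

# Example: f(x) = ||x - t||^2 with t = (3, -1); its gradient is 2 (x - t),
# so the minimizer is x = t.
target = np.array([3.0, -1.0])
x_min = gradient_descent(lambda x: 2 * (x - target), x0=[0.0, 0.0])
```

For this convex quadratic a fixed step size suffices; in general the step length is chosen by a line search, and methods such as conjugate gradient modify the search direction using information from previous iterations.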