In mathematics, variational analysis is the combination and extension of methods from convex optimization and the classical calculus of variations to a more general theory. This includes the more general problems of optimization theory, as well as topics in set-valued analysis such as generalized derivatives.
In the Mathematics Subject Classification scheme (MSC2010), the field of "Set-valued and variational analysis" is coded by "49J53".[1]
While this area of mathematics has a long history, the first use of the term "Variational analysis" in this sense was in an eponymous book by R. Tyrrell Rockafellar and Roger J-B Wets.
A classical result is that a lower semicontinuous function on a compact set attains its minimum. Results from variational analysis, such as Ekeland's variational principle, allow this result to be extended to lower semicontinuous functions on non-compact sets, provided that the function is bounded below, at the cost of adding a small perturbation to the function. A smooth variant is known as the Borwein–Preiss variational principle.[2]
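In one common formulation of Ekeland's principle (the notation here is standard; the parameter \(\lambda > 0\) is free and trades the size of the perturbation against the distance moved): let \((X, d)\) be a complete metric space and \(f\colon X \to \mathbb{R} \cup \{+\infty\}\) a lower semicontinuous function that is bounded below and not identically \(+\infty\). If \(x_0\) satisfies \(f(x_0) \le \inf_X f + \varepsilon\) for some \(\varepsilon > 0\), then there exists \(\bar{x} \in X\) with

\[
f(\bar{x}) \le f(x_0), \qquad d(\bar{x}, x_0) \le \lambda, \qquad f(x) > f(\bar{x}) - \frac{\varepsilon}{\lambda}\, d(x, \bar{x}) \quad \text{for all } x \ne \bar{x}.
\]

The last inequality says that \(\bar{x}\) is the strict global minimizer of the perturbed function \(f + (\varepsilon/\lambda)\, d(\cdot, \bar{x})\), even though \(f\) itself may attain no minimum on the non-compact set \(X\).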
The classical Fermat's theorem says that if a differentiable function attains its minimum at an interior point of its domain, then its derivative must be zero at that point. For problems in which a smooth function must be minimized subject to constraints expressed as other smooth functions being equal to zero, the method of Lagrange multipliers, another classical result, gives necessary conditions in terms of the derivatives of the objective and constraint functions.
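Concretely, for the equality-constrained problem

\[
\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad g_i(x) = 0, \quad i = 1, \dots, m,
\]

the Lagrange conditions assert (under a standard constraint qualification, for instance linear independence of the gradients \(\nabla g_i(x^*)\)) that a local minimizer \(x^*\) admits multipliers \(\lambda_1, \dots, \lambda_m\) with

\[
\nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) = 0.
\]

The unconstrained case \(m = 0\) recovers Fermat's rule \(\nabla f(x^*) = 0\).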
The ideas of these classical results can be extended to nondifferentiable convex functions by generalizing the notion of derivative to that of subderivative. Further generalizations of the notion of derivative, such as the Clarke generalized gradient, allow the results to be extended to nonsmooth locally Lipschitz functions.[3]
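As an illustration of the subderivative idea in its simplest (convex, finite-dimensional) setting: for a convex function \(f\) on \(\mathbb{R}^n\), the subdifferential at \(\bar{x}\) is the set of subgradients

\[
\partial f(\bar{x}) = \{\, v \in \mathbb{R}^n : f(x) \ge f(\bar{x}) + \langle v, x - \bar{x} \rangle \ \text{for all } x \,\},
\]

and Fermat's rule generalizes to: \(\bar{x}\) minimizes \(f\) if and only if \(0 \in \partial f(\bar{x})\). For example, \(f(x) = |x|\) is not differentiable at \(0\), yet \(\partial f(0) = [-1, 1]\) contains \(0\), certifying the minimum there.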