αΒΒ is a second-order deterministic global optimization algorithm for finding the optima of general, twice continuously differentiable functions.[1][2] The algorithm is based on creating a relaxation of a nonlinear function of general form by superposing it with a quadratic of sufficient magnitude, called α, such that the superposition overcomes the worst-case non-convexity of the original function. Since a separable quadratic has a diagonal Hessian matrix, this superposition essentially adds a number to all diagonal elements of the original Hessian, such that the resulting Hessian is positive-semidefinite. Thus, the resulting relaxation is a convex function.
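To illustrate the diagonal-shift idea numerically, here is a minimal Python sketch (the 2×2 matrix is an arbitrary illustrative example, not from the article): adding α to every diagonal entry of an indefinite symmetric matrix shifts all of its eigenvalues up by α, so a large enough α makes the matrix positive-semidefinite.

```python
import numpy as np

# Arbitrary indefinite symmetric "Hessian" (illustrative only).
H = np.array([[1.0, 3.0],
              [3.0, 1.0]])              # eigenvalues: -2 and 4

lam_min = np.linalg.eigvalsh(H).min()   # -2.0
alpha = max(0.0, -lam_min)              # smallest uniform shift that works here

H_shifted = H + alpha * np.eye(2)       # adds alpha to each diagonal entry
print(np.linalg.eigvalsh(H_shifted))    # [0., 6.] -- positive-semidefinite
```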
Let a function $f(\boldsymbol{x}) \in C^2$ be defined on a box $X = \{\boldsymbol{x} \in \mathbb{R}^n : \boldsymbol{x}^L \leq \boldsymbol{x} \leq \boldsymbol{x}^U\}$. A convex underestimator (relaxation) $L(\boldsymbol{x})$ of $f(\boldsymbol{x})$ can be constructed over $X$ by superposing a sum of univariate quadratics, each of sufficient magnitude to overcome the non-convexity of $f(\boldsymbol{x})$ everywhere in $X$, as follows:

$$L(\boldsymbol{x}) = f(\boldsymbol{x}) + \sum_{i=1}^{n} \alpha_i \,(x_i^L - x_i)(x_i^U - x_i)$$
$L(\boldsymbol{x})$ is called the $\alpha$BB underestimator of $f(\boldsymbol{x})$. If all $\alpha_i$ are sufficiently large, $L(\boldsymbol{x})$ is convex everywhere in $X$; thus, local minimization of $L(\boldsymbol{x})$ yields a rigorous lower bound on the value of $f(\boldsymbol{x})$ over the box.
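As a concrete sketch, the following Python example builds the underestimator for a one-dimensional nonconvex function and checks that it never exceeds $f$ on the box; the test function, the box, and the value of α used here are illustrative assumptions, not from the article.

```python
import numpy as np

def f(x):
    return np.sin(3.0 * x) + 0.1 * x**2       # nonconvex on [-2, 2]

x_L, x_U = -2.0, 2.0

# f''(x) = -9 sin(3x) + 0.2 >= -8.8 everywhere, and the quadratic term
# contributes 2*alpha to L'', so alpha = 4.4 is just enough to make L convex
# (this matches the alpha rule discussed in the next section).
alpha = 4.4

def L(x):
    return f(x) + alpha * (x_L - x) * (x_U - x)

xs = np.linspace(x_L, x_U, 1001)
Ls = L(xs)
assert np.all(Ls <= f(xs) + 1e-12)            # L underestimates f on the box
print(Ls.min())                               # approximate minimum of the convex
                                              # relaxation: an estimate of a lower
                                              # bound on min f over X
```

Note that $(x_i^L - x_i)(x_i^U - x_i) \leq 0$ whenever $\boldsymbol{x}$ lies in the box, which is why the added term can only decrease $f$ and $L$ is guaranteed to underestimate it.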
Calculation of $\boldsymbol{\alpha}$

There are numerous methods to calculate the values of the $\boldsymbol{\alpha}$ vector. It is proven that the underestimator $L(\boldsymbol{x})$ is guaranteed to be convex when

$$\alpha_i = \max\left\{0, -\tfrac{1}{2}\lambda_i^{\min}\right\}$$

where $\lambda_i^{\min}$ is a valid lower bound on the $i$-th eigenvalue of the Hessian matrix of $f(\boldsymbol{x})$ over $X$.
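In code, this rule is a one-liner once valid eigenvalue lower bounds are available; the bound values in this sketch are hypothetical placeholders for what an interval method such as the Scaled Gerschgorin theorem (next paragraph) would produce.

```python
import numpy as np

# Hypothetical valid lower bounds on the Hessian eigenvalues over the box X.
lambda_min = np.array([-3.0, 0.5, -1.2])

alpha = np.maximum(0.0, -0.5 * lambda_min)    # alpha_i = max{0, -lambda_i_min / 2}
print(alpha)                                  # [1.5 0.  0.6]
```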
One of the most popular methods to obtain these valid eigenvalue bounds is the Scaled Gerschgorin theorem. Let $H(X)$ be the interval Hessian matrix of $f(\boldsymbol{x})$ over the domain $X$. Then, for any $d_i > 0$, a valid lower bound on the eigenvalue $\lambda_i$ can be derived from the $i$-th row of $H(X)$ as follows:

$$\lambda_i^{\min} = \underline{h_{ii}} - \sum_{j \neq i} \max\left\{|\underline{h_{ij}}|, |\overline{h_{ij}}|\right\} \frac{d_j}{d_i}$$

where $\underline{h_{ij}}$ and $\overline{h_{ij}}$ are the lower and upper bounds of the interval Hessian element $h_{ij}$ over $X$.
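A Python sketch of this bound follows; the interval Hessian entries (elementwise lower and upper bound matrices) and the scaling vector $d$ are illustrative assumptions.

```python
import numpy as np

def scaled_gerschgorin(H_lo, H_hi, d):
    """Row-wise eigenvalue lower bounds valid for every symmetric H with
    H_lo <= H <= H_hi elementwise (Scaled Gerschgorin theorem)."""
    n = H_lo.shape[0]
    mag = np.maximum(np.abs(H_lo), np.abs(H_hi))   # max{|h_ij_lo|, |h_ij_hi|}
    lam_min = np.empty(n)
    for i in range(n):
        off_diag = sum(mag[i, j] * d[j] / d[i] for j in range(n) if j != i)
        lam_min[i] = H_lo[i, i] - off_diag         # h_ii lower bound minus scaled row sum
    return lam_min

# Illustrative interval Hessian of some f over a box X.
H_lo = np.array([[ 2.0, -1.5],
                 [-1.5,  1.0]])
H_hi = np.array([[ 4.0,  0.5],
                 [ 0.5,  3.0]])
d = np.ones(2)                                     # d_i = 1 recovers the unscaled bound

lam_min = scaled_gerschgorin(H_lo, H_hi, d)        # [ 0.5 -0.5]
alpha = np.maximum(0.0, -0.5 * lam_min)            # combined with the alpha rule above
print(lam_min, alpha)
```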