Self-concordant function

A self-concordant function is a function satisfying a certain differential inequality, which makes it particularly amenable to optimization with Newton's method.[1] A self-concordant barrier is a particular self-concordant function that is also a barrier function for a particular convex set. Self-concordant barriers are important ingredients in interior-point methods for optimization.

Self-concordant functions

Multivariate self-concordant function

Here is the general definition of a self-concordant function.[2]

Let C be a nonempty open convex set in Rn, and let f be a three-times continuously differentiable function defined on C. We say that f is self-concordant on C if it satisfies the following properties:

1. Barrier property: on any sequence of points in C that converges to a boundary point of C, f converges to ∞.

2. Differential inequality: for every point x in C, and any direction h in Rn, let gh be the function f restricted to the direction h, that is: gh(t) = f(x+t*h). Then the one-dimensional function gh should satisfy the following differential inequality:

|g_h'''(t)| \leq 2\, g_h''(t)^{3/2}
Equivalently:[3]

\left.\frac{d}{d\alpha}\nabla^2 f(x+\alpha y)\right|_{\alpha=0} \preceq 2\sqrt{y^T \nabla^2 f(x)\, y}\; \nabla^2 f(x)

Univariate self-concordant function

A function f : R → R is self-concordant on R if:

|f'''(x)| \leq 2\, f''(x)^{3/2}

Equivalently: wherever f''(x) > 0, it satisfies

\left|\frac{d}{dx}\frac{1}{\sqrt{f''(x)}}\right| \leq 1

and it satisfies f'''(x) = 0 elsewhere.
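As a numerical sanity check (a sketch, not part of the definition), one can sample the inequality |f'''(x)| ≤ 2 f''(x)^{3/2} for f(x) = -log x, which satisfies it with equality at every point:

```python
# For f(x) = -log(x): f''(x) = 1/x^2 and f'''(x) = -2/x^3, so the
# self-concordance inequality |f'''(x)| <= 2 f''(x)^(3/2) holds with equality.
def check_self_concordant(fpp, fppp, xs):
    """Check |f'''(x)| <= 2 f''(x)^(3/2) at the sample points xs,
    allowing a tiny relative slack for floating-point rounding."""
    return all(abs(fppp(x)) <= 2 * fpp(x) ** 1.5 * (1 + 1e-9) for x in xs)

xs = [0.1, 0.5, 1.0, 3.0, 10.0]
ok = check_self_concordant(lambda x: 1 / x**2, lambda x: -2 / x**3, xs)
print(ok)  # True: -log(x) attains the bound with equality
```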

Examples

The function

f(x) = -\log(-g(x)) - \log x,

where g(x) is defined and convex for all x > 0 and satisfies |g'''(x)| \leq 3\, g''(x)/x, is self-concordant on its domain \{x \mid x > 0, g(x) < 0\}. Some examples are:

- g(x) = -x^p for 0 < p \leq 1
- g(x) = -\log x
- g(x) = x^p for -1 \leq p \leq 0
- g(x) = (ax+b)^2/x
- For any g satisfying the conditions, the function g(x) + ax^2 + bx + c with a \geq 0 also satisfies the conditions.

Some functions that are not self-concordant:

- f(x) = e^x
- f(x) = 1/x^p for x > 0, p > 0
- f(x) = |x|^p for p > 2
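The failure of f(x) = e^x can be seen numerically: since f''(x) = f'''(x) = e^x, the ratio |f'''(x)|/f''(x)^{3/2} = e^{-x/2} is unbounded as x → -∞, so no constant can satisfy the inequality. A minimal sketch:

```python
import math

# f(x) = exp(x): f''(x) = f'''(x) = exp(x), so the ratio
# |f'''(x)| / f''(x)^(3/2) = exp(-x/2) grows without bound as x -> -infinity,
# hence the self-concordance inequality (with constant 2) must fail.
def concordance_ratio(x):
    fpp = math.exp(x)
    fppp = math.exp(x)
    return abs(fppp) / fpp ** 1.5

print(concordance_ratio(-10))  # ≈ 148.4 = e^5, far above the bound 2
```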

Self-concordant barriers

Here is the general definition of a self-concordant barrier (SCB).

Let C be a closed convex set in Rn with a nonempty interior. Let f be a function from interior(C) to R. Let M > 0 be a real parameter. We say that f is an M-self-concordant barrier for C if it satisfies the following:

1. f is a self-concordant function on interior(C).

2. For every point x in interior(C), and any direction h in Rn, let gh be the function f restricted to the direction h, that is: gh(t) = f(x+t*h). Then the one-dimensional function gh should satisfy the following differential inequality:

|g_h'(t)| \leq M^{1/2}\, g_h''(t)^{1/2}

Constructing SCBs

Due to the importance of SCBs in interior-point methods, it is important to know how to construct SCBs for various domains.

In theory, it can be proved that every closed convex domain in Rn has a self-concordant barrier with parameter O(n). But this “universal barrier” is given by some multivariate integrals, and it is too complicated for actual computations. Hence, the main goal is to construct SCBs that are efficiently computable.[4]

SCBs can be constructed from some basic SCBs, which are combined to produce SCBs for more complex domains using several combination rules.

Basic SCBs

Every constant is a self-concordant barrier for all of Rn, with parameter M=0. It is the only self-concordant barrier for the entire space, and the only self-concordant barrier with M < 1. (Note that linear and quadratic functions are self-concordant functions, but they are ''not'' self-concordant barriers.)

For the positive half-line R_+ (x > 0), f(x) = -\ln x is a self-concordant barrier with parameter M = 1. This can be proved directly from the definition.
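Indeed, the barrier inequality holds with equality: for f(x) = -\ln x we have |f'(x)| = 1/x and f''(x)^{1/2} = 1/x. A small numerical sketch:

```python
# For f(x) = -ln(x): f'(x) = -1/x and f''(x) = 1/x^2, so
# |f'(x)| / f''(x)^(1/2) = (1/x) / (1/x) = 1 at every x > 0,
# matching the barrier parameter M = 1 with equality.
def barrier_parameter_ratio(x):
    fp = -1.0 / x
    fpp = 1.0 / x**2
    return abs(fp) / fpp ** 0.5

for x in (0.01, 1.0, 100.0):
    print(round(barrier_parameter_ratio(x), 12))  # 1.0 at every sample point
```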

Substitution rule

Let G be a closed convex domain in Rn, and g an M-SCB for G. Let x = Ay+b be an affine mapping from Rk to Rn whose image intersects the interior of G. Let H be the inverse image of G under the mapping: H = \{y \mid Ay+b \in G\}. Let h be the composite function h(y) := g(Ay+b). Then, h is an M-SCB for H.

For example, take n = 1, G the positive half-line, and g(x) = -\ln x. For any k, let a be a k-element vector and b a scalar. Let H = \{y \mid a^T y + b > 0\}, a k-dimensional half-space. By the substitution rule, h(y) = -\ln(a^T y + b) is a 1-SCB for H. A more common format is H = \{y \mid a^T y \leq b\}, for which the SCB is h(y) = -\ln(b - a^T y).
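A minimal sketch of this half-space barrier (the vector a and scalar b below are illustrative values, not taken from the text):

```python
import math

# Substitution rule sketch: g(x) = -ln(x) is a 1-SCB for x > 0, so
# h(y) = -ln(b - a^T y) is a 1-SCB for the half-space {y : a^T y < b}.
def halfspace_barrier(a, b):
    def h(y):
        s = b - sum(ai * yi for ai, yi in zip(a, y))
        return float('inf') if s <= 0 else -math.log(s)
    return h

h = halfspace_barrier([1.0, 2.0], 5.0)
print(h([1.0, 1.0]))  # -ln(5 - 3) = -ln 2 ≈ -0.693
print(h([3.0, 2.0]))  # a^T y = 7 > 5: outside the domain, inf
```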

The substitution rule can be extended from affine mappings to a certain class of "appropriate" mappings, and to quadratic mappings.

Cartesian product rule

For all i in 1,...,m, let Gi be a closed convex domain in Rni, and let gi be an Mi-SCB for Gi. Let G be the Cartesian product of all Gi. Let g(x1,...,xm) := \sum_i g_i(x_i). Then, g is an SCB for G, with parameter \sum_i M_i.

For example, take all Gi to be the positive half-line, so that G is the positive orthant R_+^m. Then

g(x) = -\sum_{i=1}^m \ln x_i

is an m-SCB for G.

We can now apply the substitution rule. We get that, for the polytope defined by the linear inequalities a_j^T x \leq b_j for j in 1,...,m, if it satisfies Slater's condition, then

f(x) = -\sum_{j=1}^m \ln(b_j - a_j^T x)

is an m-SCB. The linear functions b_j - a_j^T x can be replaced by quadratic functions.

Intersection rule

Let G1,...,Gm be closed convex domains in Rn. For each i in 1,...,m, let gi be an Mi-SCB for Gi, and ri a real number. Let G be the intersection of all Gi, and suppose its interior is nonempty. Let g := \sum_i r_i g_i. Then, g is an SCB for G, with parameter \sum_i r_i M_i.

Therefore, if G is defined by a list of constraints, we can find an SCB for each constraint separately, and then simply sum them to get an SCB for G.

For example, suppose the domain is defined by m linear constraints of the form a_j^T x \leq b_j, for j in 1,...,m. Then we can use the intersection rule to construct the m-SCB

f(x) = -\sum_{j=1}^m \ln(b_j - a_j^T x)

(the same one that we previously computed using the Cartesian product rule).
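A sketch of this construction, summing one logarithmic term per constraint (the unit-box data below is illustrative, not from the text):

```python
import math

# Intersection rule sketch: one 1-SCB -ln(b_j - a_j^T x) per linear constraint,
# summed to give an m-SCB for the polytope {x : a_j^T x <= b_j for all j}.
def polytope_barrier(A, b):
    """Return f(x) = -sum_j ln(b_j - a_j^T x) for the rows a_j of A."""
    def f(x):
        total = 0.0
        for a_j, b_j in zip(A, b):
            s = b_j - sum(a * xi for a, xi in zip(a_j, x))
            if s <= 0:
                return float('inf')  # x violates constraint j
            total -= math.log(s)
        return total
    return f

# Unit box 0 <= x_i <= 1 in R^2, written as four half-spaces: a 4-SCB.
A = [[1, 0], [-1, 0], [0, 1], [0, -1]]
b = [1, 0, 1, 0]
f = polytope_barrier(A, b)
print(f([0.5, 0.5]))  # 4 * (-ln 0.5) = 4 ln 2 ≈ 2.773
```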

SCBs for epigraphs

The epigraph of a function f(x) is the region above the graph of the function, that is, \{(x,t) \in R^2 : t \geq f(x)\}. The epigraph of f is a convex set if and only if f is a convex function. The following theorems present some functions f for which the epigraph has an SCB.

Let g(t) be a three-times continuously differentiable concave function on t > 0, such that t|g'''(t)|/|g''(t)| is bounded by a constant (denoted 3b) for all t > 0. Let G be the two-dimensional convex domain:

G = closure(\{(x,t) \in R^2 : t > 0, x \leq g(t)\}).

Then, the function f(x,t) = -\ln(g(t) - x) - \max[1, b^2]\ln t is a self-concordant barrier for G, with parameter (1 + \max[1, b^2]).

Examples:

G_1 = \{(x,t) \in R^2 : ([x]_+)^p \leq t\} has a 2-SCB. Similarly, G_2 = \{(x,t) \in R^2 : ([-x]_+)^p \leq t\} has a 2-SCB. Using the intersection rule, we get that G = G_1 \cap G_2 = \{(x,t) \in R^2 : |x|^p \leq t\} has a 4-SCB.

G = \{(x,t) \in R^2 : e^x \leq t\} has a 2-SCB.

We can now construct an SCB for the problem of minimizing the p-norm:

\min_x \sum_{j=1}^n |v_j - x^T u_j|^p,

where v_j are constant scalars, u_j are constant vectors, and p > 0 is a constant. We first convert it into minimization of a linear objective:

\min_x \sum_{j=1}^n t_j,

with the constraints t_j \geq |v_j - x^T u_j|^p for all j in [n]. For each constraint, we have a 4-SCB by the affine substitution rule. Using the intersection rule, we get a (4n)-SCB for the entire feasible domain.

Similarly, let g be a three-times continuously differentiable convex function on the ray x > 0, such that x|g'''(x)|/|g''(x)| \leq 3b for all x > 0. Let G be the two-dimensional convex domain G = closure(\{(x,t) \in R^2 : x > 0, g(x) \leq t\}). Then, the function f(x,t) = -\ln(t - g(x)) - \max[1, b^2]\ln x is a self-concordant barrier for G, with parameter (1 + \max[1, b^2]).

Examples:

G_1 = \{(x,t) \in R^2 : x^{-p} \leq t, x > 0\} has a 2-SCB.

G = \{(x,t) \in R^2 : x\ln x \leq t, x \geq 0\} has a 2-SCB.

SCBs for cones

For the second-order cone \{(x,y) \in R^{n-1} \times R \mid \|x\| \leq y\}, the function f(x,y) = -\log(y^2 - x^T x) is a self-concordant barrier.

For the cone of positive semidefinite symmetric matrices, f(A) = -\log\det A is a self-concordant barrier.

For the region \phi(x) > 0, where \phi(x) = \alpha + \langle a,x\rangle - \frac{1}{2}\langle Ax,x\rangle and A = A^T \geq 0 is a positive semidefinite symmetric matrix, the logarithmic barrier f(x) = -\log\phi(x) is self-concordant with M = 2.

For the exponential cone \{(x,y,z) \in R^3 \mid y e^{x/y} \leq z, y > 0\}, the function f(x,y,z) = -\log(y\log(z/y) - x) - \log z - \log y is a self-concordant barrier.

For the power cone \{(x_1,x_2,y) \in R_+^2 \times R \mid |y| \leq x_1^{\alpha} x_2^{1-\alpha}\}, the function f(x_1,x_2,y) = -\log(x_1^{2\alpha} x_2^{2(1-\alpha)} - y^2) - \log x_1 - \log x_2 is a self-concordant barrier.
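As an illustration, the second-order cone barrier can be evaluated directly (the sample points below are illustrative):

```python
import math

# Second-order cone barrier sketch: f(x, y) = -log(y^2 - x^T x),
# defined on the interior of the cone {(x, y) : ||x|| <= y}.
def soc_barrier(x, y):
    s = y * y - sum(xi * xi for xi in x)
    return -math.log(s) if s > 0 else float('inf')

print(soc_barrier([1.0, 2.0], 3.0))  # -log(9 - 5) = -log 4 ≈ -1.386
print(soc_barrier([3.0, 4.0], 5.0))  # boundary point ||x|| = y: inf
```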

History

As mentioned in the "Bibliography Comments"[5] of their 1994 book,[6] self-concordant functions were introduced in 1988 by Yurii Nesterov[7] [8] and further developed with Arkadi Nemirovski.[9] As explained in,[10] their basic observation was that the Newton method is affine invariant, in the sense that if for a function f(x) we have Newton steps

x_{k+1} = x_k - [f''(x_k)]^{-1} f'(x_k),

then for a function \phi(y) = f(Ay), where A is a non-degenerate linear transformation, starting from y_0 = A^{-1} x_0 we have the Newton steps y_k = A^{-1} x_k, which can be shown recursively:

y_{k+1} = y_k - [\phi''(y_k)]^{-1}\phi'(y_k) = y_k - [A^T f''(A y_k) A]^{-1} A^T f'(A y_k) = A^{-1} x_k - A^{-1}[f''(x_k)]^{-1} f'(x_k) = A^{-1} x_{k+1}.

However, the standard analysis of the Newton method supposes that the Hessian of f is Lipschitz continuous, that is,

\|f''(x) - f''(y)\| \leq M\|x - y\|

for some constant M. If we suppose that f is three times continuously differentiable, then this is equivalent to

|\langle f'''(x)[u]\, v, v\rangle| \leq M\|u\|\|v\|^2 for all u, v \in R^n,

where f'''(x)[u] = \lim_{\alpha \to 0} \alpha^{-1}[f''(x + \alpha u) - f''(x)]. Then the left-hand side of the above inequality is invariant under the affine transformation f(x) \to \phi(y) = f(Ay), u \to A^{-1}u, v \to A^{-1}v, but the right-hand side is not.

The authors note that the right-hand side can also be made invariant if we replace the Euclidean metric by the scalar product defined by the Hessian of f, defined as \|w\|_{f''(x)} = \langle f''(x)w, w\rangle^{1/2} for w \in R^n. They then arrive at the definition of a self-concordant function as

|\langle f'''(x)[u]\, u, u\rangle| \leq M\langle f''(x)u, u\rangle^{3/2}.
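The affine invariance described above can be verified numerically; the sketch below uses the one-dimensional case with the illustrative choices f(x) = x - ln(x) and A = 3:

```python
# Numerical check of Newton's affine invariance in one dimension, with the
# illustrative choices f(x) = x - ln(x) (so f'(x) = 1 - 1/x, f''(x) = 1/x^2)
# and the non-degenerate "matrix" A = 3.
def newton_step(x, fp, fpp):
    return x - fp(x) / fpp(x)

A = 3.0
fp   = lambda x: 1.0 - 1.0 / x       # f'(x)
fpp  = lambda x: 1.0 / x**2          # f''(x)
php  = lambda y: A * fp(A * y)       # phi'(y)  = A f'(Ay)
phpp = lambda y: A * A * fpp(A * y)  # phi''(y) = A^2 f''(Ay)

x, y = 0.5, 0.5 / A                  # y_0 = A^{-1} x_0
for _ in range(5):
    x = newton_step(x, fp, fpp)
    y = newton_step(y, php, phpp)
print(abs(y - x / A) < 1e-9)  # True: the iterates track y_k = A^{-1} x_k
```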

Properties

Linear combination

If f_1 and f_2 are self-concordant with constants M_1 and M_2 and \alpha, \beta > 0, then \alpha f_1 + \beta f_2 is self-concordant with constant \max(\alpha^{-1/2} M_1, \beta^{-1/2} M_2).

Affine transformation

If f is self-concordant with constant M and Ax + b is an affine transformation of R^n, then \phi(x) = f(Ax + b) is also self-concordant with parameter M.

Convex conjugate

If f is self-concordant, then its convex conjugate f^* is also self-concordant.[11] [12]

Non-singular Hessian

If f is self-concordant and the domain of f contains no straight line (infinite in both directions), then f'' is non-singular.

Conversely, if for some x in the domain of f and u \in R^n, u \neq 0, we have \langle f''(x)u, u\rangle = 0, then \langle f''(x + \alpha u)u, u\rangle = 0 for all \alpha for which x + \alpha u is in the domain of f. Then f(x + \alpha u) is linear in \alpha and, since a linear function cannot have a maximum, all of x + \alpha u, \alpha \in R, is in the domain of f. We note also that f cannot have a minimum inside its domain in this case.

Applications

Among other things, self-concordant functions are useful in the analysis of Newton's method. Self-concordant barrier functions are used to develop the barrier functions used in interior-point methods for convex and nonlinear optimization. The usual analysis of the Newton method does not work for barrier functions, as their second derivative cannot be Lipschitz continuous; otherwise they would be bounded on any compact subset of R^n.


Minimizing a self-concordant function

A self-concordant function may be minimized with a modified Newton method, for which we have a bound on the number of steps required for convergence. We suppose here that f is a standard self-concordant function, that is, it is self-concordant with parameter M = 2.

We define the Newton decrement \lambda_f(x) of f at x as the size of the Newton step [f''(x)]^{-1} f'(x) in the local norm defined by the Hessian of f at x:

\lambda_f(x) = \langle f''(x)[f''(x)]^{-1} f'(x), [f''(x)]^{-1} f'(x)\rangle^{1/2} = \langle [f''(x)]^{-1} f'(x), f'(x)\rangle^{1/2}
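In one dimension the decrement reduces to |f'(x)|/\sqrt{f''(x)}. A minimal sketch, using the illustrative standard self-concordant function f(x) = x - ln(x), whose minimizer is x = 1:

```python
import math

# One-dimensional Newton decrement: lambda_f(x) = |f'(x)| / sqrt(f''(x)),
# shown for the standard self-concordant f(x) = x - ln(x), minimized at x = 1.
def newton_decrement(x):
    fp = 1.0 - 1.0 / x
    fpp = 1.0 / x**2
    return abs(fp) / math.sqrt(fpp)

print(newton_decrement(1.0))  # 0.0: the decrement vanishes at the minimizer
print(newton_decrement(2.0))  # |1 - 1/2| / (1/2) = 1.0
```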

Then for x in the domain of f, if \lambda_f(x) < 1, it is possible to prove that the Newton iterate

x^+ = x - [f''(x)]^{-1} f'(x)

will also be in the domain of f. This is because, based on the self-concordance of f, it is possible to give finite bounds on the value of f(x^+). We further have

\lambda_f(x^+) \leq \left(\frac{\lambda_f(x)}{1 - \lambda_f(x)}\right)^2

Then if we have

\lambda_f(x) < \bar\lambda = \frac{3 - \sqrt{5}}{2},

then it is also guaranteed that \lambda_f(x^+) < \lambda_f(x), so that we can continue to use the Newton method until convergence. Note that for \lambda_f(x) < \beta for some \beta \in (0, \bar\lambda) we have quadratic convergence of \lambda_f to 0, as \lambda_f(x^+) \leq (1 - \beta)^{-2}\lambda_f(x)^2. This then gives quadratic convergence of f(x_k) to f(x^*) and of x to x^*, where x^* = \arg\min f(x), by the following theorem. If \lambda_f(x) < 1, then

\omega(\lambda_f(x)) \leq f(x) - f(x^*) \leq \omega_*(\lambda_f(x)),

\omega'(\lambda_f(x)) \leq \|x - x^*\|_x \leq \omega_*'(\lambda_f(x)),

with the following definitions:

\omega(t) = t - \log(1 + t)

\omega_*(t) = -t - \log(1 - t)

\|u\|_x = \langle f''(x)u, u\rangle^{1/2}
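These bounds can be checked numerically; the sketch below uses the illustrative standard self-concordant barrier f(x) = -ln(x) - ln(1-x) on (0,1), minimized at x* = 0.5:

```python
import math

def omega(t):      return t - math.log(1 + t)
def omega_star(t): return -t - math.log(1 - t)

# Check omega(lam) <= f(x) - f(x*) <= omega_star(lam) for the standard
# self-concordant barrier f(x) = -ln(x) - ln(1-x) on (0,1), minimized at 0.5.
def f(x):
    return -math.log(x) - math.log(1 - x)

def lam(x):
    # Newton decrement |f'(x)| / sqrt(f''(x)) in one dimension
    fp = -1.0 / x + 1.0 / (1 - x)
    fpp = 1.0 / x**2 + 1.0 / (1 - x) ** 2
    return abs(fp) / math.sqrt(fpp)

x, xstar = 0.6, 0.5      # lam(0.6) ≈ 0.28 < 1, so the theorem applies
gap = f(x) - f(xstar)
print(omega(lam(x)) <= gap <= omega_star(lam(x)))  # True
```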

If we start the Newton method from some x_0 with \lambda_f(x_0) \geq \bar\lambda, then we have to start by using a damped Newton method, defined by

x_{k+1} = x_k - \frac{1}{1 + \lambda_f(x_k)}[f''(x_k)]^{-1} f'(x_k)

For this it can be shown that

f(x_{k+1}) \leq f(x_k) - \omega(\lambda_f(x_k))

with \omega as defined previously. Note that \omega(t) is an increasing function for t > 0, so that \omega(t) \geq \omega(\bar\lambda) for any t \geq \bar\lambda; hence the value of f is guaranteed to decrease by a certain amount in each iteration, which also proves that x_{k+1} is in the domain of f.
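A sketch of the damped Newton iteration in one dimension (the function below, the log-barrier of (0,1), is an illustrative choice of a standard self-concordant function):

```python
import math

# Damped Newton sketch for a standard self-concordant function (M = 2),
# here f(x) = -ln(x) - ln(1 - x), the log-barrier of (0,1), minimized at 0.5.
def damped_newton(fp, fpp, x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        g, h = fp(x), fpp(x)
        lam = abs(g) / math.sqrt(h)       # Newton decrement
        if lam < tol:
            break
        x = x - g / ((1.0 + lam) * h)     # step damped by 1/(1 + lambda)
    return x

fp  = lambda x: -1.0 / x + 1.0 / (1.0 - x)
fpp = lambda x: 1.0 / x**2 + 1.0 / (1.0 - x) ** 2
print(round(damped_newton(fp, fpp, 0.01), 6))  # 0.5: converges to the minimizer
```

The damping factor 1/(1 + λ) guarantees the per-iteration decrease f(x_{k+1}) ≤ f(x_k) - ω(λ_f(x_k)) stated above, even from starting points far from the minimizer where the full Newton step could leave the domain.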

Notes and References

  1. Web site: Nemirovsky and Ben-Tal . 2023 . Optimization III: Convex Optimization .
  2. Web site: Arkadi Nemirovsky . 2004 . Interior point polynomial time methods in convex programming .
  3. Book: Boyd . Stephen P. . Convex Optimization . Vandenberghe . Lieven . Cambridge University Press . 2004 . 978-0-521-83378-3 . October 15, 2011.
  4. Web site: Nemirovsky and Ben-Tal . 2023 . Optimization III: Convex Optimization .
  5. Book: Nesterov. Yurii. Interior-Point Polynomial Algorithms in Convex Programming (Bibliography Comments). Nemirovskii. Arkadii. January 1994. Society for Industrial and Applied Mathematics. 978-0-89871-319-0. en. 10.1137/1.9781611970791.bm.
  6. Book: Nesterov, I︠U︡. E.. Interior-point polynomial algorithms in convex programming. 1994. Society for Industrial and Applied Mathematics. Nemirovskiĭ, Arkadiĭ Semenovich.. 0-89871-319-6. Philadelphia. 29310677.
  7. Yu. E. NESTEROV, Polynomial time methods in linear and quadratic programming, Izvestija AN SSSR, Tekhnitcheskaya kibernetika, No. 3, 1988, pp. 324-326. (In Russian.)
  8. Yu. E. NESTEROV, Polynomial time iterative methods in linear and quadratic programming, Voprosy kibernetiki, Moscow,1988, pp. 102-125. (In Russian.)
  9. Y.E. Nesterov and A.S. Nemirovski, Self–concordant functions and polynomial–time methods in convex programming, Technical report, Central Economic and Mathematical Institute, USSR Academy of Science, Moscow, USSR, 1989.
  10. Book: Nesterov, I︠U︡. E.. Introductory lectures on convex optimization : a basic course. 978-1-4419-8853-9. Boston. 883391994.
  11. Nesterov . Yurii . Nemirovskii . Arkadii . Interior-Point Polynomial Algorithms in Convex Programming . Studies in Applied and Numerical Mathematics . 1994 . 10.1137/1.9781611970791 . 978-0-89871-319-0 .
  12. Sun . Tianxiao . Tran-Dinh . Quoc . Generalized Self-Concordant Functions: A Recipe for Newton-Type Methods . Mathematical Programming . 2018 . Proposition 6 . 1703.04599 .