Baker–Campbell–Hausdorff formula explained

In mathematics, the Baker–Campbell–Hausdorff formula gives the value of Z that solves the equation e^X e^Y = e^Z for possibly noncommutative X and Y in the Lie algebra of a Lie group. There are various ways of writing the formula, but all ultimately yield an expression for Z in Lie algebraic terms, that is, as a formal series (not necessarily convergent) in X and Y and iterated commutators thereof. The first few terms of this series are

Z = X + Y + \frac{1}{2}[X,Y] + \frac{1}{12}[X,[X,Y]] - \frac{1}{12}[Y,[X,Y]] + \cdots,

where "\cdots" indicates terms involving higher commutators of X and Y. If X and Y are sufficiently small elements of the Lie algebra \mathfrak{g} of a Lie group G, the series is convergent. Meanwhile, every element g sufficiently close to the identity in G can be expressed as g = e^X for a small X in \mathfrak{g}. Thus, we can say that near the identity the group multiplication in G, written as e^X e^Y = e^Z, can be expressed in purely Lie algebraic terms. The Baker–Campbell–Hausdorff formula can be used to give comparatively simple proofs of deep results in the Lie group–Lie algebra correspondence.

If X and Y are sufficiently small n \times n matrices, then Z can be computed as the logarithm of e^X e^Y, where the exponentials and the logarithm can be computed as power series. The point of the Baker–Campbell–Hausdorff formula is then the highly nonobvious claim that Z := \log\left(e^X e^Y\right) can be expressed as a series in repeated commutators of X and Y.
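This claim is easy to test numerically. Below is a minimal sketch (assuming NumPy and SciPy are available; the small random 3 × 3 matrices are merely an illustrative choice, not from the source) that computes Z directly as a matrix logarithm and compares it with the commutator series above, truncated after third order.

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(0)
X = 0.1 * rng.standard_normal((3, 3))   # small matrices, so all series involved converge
Y = 0.1 * rng.standard_normal((3, 3))

def comm(A, B):
    return A @ B - B @ A

# Z computed directly as a matrix logarithm
Z = logm(expm(X) @ expm(Y))

# Truncated BCH series: terms through third order in X and Y
Z_bch = (X + Y + 0.5 * comm(X, Y)
         + comm(X, comm(X, Y)) / 12.0
         - comm(Y, comm(X, Y)) / 12.0)

print(np.linalg.norm(Z - Z_bch))   # small; the residual is fourth order in (X, Y)
```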

Modern expositions of the formula can be found in, among other places, the books of Rossmann and Hall.

History

The formula is named after Henry Frederick Baker, John Edward Campbell, and Felix Hausdorff, who stated its qualitative form, i.e. that only commutators and commutators of commutators, ad infinitum, are needed to express the solution. An earlier statement of the form was adumbrated by Friedrich Schur in 1890,[1] where a convergent power series is given, with terms recursively defined.[2] This qualitative form is what is used in the most important applications, such as the relatively accessible proofs of the Lie correspondence and in quantum field theory. Following Schur, it was noted in print by Campbell[3] (1897); elaborated by Henri Poincaré[4] (1899) and Baker (1902);[5] and systematized geometrically, and linked to the Jacobi identity, by Hausdorff (1906).[6] The first actual explicit formula, with all numerical coefficients, is due to Eugene Dynkin (1947).[7] The history of the formula is described in detail in the article of Achilles and Bonfiglioli and in the book of Bonfiglioli and Fulci.

Explicit forms

For many purposes, it is only necessary to know that an expansion for Z in terms of iterated commutators of X and Y exists; the exact coefficients are often irrelevant. (See, for example, the discussion of the relationship between Lie group and Lie algebra homomorphisms in Section 5.2 of Hall's book, where the precise coefficients play no role in the argument.) A remarkably direct existence proof was given by Martin Eichler;[8] see also the "Existence results" section below.

In other cases, one may need detailed information about Z, and it is therefore desirable to compute Z as explicitly as possible. Numerous formulas exist; we will describe two of the main ones (Dynkin's formula and the integral formula of Poincaré) in this section.

Dynkin's formula

Let G be a Lie group with Lie algebra \mathfrak{g}. Let \exp : \mathfrak{g} \to G be the exponential map. The following general combinatorial formula was introduced by Eugene Dynkin (1947):[9]

\log(\exp X \exp Y) = \sum_{n=1}^\infty \frac{(-1)^{n-1}}{n} \sum_{\substack{r_1+s_1>0 \\ \cdots \\ r_n+s_n>0}} \frac{[X^{r_1} Y^{s_1} \dotsm X^{r_n} Y^{s_n}]}{\left(\sum_{i=1}^n (r_i+s_i)\right)\, r_1!\,s_1! \dotsm r_n!\,s_n!},

where the sum is performed over all nonnegative values of s_i and r_i, and the following notation has been used:

[X^{r_1} Y^{s_1} \dotsm X^{r_n} Y^{s_n}] = [\underbrace{X,[X,\dotsm[X}_{r_1},[\underbrace{Y,[Y,\dotsm[Y}_{s_1},\,\dotsm\,[\underbrace{X,[X,\dotsm[X}_{r_n},[\underbrace{Y,[Y,\dotsm Y}_{s_n}]]\dotsm]]

with the understanding that [X] := X.

The series is not convergent in general; it is convergent (and the stated formula is valid) for all sufficiently small X and Y. Since [A, A] = 0 for any A, the term vanishes if s_n > 1, or if s_n = 0 and r_n > 1.[10]

The first few terms are well-known, with all higher-order terms involving [X,Y] and commutator nestings thereof (thus lying in the Lie algebra):

Z(X,Y) = \log(\exp X \exp Y) = X + Y + \frac{1}{2}[X,Y] + \frac{1}{12}\left([X,[X,Y]] + [Y,[Y,X]]\right) - \frac{1}{24}[Y,[X,[X,Y]]] + \cdots

The above lists all summands of order 4 or lower (i.e. those containing 4 or fewer X's and Y's). The X \leftrightarrow Y (anti-)symmetry in alternating orders of the expansion follows from Z(Y,X) = -Z(-X,-Y). A complete elementary proof of this formula can be found in the article on the derivative of the exponential map.
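Dynkin's sum can also be evaluated directly for small matrices. The following sketch (Python with NumPy/SciPy assumed; the helper names nested_bracket, blocks and dynkin are ad hoc, not from any library) enumerates all tuples (r_1, s_1, ..., r_n, s_n) up to a chosen total degree and compares the truncated sum with a directly computed matrix logarithm.

```python
import numpy as np
from math import factorial
from scipy.linalg import expm, logm

def nested_bracket(word):
    """Right-nested commutator [w1,[w2,[...,[w_{m-1}, w_m]...]]]; a single letter gives itself."""
    acc = word[-1]
    for W in reversed(word[:-1]):
        acc = W @ acc - acc @ W
    return acc

def blocks(n, budget):
    """All tuples of n pairs (r, s) with r + s >= 1 and total degree <= budget."""
    if n == 0:
        yield ()
        return
    for r in range(budget + 1):
        for s in range(budget + 1 - r):
            if r + s == 0 or budget - (r + s) < n - 1:
                continue
            for rest in blocks(n - 1, budget - (r + s)):
                yield ((r, s),) + rest

def dynkin(X, Y, max_order=5):
    """Dynkin's series, truncated at total degree max_order."""
    Z = np.zeros_like(X)
    for n in range(1, max_order + 1):
        for bl in blocks(n, max_order):
            degree = sum(r + s for r, s in bl)
            word = []
            coeff = (-1) ** (n - 1) / (n * degree)
            for r, s in bl:
                word += [X] * r + [Y] * s
                coeff /= factorial(r) * factorial(s)
            Z = Z + coeff * nested_bracket(word)
    return Z

rng = np.random.default_rng(1)
X = 0.05 * rng.standard_normal((3, 3))
Y = 0.05 * rng.standard_normal((3, 3))
print(np.linalg.norm(dynkin(X, Y) - logm(expm(X) @ expm(Y))))   # tiny truncation error
```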

An integral formula

There are numerous other expressions for Z, many of which are used in the physics literature.[11] A popular integral formula is[12] [13]

\log\left(e^X e^Y\right) = X + \left(\int_0^1 \psi\left(e^{\operatorname{ad}_X}\, e^{t\,\operatorname{ad}_Y}\right) dt\right) Y,

involving the generating function for the Bernoulli numbers,

\psi(x) \,\stackrel{\text{def}}{=}\, \frac{x\,\ln x}{x-1} = 1 - \sum_{n=1}^\infty \frac{(1-x)^n}{n(n+1)},

utilized by Poincaré and Hausdorff.[14]
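As a consistency check, the integral can be evaluated numerically for small matrices. In the sketch below (Python with NumPy/SciPy assumed; the vectorization trick and all helper names are ad hoc choices, not from the source), the adjoint operators ad_X and ad_Y are represented as matrices acting on vectorized 3 × 3 matrices, ψ is evaluated via its truncated series above, and the t-integral is done by Gauss–Legendre quadrature.

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(2)
n = 3
X = 0.05 * rng.standard_normal((n, n))
Y = 0.05 * rng.standard_normal((n, n))
I = np.eye(n)

def ad_matrix(A):
    """Matrix of ad_A on vec(M): vec([A, M]) = (I kron A - A^T kron I) vec(M)."""
    return np.kron(I, A) - np.kron(A.T, I)

def psi(M, terms=80):
    """psi(M) = 1 - sum_{k>=1} (1 - M)^k / (k (k+1)), valid when ||I - M|| < 1."""
    E = np.eye(M.shape[0])
    out = E.copy()
    P = E.copy()
    for k in range(1, terms + 1):
        P = P @ (E - M)
        out -= P / (k * (k + 1))
    return out

adX, adY = ad_matrix(X), ad_matrix(Y)

# Gauss-Legendre quadrature of the integral over t in [0, 1]
nodes, weights = np.polynomial.legendre.leggauss(20)
t_nodes = 0.5 * (nodes + 1.0)          # map [-1, 1] -> [0, 1]
op = sum(0.5 * w * psi(expm(adX) @ expm(t * adY))
         for w, t in zip(weights, t_nodes))

# Z = X + (integral operator) applied to Y
Z_integral = X + (op @ Y.reshape(-1, order='F')).reshape(n, n, order='F')
print(np.linalg.norm(Z_integral - logm(expm(X) @ expm(Y))))   # small residual
```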

Matrix Lie group illustration

For a matrix Lie group G \subset \mathrm{GL}(n,\mathbb{R}) the Lie algebra is the tangent space at the identity I, and the commutator is simply [X,Y] = XY - YX; the exponential map is the standard exponential map of matrices,

\exp X = e^X = \sum_{n=0}^\infty \frac{X^n}{n!}.

When one solves for Z in e^Z = e^X e^Y, using the series expansions for \exp and \log one obtains a simpler formula:

Z = \sum_{n>0} \frac{(-1)^{n-1}}{n} \sum_{\substack{r_i + s_i > 0, \\ 1 \le i \le n}} \frac{X^{r_1} Y^{s_1} \dotsm X^{r_n} Y^{s_n}}{r_1!\,s_1! \dotsm r_n!\,s_n!}, \qquad \|X\| + \|Y\| < \log 2,\ \|Z\| < \log 2.[15]

The first, second, third, and fourth order terms are:

z_1 = X + Y
z_2 = \frac{1}{2}(XY - YX)
z_3 = \frac{1}{12}\left(X^2 Y + X Y^2 - 2 X Y X + Y^2 X + Y X^2 - 2 Y X Y\right)
z_4 = \frac{1}{24}\left(X^2 Y^2 - 2 X Y X Y - Y^2 X^2 + 2 Y X Y X\right)

The formulas for the various z_j's are not themselves the Baker–Campbell–Hausdorff formula. Rather, the Baker–Campbell–Hausdorff formula is one of various expressions for the z_j's in terms of repeated commutators of X and Y. The point is that it is far from obvious that it is possible to express each z_j in terms of commutators. (The reader is invited, for example, to verify by direct computation that z_3 is expressible as a linear combination of the two nontrivial third-order commutators of X and Y, namely [X,[X,Y]] and [Y,[X,Y]].) The general result that each z_j is expressible as a combination of commutators was shown in an elegant, recursive way by Eichler.
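The invited check on z_3 can be carried out mechanically with noncommutative symbols. A minimal sketch, assuming SymPy is available, expands both sides and confirms that they agree:

```python
import sympy as sp

X, Y = sp.symbols('X Y', commutative=False)

def comm(A, B):
    return sp.expand(A * B - B * A)

# z_3: the third-order part of log(e^X e^Y) as a noncommutative polynomial
z3 = sp.Rational(1, 12) * (X*X*Y + X*Y*Y - 2*X*Y*X + Y*Y*X + Y*X*X - 2*Y*X*Y)

# the claimed expression in terms of the two third-order commutators
bch3 = sp.Rational(1, 12) * comm(X, comm(X, Y)) - sp.Rational(1, 12) * comm(Y, comm(X, Y))

print(sp.expand(z3 - bch3))   # prints 0
```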

A consequence of the Baker–Campbell–Hausdorff formula is the following result about the trace:

\operatorname{tr} \log\left(e^X e^Y\right) = \operatorname{tr} X + \operatorname{tr} Y.

That is to say, since each z_j with j \geq 2 is expressible as a linear combination of commutators, the trace of each such term is zero.

Questions of convergence

Suppose X and Y are the following matrices in the Lie algebra \mathfrak{sl}(2;\mathbb{C}) (the space of 2 \times 2 matrices with trace zero):

X = \begin{pmatrix} 0 & i\pi \\ i\pi & 0 \end{pmatrix}; \quad Y = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.

Then

e^X e^Y = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} -1 & -1 \\ 0 & -1 \end{pmatrix}.

It is then not hard to show[16] that there does not exist a matrix Z in \mathfrak{sl}(2;\mathbb{C}) with e^X e^Y = e^Z. (Similar examples may be found in the article of Wei.[17])
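The displayed product is easy to confirm numerically; a minimal sketch, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1j * np.pi], [1j * np.pi, 0]])
Y = np.array([[0, 1], [0, 0]], dtype=complex)

# equals [[-1, -1], [0, -1]] up to rounding error
print(np.round(expm(X) @ expm(Y), 12))
```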

This simple example illustrates that the various versions of the Baker–Campbell–Hausdorff formula, which give expressions for Z in terms of iterated Lie brackets of X and Y, describe formal power series whose convergence is not guaranteed. Thus, if one wants Z to be an actual element of the Lie algebra containing X and Y (as opposed to a formal power series), one has to assume that X and Y are small. Thus, the conclusion that the product operation on a Lie group is determined by the Lie algebra is only a local statement. Indeed, the result cannot be global, because globally one can have nonisomorphic Lie groups with isomorphic Lie algebras.

Concretely, if one is working with a matrix Lie algebra and \|\cdot\| is a given submultiplicative matrix norm, convergence is guaranteed[18] if

\|X\| + \|Y\| < \frac{\ln 2}{2}.

Special cases

If X and Y commute, that is [X,Y] = 0, the Baker–Campbell–Hausdorff formula reduces to e^X e^Y = e^{X+Y}.

Another case assumes that [X,Y] commutes with both X and Y, as for the nilpotent Heisenberg group. Then the formula reduces to its first three terms.

Theorem:[19] If X and Y commute with their commutator, [X,[X,Y]] = [Y,[X,Y]] = 0, then e^X e^Y = e^{X + Y + \frac{1}{2}[X,Y]}.

This is the degenerate case used routinely in quantum mechanics, as illustrated below, and it is sometimes known as the disentangling theorem.[20] In this case, there are no smallness restrictions on X and Y. This result is behind the "exponentiated commutation relations" that enter into the Stone–von Neumann theorem. A simple proof of this identity is given below.
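For a concrete check of the disentangling identity, one can take the standard nilpotent 3 × 3 matrices of the Heisenberg algebra, whose commutator is central. A minimal sketch, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import expm

# Heisenberg-algebra generators: [X, Y] = E_13 commutes with both X and Y
X = np.array([[0., 1., 0.],
              [0., 0., 0.],
              [0., 0., 0.]])
Y = np.array([[0., 0., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])
C = X @ Y - Y @ X             # the commutator [X, Y]

lhs = expm(X) @ expm(Y)
rhs = expm(X + Y + 0.5 * C)   # e^{X + Y + [X,Y]/2}
print(np.allclose(lhs, rhs))  # True
```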

Another useful form of the general formula emphasizes expansion in terms of Y and uses the adjoint mapping notation \operatorname{ad}_X(Y) = [X,Y]:

\log(\exp X \exp Y) = X + \frac{\operatorname{ad}_X}{1 - e^{-\operatorname{ad}_X}}\, Y + O\left(Y^2\right) = X + \operatorname{ad}_{X/2}\left(1 + \coth \operatorname{ad}_{X/2}\right) Y + O\left(Y^2\right),

which is evident from the integral formula above. (The coefficients of the nested commutators with a single Y are normalized Bernoulli numbers.)

Now assume that the commutator is a multiple of Y, so that [X,Y] = sY. Then all iterated commutators will be multiples of Y, and no quadratic or higher terms in Y appear. Thus, the O\left(Y^2\right) term above vanishes and we obtain:

Theorem:[21] If [X,Y] = sY, where s is a complex number with s \neq 2\pi i n for all nonzero integers n, then we have

e^X e^Y = \exp\left(X + \frac{s}{1 - e^{-s}}\,Y\right).

Again, in this case there are no smallness restrictions on X and Y. The restriction on s guarantees that the expression on the right side makes sense. (When s = 0 we may interpret \lim_{s \to 0} s/(1 - e^{-s}) = 1.) We also obtain a simple "braiding identity":

e^X e^Y = e^{e^{s} Y} e^X,

which may be written as an adjoint dilation:

e^X e^Y e^{-X} = e^{e^{s} Y}.
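Both identities can be checked numerically on a pair of 2 × 2 matrices satisfying [X, Y] = sY (the particular matrices below are an illustrative choice, not from the source). A minimal sketch, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.linalg import expm

s = 0.7
X = np.array([[s, 0.], [0., 0.]])
Y = np.array([[0., 1.], [0., 0.]])
assert np.allclose(X @ Y - Y @ X, s * Y)   # [X, Y] = sY

# e^X e^Y = exp(X + s/(1 - e^{-s}) Y)
lhs = expm(X) @ expm(Y)
rhs = expm(X + (s / (1.0 - np.exp(-s))) * Y)
print(np.allclose(lhs, rhs))               # True

# braiding identity: e^X e^Y e^{-X} = exp(e^s Y)
print(np.allclose(expm(X) @ expm(Y) @ expm(-X), expm(np.exp(s) * Y)))  # True
```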

Existence results

If X and Y are matrices, one can compute Z := \log\left(e^X e^Y\right) using the power series for the exponential and logarithm, with convergence of the series if X and Y are sufficiently small. It is natural to collect together all terms where the total degree in X and Y equals a fixed number k, giving an expression z_k. (See the section "Matrix Lie group illustration" above for formulas for the first several z_k's.) A remarkably direct and concise, recursive proof that each z_k is expressible in terms of repeated commutators of X and Y was given by Martin Eichler.

If X and Y lie in any Lie algebra \mathfrak{g}, defined over any field of characteristic 0 like \Reals or \Complex, then

Z = \log(\exp(X) \exp(Y))

can formally be written as an infinite sum of elements of \mathfrak{g}. [This infinite series may or may not converge, so it need not define an actual element Z in \mathfrak{g}.] For many applications, the mere assurance of the existence of this formal expression is sufficient, and an explicit expression for this infinite sum is not needed. This is for instance the case in the Lorentzian[22] construction of a Lie group representation from a Lie algebra representation. Existence can be seen as follows.

We consider the ring S = \R[[X,Y]] of all non-commuting formal power series with real coefficients in the non-commuting variables X and Y. There is a ring homomorphism from S to the tensor product of S with S over \R,

\Delta \colon S \to S \otimes S,

called the coproduct, such that

\Delta(X) = X \otimes 1 + 1 \otimes X \quad \text{and} \quad \Delta(Y) = Y \otimes 1 + 1 \otimes Y.

(The definition of Δ is extended to the other elements of S by requiring R-linearity, multiplicativity and infinite additivity.)

One can then verify the following properties:

* r = \exp(s) is grouplike (this means \Delta(r) = r \otimes r) if and only if s is primitive (this means \Delta(s) = s \otimes 1 + 1 \otimes s).
* The primitive elements are exactly the formal infinite sums of elements of the Lie algebra generated by X and Y, where the bracket is the commutator [U,V] = UV - VU. (Friedrichs' theorem[23] [24])

The existence of the Campbell–Baker–Hausdorff formula can now be seen as follows: the elements X and Y are primitive, so \exp(X) and \exp(Y) are grouplike; so their product \exp(X)\exp(Y) is also grouplike; so its logarithm \log(\exp(X)\exp(Y)) is primitive; and hence it can be written as an infinite sum of elements of the Lie algebra generated by X and Y.

The universal enveloping algebra of the free Lie algebra generated by X and Y is isomorphic to the algebra of all non-commuting polynomials in X and Y. In common with all universal enveloping algebras, it has a natural structure of a Hopf algebra, with a coproduct \Delta. The ring S used above is just a completion of this Hopf algebra.

Zassenhaus formula

A related combinatoric expansion that is useful in dual applications is

e^{t(X+Y)} = e^{tX}\, e^{tY}\, e^{-\frac{t^2}{2}[X,Y]}\, e^{\frac{t^3}{6}\left(2[Y,[X,Y]] + [X,[X,Y]]\right)} \cdots

where the exponents of higher order in t are likewise nested commutators, i.e., homogeneous Lie polynomials.[25] These exponents, C_n in e^{t^n C_n}, follow recursively by application of the above BCH expansion.
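A quick numerical sanity check of the displayed factors (a minimal sketch, assuming NumPy and SciPy): truncating after the t^3 factor leaves an O(t^4) error, so halving t should shrink the residual by roughly a factor of 16.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
X = rng.standard_normal((3, 3))
Y = rng.standard_normal((3, 3))

def comm(A, B):
    return A @ B - B @ A

def zassenhaus_error(t):
    lhs = expm(t * (X + Y))
    rhs = (expm(t * X) @ expm(t * Y)
           @ expm(-(t**2 / 2) * comm(X, Y))
           @ expm((t**3 / 6) * (2 * comm(Y, comm(X, Y)) + comm(X, comm(X, Y)))))
    return np.linalg.norm(lhs - rhs)

# the second residual should be roughly 1/16 of the first
print(zassenhaus_error(0.1), zassenhaus_error(0.05))
```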

As a corollary of this, the Suzuki–Trotter decomposition follows.

An important lemma and its application to a special case of the Baker–Campbell–Hausdorff formula

The identity (Campbell 1897)

Let G be a matrix Lie group and \mathfrak{g} its corresponding Lie algebra. Let \operatorname{ad}_X be the linear operator on \mathfrak{g} defined by \operatorname{ad}_X Y = [X,Y] = XY - YX for some fixed X \in \mathfrak{g}. (The adjoint endomorphism encountered above.) Denote with \operatorname{Ad}_A for fixed A \in G the linear transformation of \mathfrak{g} given by \operatorname{Ad}_A Y = AYA^{-1}.

A standard combinatorial lemma which is utilized in producing the above explicit expansions is given by[26]

\operatorname{Ad}_{e^X} = e^{\operatorname{ad}_X},

so, explicitly,

\operatorname{Ad}_{e^X} Y = e^X Y e^{-X} = e^{\operatorname{ad}_X} Y = Y + \left[X,Y\right] + \frac{1}{2!}[X,[X,Y]] + \frac{1}{3!}[X,[X,[X,Y]]] + \cdots.

This is a particularly useful formula which is commonly used to conduct unitary transforms in quantum mechanics. By defining the iterated commutator,

[(X)^n, Y] \equiv \underbrace{[X, [X, \dotsm [X}_{n \text{ times}}, Y] \dotsm ]], \qquad [(X)^0, Y] \equiv Y,

we can write this formula more compactly as

\operatorname{Ad}_{e^X} Y = e^X Y e^{-X} = \sum_{n=0}^\infty \frac{[(X)^n, Y]}{n!}.
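The lemma can be verified numerically term by term. A minimal sketch, assuming NumPy and SciPy, compares e^X Y e^{-X} with the truncated iterated-commutator series:

```python
import numpy as np
from math import factorial
from scipy.linalg import expm

rng = np.random.default_rng(4)
X = 0.3 * rng.standard_normal((4, 4))
Y = rng.standard_normal((4, 4))

lhs = expm(X) @ Y @ expm(-X)           # Ad_{e^X} Y

series = np.zeros_like(Y)
term = Y.copy()                         # [(X)^0, Y] = Y
for n in range(30):
    series += term / factorial(n)
    term = X @ term - term @ X          # next iterated commutator [X, .]

print(np.linalg.norm(lhs - series))     # close to machine precision
```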

Notes and References

  1. F. Schur (1890). "Neue Begründung der Theorie der endlichen Transformationsgruppen". Mathematische Annalen. 35: 161–197. (online copy)
  2. See, e.g., Shlomo Sternberg, Lie Algebras (2004), Harvard University (cf. p. 10).
  3. John Edward Campbell (1897).
  4. Henri Poincaré (1899).
  5. Henry Frederick Baker (1902).
  6. Felix Hausdorff (1906).
  7. p. 23.
  8. Martin Eichler (1968). "A new proof of the Baker-Campbell-Hausdorff formula". Journal of the Mathematical Society of Japan. 20 (1–2): 23–25. doi:10.2969/jmsj/02010023.
  9. Eugene Borisovich Dynkin (1947). "Вычисление коэффициентов в формуле Campbell–Hausdorff" [Calculation of the coefficients in the Campbell–Hausdorff formula] (in Russian). 57: 323–326.
  10. A.A. Sagle & R.E. Walde, Introduction to Lie Groups and Lie Algebras, Academic Press, New York, 1973.
  11. Masuo Suzuki (1985). "Decomposition formulas of exponential operators and Lie exponentials with some applications to quantum mechanics and statistical physics". Journal of Mathematical Physics. 26 (4): 601–612. doi:10.1063/1.526596. Bibcode:1985JMP....26..601S.
    Veltman, M., 't Hooft, G. & de Wit, B. (2007), Appendix D.
  12. W. Miller, Symmetry Groups and their Applications, Academic Press, New York, 1972, pp. 159–161.
  13. Theorem 5.3.
  14. Recall \psi(e^y) = \sum_{n=0}^\infty B_n\, y^n/n!, for the Bernoulli numbers B_0 = 1, B_1 = 1/2, B_2 = 1/6, B_4 = −1/30, ...
  15. Equation (2), Section 1.3. For matrix Lie algebras over the fields \R and \Complex, the convergence criterion is that the log series converges for both sides of e^Z = e^X e^Y. This is guaranteed whenever \|X\| + \|Y\| < \log 2 in the Hilbert–Schmidt norm. Convergence may occur on a larger domain. See p. 24.
  16. Example 3.41.
  17. James Wei (October 1963). "Note on the Global Validity of the Baker-Hausdorff and Magnus Theorems". Journal of Mathematical Physics. 4 (10): 1337–1341. doi:10.1063/1.1703910. Bibcode:1963JMP.....4.1337W.
  18. Stefano Biagi, Andrea Bonfiglioli & Marco Matone (2018). "On the Baker-Campbell-Hausdorff Theorem: non-convergence and prolongation issues". Linear and Multilinear Algebra. 68 (7): 1310–1328. doi:10.1080/03081087.2018.1540534. ISSN 0308-1087. arXiv:1805.10089. S2CID 53585331.
  19. Theorem 5.1.
  20. Christopher Gerry & Peter Knight (2005). Introductory Quantum Optics (1st ed.). Cambridge University Press. p. 49. ISBN 978-0-521-52735-4.
  21. Exercise 5.5.
  22. Section 5.7.
  23. Wilhelm Magnus (1954). "On the exponential solution of differential equations for a linear operator". Communications on Pure and Applied Mathematics. 7 (4): 649–673. doi:10.1002/cpa.3160070404.
  24. Nathan Jacobson.
  25. F. Casas, A. Murua & M. Nadinic (2012). "Efficient computation of the Zassenhaus formula". Computer Physics Communications. 183 (11): 2386–2391. doi:10.1016/j.cpc.2012.06.006. arXiv:1204.0389. Bibcode:2012CoPhC.183.2386C.
  26. Proposition 3.35.