In mathematics,
$$\sum_{k=0}^{\infty} (-1)^{k}\, k! = 1 - 1 + 2 - 6 + 24 - 120 + \cdots$$
is a divergent series.
This series was first considered by Euler, who applied summability methods to assign a finite value to the series.[1] The series is a sum of factorials that are alternately added or subtracted. One way to assign a value to this divergent series is by using Borel summation, where one formally writes
$$\sum_{k=0}^{\infty} (-1)^{k} k! = \sum_{k=0}^{\infty} (-1)^{k} \int_{0}^{\infty} x^{k} e^{-x}\,dx.$$
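The rewriting above rests on the Gamma-function identity $k! = \int_{0}^{\infty} x^{k} e^{-x}\,dx$. A quick Python sketch checks it numerically (the substitution $x = v/(1-v)$ and Simpson's rule are illustrative choices, not from the source):

```python
# Numerical check of k! = ∫_0^∞ x^k e^{-x} dx using only the standard
# library; x = v/(1 - v) maps [0, ∞) onto [0, 1) so a finite-interval
# rule applies.
import math

def integral_xk_exp(k, n=20000):
    """Approximate ∫_0^∞ x^k e^{-x} dx with composite Simpson's rule."""
    def f(v):
        if v >= 1.0:
            return 0.0  # the integrand vanishes at the endpoint
        x = v / (1.0 - v)
        return x**k * math.exp(-x) / (1.0 - v) ** 2
    h = 1.0 / n
    s = f(0.0) + f(1.0)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(i * h)
    return s * h / 3.0

for k in range(6):
    print(k, math.factorial(k), round(integral_xk_exp(k), 6))
```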
If summation and integration are interchanged (ignoring that neither side converges), one obtains:
$$\sum_{k=0}^{\infty} (-1)^{k} k! = \int_{0}^{\infty} \left[\sum_{k=0}^{\infty} (-x)^{k}\right] e^{-x}\,dx.$$
The summation in the square brackets converges only when $|x| < 1$, where it equals $\tfrac{1}{1+x}$. Analytically continuing $\tfrac{1}{1+x}$ to all real $x \geq 0$ yields a convergent integral for the sum:
$$\begin{aligned} \sum_{k=0}^{\infty} (-1)^{k} k! &= \int_{0}^{\infty} \frac{e^{-x}}{1+x}\,dx \\ &= e\,E_{1}(1) \approx 0.596347362323194074341078499369\ldots \end{aligned}$$
where $E_{1}(z)$ is the exponential integral. This is by definition the Borel sum of the series, and it equals the Gompertz constant.
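This value can be checked numerically. The stdlib-only sketch below (the substitution $x = v/(1-v)$ and the quadrature rule are illustrative choices) approximates the Borel integral and should land near the Gompertz constant:

```python
# Numerical evaluation of the Borel sum ∫_0^∞ e^{-x}/(1+x) dx, which
# should reproduce the Gompertz constant ≈ 0.596347...  Substituting
# x = v/(1 - v) turns the integrand into e^{-v/(1-v)}/(1 - v) on [0, 1].
import math

def borel_sum(n=20000):
    def f(v):
        if v >= 1.0:
            return 0.0  # the integrand decays to 0 at the endpoint
        return math.exp(-v / (1.0 - v)) / (1.0 - v)
    h = 1.0 / n
    s = f(0.0) + f(1.0)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(i * h)
    return s * h / 3.0

print(borel_sum())  # ≈ 0.5963473623...
```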
Consider the coupled system of differential equations
$$\dot{x}(t) = x(t) - y(t), \qquad \dot{y}(t) = -y(t)^{2},$$
where dots denote derivatives with respect to t.
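The second equation decouples from the first and can be integrated by separating variables:
$$\frac{dy}{y^{2}} = -\,dt \quad\Longrightarrow\quad \frac{1}{y} = t - C \quad\Longrightarrow\quad y(t) = \frac{1}{t - C},$$
and the choice $C = 0$ gives $y(t) = \tfrac{1}{t}$, which decays to the equilibrium $y = 0$ as $t \to \infty$.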
The solution with stable equilibrium at $(x, y) = (0, 0)$ as $t \to \infty$ has $y(t) = \tfrac{1}{t}$, and substituting it into the first equation gives a formal series solution
$$x(t) = \sum_{n=1}^{\infty} (-1)^{n+1} \frac{(n-1)!}{t^{n}}.$$
Observe that $x(1)$ is precisely Euler's series.
On the other hand, the system of differential equations has a solution
$$x(t) = e^{t} \int_{t}^{\infty} \frac{e^{-u}}{u}\,du.$$
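A single integration by parts already produces the leading term of the formal series:
$$e^{t}\int_{t}^{\infty}\frac{e^{-u}}{u}\,du = e^{t}\left(\frac{e^{-t}}{t} - \int_{t}^{\infty}\frac{e^{-u}}{u^{2}}\,du\right) = \frac{1}{t} - e^{t}\int_{t}^{\infty}\frac{e^{-u}}{u^{2}}\,du,$$
and repeating the step on the remaining integral generates the terms $-\tfrac{1}{t^{2}}, +\tfrac{2}{t^{3}}, \ldots$ in turn.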
By successively integrating by parts, the formal power series is recovered as an asymptotic approximation to this expression for $x(t)$. Euler argues (more or less) that since the formal series and the integral both describe the same solution to the differential equations, they should equal each other at $t = 1$; that is,
$$\sum_{n=1}^{\infty} (-1)^{n+1} (n-1)! = e \int_{1}^{\infty} \frac{e^{-u}}{u}\,du.$$
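The asymptotic character of the argument can also be seen numerically: the substitution $u = t + w$ rewrites $e^{t}\int_{t}^{\infty} e^{-u}/u\,du$ as $\int_{0}^{\infty} e^{-w}/(t+w)\,dw$, and at, say, $t = 10$ the partial sums of the formal series first approach this integral and then diverge once $n$ exceeds roughly $t$. A stdlib-only sketch (the quadrature details are illustrative choices):

```python
# The formal series x(t) = Σ (-1)^{n+1} (n-1)!/t^n is asymptotic: its
# partial sums approach the true solution x(t) = ∫_0^∞ e^{-w}/(t+w) dw
# until n ≈ t, after which the factorial growth takes over.
import math

def x_integral(t, n=20000):
    """Approximate ∫_0^∞ e^{-w}/(t + w) dw with Simpson's rule."""
    def f(v):
        if v >= 1.0:
            return 0.0  # the integrand vanishes at the endpoint
        w = v / (1.0 - v)
        return math.exp(-w) / ((t + w) * (1.0 - v) ** 2)
    h = 1.0 / n
    s = f(0.0) + f(1.0)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(i * h)
    return s * h / 3.0

t = 10.0
exact = x_integral(t)
partial = 0.0
for n in range(1, 21):
    partial += (-1) ** (n + 1) * math.factorial(n - 1) / t**n
    print(n, abs(partial - exact))  # error shrinks until n ≈ t, then grows
```

At $t = 1$ the same integral is $\int_{0}^{\infty} e^{-w}/(1+w)\,dw$, recovering the Borel value above.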