Unconditional convergence

In mathematics, specifically functional analysis, a series is unconditionally convergent if all reorderings of the series converge to the same value. In contrast, a series is conditionally convergent if it converges but different orderings do not all converge to that same value. Unconditional convergence is equivalent to absolute convergence in finite-dimensional vector spaces, but is a weaker property in infinite dimensions.

Definition

Let $X$ be a topological vector space, let $I$ be an index set, and let $x_i \in X$ for all $i \in I$.

The series $\textstyle\sum_{i \in I} x_i$ is called unconditionally convergent to $x \in X$ if the set $I_0 := \left\{ i \in I : x_i \neq 0 \right\}$ is countable and, for every permutation $\sigma : I_0 \to I_0$ of $I_0 = \left\{ i_k \right\}_{k=1}^\infty$, the following relation holds:

$$\sum_{k=1}^\infty x_{\sigma(i_k)} = x.$$
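As a concrete illustration (a minimal numerical sketch, not part of the original text): the alternating harmonic series $\sum_{n=1}^\infty (-1)^{n+1}/n$ converges to $\ln 2$ in its natural order, but the classic rearrangement that takes one positive term followed by two negative terms converges to $\tfrac{1}{2}\ln 2$ instead, so the series is convergent but not unconditionally convergent.

```python
import math

# Terms of the alternating harmonic series: 1, -1/2, 1/3, -1/4, ...
N = 10**6
terms = [(-1) ** (n + 1) / n for n in range(1, N + 1)]

# Natural order: partial sums approach ln(2).
natural_order = sum(terms)

# Rearranged order: one positive term, then two negative terms.
positives = (t for t in terms if t > 0)  # 1, 1/3, 1/5, ...
negatives = (t for t in terms if t < 0)  # -1/2, -1/4, -1/6, ...
rearranged = 0.0
for _ in range(N // 4):  # each round consumes 1 positive and 2 negative terms
    rearranged += next(positives) + next(negatives) + next(negatives)

print(f"natural order: {natural_order:.6f}   (ln 2     = {math.log(2):.6f})")
print(f"rearranged:    {rearranged:.6f}   (ln(2)/2  = {math.log(2) / 2:.6f})")
```

The same multiset of terms produces two different limits under two different orderings, which is exactly what the definition rules out.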

Alternative definition

Unconditional convergence is often defined in an equivalent way: a series is unconditionally convergent if for every sequence $\left( \varepsilon_n \right)_{n=1}^\infty$ with $\varepsilon_n \in \{-1, +1\}$, the series

$$\sum_{n=1}^\infty \varepsilon_n x_n$$

converges.
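To see this criterion fail concretely (again an illustrative sketch, not from the original text): for $x_n = (-1)^{n+1}/n$, the choice $\varepsilon_n = (-1)^{n+1}$ gives $\varepsilon_n x_n = 1/n$, i.e. the harmonic series, whose partial sums diverge.

```python
# For x_n = (-1)^(n+1)/n, choosing eps_n = (-1)^(n+1) turns the series
# into the harmonic series sum 1/n, whose partial sums grow like ln(N),
# so the alternating harmonic series is not unconditionally convergent.
for N in (10**2, 10**4, 10**6):
    partial = sum(1 / n for n in range(1, N + 1))  # = sum of eps_n * x_n
    print(f"N = {N:>7}: partial sum = {partial:.4f}")
```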

If $X$ is a Banach space, every absolutely convergent series is unconditionally convergent, but the converse implication does not hold in general. Indeed, if $X$ is an infinite-dimensional Banach space, then by the Dvoretzky–Rogers theorem there always exists an unconditionally convergent series in this space that is not absolutely convergent. However, when $X = \mathbb{R}^n$, by the Riemann series theorem, the series $\sum_n x_n$ is unconditionally convergent if and only if it is absolutely convergent.
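A standard example of this gap in infinite dimensions (my illustration, using the usual textbook choice): in the Hilbert space $\ell^2$, take $x_n = e_n / n$ with $(e_n)$ the standard orthonormal basis. Then $\sum_n \|x_n\| = \sum_n 1/n$ diverges, so the series is not absolutely convergent; yet for every choice of signs, $\left\| \sum_{n=M}^{N} \varepsilon_n e_n / n \right\|^2 = \sum_{n=M}^{N} 1/n^2 \to 0$ by orthonormality, so the signed partial sums are Cauchy and the series converges unconditionally. A quick numerical check:

```python
import math

# In l^2 with x_n = e_n / n: the sum of norms diverges like ln(N), while
# the norm of the signed partial sum is sqrt(sum 1/n^2) for ANY choice of
# signs, since the e_n are orthonormal; it stays bounded by pi/sqrt(6).
N = 10**6
sum_of_norms = sum(1 / n for n in range(1, N + 1))               # ~ ln(N), unbounded
norm_of_signed_sum = math.sqrt(sum(1 / n**2 for n in range(1, N + 1)))

print(f"sum of norms up to N={N}:        {sum_of_norms:.4f}  (grows without bound)")
print(f"norm of signed sum (any signs):  {norm_of_signed_sum:.4f}"
      f"  (<= pi/sqrt(6) = {math.pi / math.sqrt(6):.4f})")
```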
