In mathematics, the topological entropy of a topological dynamical system is a nonnegative extended real number that is a measure of the complexity of the system. Topological entropy was first introduced in 1965 by Adler, Konheim and McAndrew. Their definition was modelled after the definition of the Kolmogorov–Sinai, or metric entropy. Later, Dinaburg and Rufus Bowen gave a different, weaker definition reminiscent of the Hausdorff dimension. The second definition clarified the meaning of the topological entropy: for a system given by an iterated function, the topological entropy represents the exponential growth rate of the number of distinguishable orbits of the iterates. An important variational principle relates the notions of topological and measure-theoretic entropy.
A topological dynamical system consists of a Hausdorff topological space X (usually assumed to be compact) and a continuous self-map f : X → X. Its topological entropy is a nonnegative extended real number that can be defined in various ways, which are known to be equivalent.
Let X be a compact Hausdorff topological space. For any finite open cover C of X, let H(C) be the logarithm (usually to base 2) of the smallest number of elements of C that cover X.[1] For two covers C and D, let
C \vee D = \{\, C_i \cap D_j : C_i \in C,\ D_j \in D \,\}

be their (minimal) common refinement, which consists of all the non-empty intersections of a set from C with a set from D, and similarly for multiple covers.
For any continuous map f: X → X, the following limit exists:
H(f, C) = \lim_{n \to \infty} \frac{1}{n}\, H(C \vee f^{-1}C \vee \ldots \vee f^{-n+1}C).
Then the topological entropy of f, denoted h(f), is defined to be the supremum of H(f,C) over all possible finite covers C of X.
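The existence of the limit defining H(f, C) is not obvious; the standard justification, sketched here (it is not spelled out in the text above), rests on subadditivity:

```latex
\text{Set } a_n = H\!\left(C \vee f^{-1}C \vee \cdots \vee f^{-n+1}C\right).
\text{A subcover realizing } a_m \text{, refined by the } f^{-m}\text{-preimage of a subcover realizing } a_n \text{, covers } X,
\text{ so } a_{m+n} \le a_m + a_n.
\text{Fekete's subadditive lemma then gives }
\lim_{n\to\infty} \frac{a_n}{n} = \inf_{n\ge 1} \frac{a_n}{n} \le a_1 < \infty.
```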
The parts of C may be viewed as symbols that (partially) describe the position of a point x in X: all points x ∈ Ci are assigned the symbol Ci . Imagine that the position of x is (imperfectly) measured by a certain device and that each part of C corresponds to one possible outcome of the measurement.
The number H(C \vee f^{-1}C \vee \ldots \vee f^{-n+1}C) then counts the minimal number of "words" of length n needed to record the sequence of measurement outcomes along the first n iterates of a point. The topological entropy is thus the average (per iteration) amount of information needed to describe long iterations of the map f.
This definition[2][3][4] uses a metric on X (actually, a uniform structure would suffice). This is a narrower definition than that of Adler, Konheim, and McAndrew,[5] as it requires the additional metric structure on the topological space (but it is independent of the choice of metric generating the given topology). However, in practice, the Bowen–Dinaburg topological entropy is usually much easier to calculate.
Let (X, d) be a compact metric space and f: X → X be a continuous map. For each natural number n, a new metric dn is defined on X by the formula
d_n(x, y) = \max\{\, d(f^i(x), f^i(y)) : 0 \leq i < n \,\}.
Given any ε > 0 and n ≥ 1, two points of X are ε-close with respect to this metric if their first n iterates are ε-close. This metric allows one to distinguish in a neighborhood of an orbit the points that move away from each other during the iteration from the points that travel together. A subset E of X is said to be (n, ε)-separated if each pair of distinct points of E is at least ε apart in the metric dn. Denote by N(n, ε) the maximum cardinality of an (n, ε)-separated set. The topological entropy of the map f is defined by
h(f) = \lim_{\epsilon \to 0} \left( \limsup_{n \to \infty} \frac{1}{n} \log N(n, \epsilon) \right).
Since X is compact, N(n, ε) is finite and represents the number of distinguishable orbit segments of length n, assuming that we cannot distinguish points within ε of one another. A straightforward argument shows that the limit defining h(f) always exists in the extended real line (but could be infinite). This limit may be interpreted as the measure of the average exponential growth of the number of distinguishable orbit segments. In this sense, it measures complexity of the topological dynamical system (X, f). Rufus Bowen extended this definition of topological entropy in a way which permits X to be non-compact under the assumption that the map f is uniformly continuous.
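The Bowen–Dinaburg definition can be explored numerically. The following sketch (an illustrative example not discussed above) greedily builds (n, ε)-separated sets over a finite grid for the doubling map x ↦ 2x mod 1 on the circle, whose topological entropy is log 2; the growth rate of log N(n, ε) in n approximates the entropy. This is a numerical experiment, not a proof:

```python
import math

def orbit(x, n):
    """Return x, f(x), ..., f^{n-1}(x) for the doubling map f(x) = 2x mod 1."""
    out = []
    for _ in range(n):
        out.append(x)
        x = (2 * x) % 1.0
    return out

def d_n(ox, oy):
    """Bowen metric d_n: maximum circle distance along two orbit segments."""
    best = 0.0
    for a, b in zip(ox, oy):
        t = abs(a - b)
        best = max(best, min(t, 1.0 - t))  # distance on the circle R/Z
    return best

def n_separated(n, eps, grid=1024):
    """Greedy (n, eps)-separated set over a uniform grid: a lower bound for N(n, eps)."""
    kept = []
    for j in range(grid):
        o = orbit(j / grid, n)
        if all(d_n(o, p) >= eps for p in kept):
            kept.append(o)
    return len(kept)

# The slope of log N(n, eps) in n approximates h(f) = log 2.
eps, n1, n2 = 0.25, 3, 7
slope = (math.log(n_separated(n2, eps))
         - math.log(n_separated(n1, eps))) / (n2 - n1)
```

Taking the slope between two values of n cancels the ε-dependent constant factor in N(n, ε), so even these small values of n recover log 2 well.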
If f is an expansive homeomorphism of a compact metric space X and C is a topological generator for f, then the topological entropy of f relative to C equals the topological entropy of f, i.e.

h(f) = H(f, C).
Let f : X → X be a continuous map of a compact metric space X. For an f-invariant Borel probability measure \mu on X, let h_\mu(f) denote the measure-theoretic entropy of f with respect to \mu, and let M(X, f) denote the set of all f-invariant Borel probability measures on X. The variational principle states that the topological entropy of f is the supremum of the measure-theoretic entropies h_\mu over M(X, f):

h(f) = \sup_{\mu \in M(X, f)} h_\mu(f).

In general the supremum need not be attained, since the entropy map

\mu \mapsto h_\mu(f) : M(X, f) \to \R

need not be upper semicontinuous. A measure \mu in M(X, f) with h_\mu(f) = h(f) is called a measure of maximal entropy for f. If f admits a unique measure of maximal entropy \mu, then f is called intrinsically ergodic and \mu is ergodic.
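The variational principle can be checked by hand in a simple family. For the full two-sided 2-shift, whose topological entropy is log 2, the Bernoulli (p, 1 − p) measures are shift-invariant with measure-theoretic entropy −p log p − (1 − p) log(1 − p). The following sketch scans this family and confirms that the entropies stay below log 2 and peak at p = 1/2 (the family of Bernoulli measures does not exhaust M(X, f), so this illustrates the inequality, not the full supremum):

```python
import math

def bernoulli_entropy(p):
    """Measure-theoretic entropy of the Bernoulli (p, 1-p) measure for the 2-shift."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

# Every value in the family is at most the topological entropy log 2,
# and the maximum log 2 is attained at p = 1/2.
ps = [i / 1000 for i in range(1001)]
vals = [bernoulli_entropy(p) for p in ps]
best = max(vals)
argbest = ps[vals.index(best)]
```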
Let \sigma : \Sigma_k \to \Sigma_k denote the full two-sided k-shift on the symbol set \{1, ..., k\}, given by (x_n)_{n \in \Z} \mapsto (x_{n-1})_{n \in \Z}. Let C = \{[1], ..., [k]\} denote the partition of \Sigma_k into cylinders over the 0th coordinate. Then

\bigvee_{j=0}^{n-1} \sigma^{-j}(C)

is a partition of \Sigma_k for every n \in \N, and it has k^n elements. These partitions are open covers, and C is a topological generator, so

h(\sigma) = H(\sigma, C) = \lim_{n \to \infty} \frac{1}{n} \log k^n = \log k.
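The count of k^n cylinders of rank n can be verified directly by enumerating words; a small sketch (the symbol set and parameters are illustrative):

```python
import math
from itertools import product

k, n = 3, 5
# The rank-n cylinders of the full k-shift are indexed by words of length n.
cylinders = list(product(range(1, k + 1), repeat=n))
entropy_estimate = math.log(len(cylinders)) / n  # = log k, for every n
```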
The (Kolmogorov–Sinai) measure-theoretic entropy of the Bernoulli \left(\tfrac{1}{k}, \ldots, \tfrac{1}{k}\right)-measure is also \log k, so by the variational principle it is a measure of maximal entropy for the shift; it can further be shown that no other measure of maximal entropy exists.
More generally, let A be a k × k matrix with entries in \{0, 1\}, and let \sigma : \Sigma_A \to \Sigma_A be the restriction of the shift to the corresponding subshift of finite type. Then

h(\sigma) = \log \lambda,

where \lambda is the largest real eigenvalue of A.
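As a concrete instance (chosen here for illustration, not taken from the text above), consider the golden-mean shift, the subshift of finite type that forbids the word 11. Since the number of admissible words of length n is the sum of the entries of A^{n−1}, the entropy log λ can be approximated by counting words; here λ is the golden ratio:

```python
import math

def matmul(A, B):
    """Product of two square integer matrices (plain lists of lists)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def word_count(A, n):
    """Number of admissible words of length n in the subshift of finite type Sigma_A."""
    if n == 1:
        return len(A)
    P = A
    for _ in range(n - 2):
        P = matmul(P, A)
    return sum(sum(row) for row in P)

# Golden-mean shift on symbols {0, 1}: A[i][j] = 1 iff j may follow i; "11" is forbidden.
A = [[1, 1], [1, 0]]
phi = (1 + math.sqrt(5)) / 2  # largest real eigenvalue of A
# Word counts are Fibonacci numbers, so successive ratios converge to phi.
est = math.log(word_count(A, 31) / word_count(A, 30))
```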