Logical depth
Logical depth is a measure of complexity for individual strings devised by Charles H. Bennett, based on the computational complexity of an algorithm that can recreate a given piece of information. It differs from Kolmogorov complexity in that it considers the running time of a nearly minimal-length program, rather than the length of the minimal program.
Informally, the logical depth of a string $x$ to a significance level $s$ is the time required to compute $x$ by a program no more than $s$ bits longer than the shortest program that computes $x$.[1]

Formally, let $p^*$ be the shortest program that computes a string $x$ on some universal computer $U$. Then the logical depth of $x$ to the significance level $s$ is given by

$$\min\{T(p) : (|p| - |p^*| < s) \wedge (U(p) = x)\},$$

where $T(p)$ is the number of computation steps that $p$ made on $U$ to produce $x$ and halt.
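Logical depth is not computable in general, since finding the shortest program $p^*$ is as hard as computing Kolmogorov complexity. The definition can still be illustrated by exhaustive search on a toy machine with a cap on program length. The following Python sketch is hypothetical: the 2-bit opcode interpreter run and the brute-force logical_depth below are inventions for this example, not part of Bennett's construction.

```python
from itertools import product

def run(program, step_limit=1000):
    """Interpret a tuple of bits on a toy 'universal computer' U.

    Opcodes, 2 bits each: 00 emits 0, 01 emits 1,
    10 re-emits the previous output bit, 11 halts.
    Returns (output, steps) if the program halts cleanly, else None.
    """
    out, steps, i = [], 0, 0
    while i + 1 < len(program):
        op = (program[i], program[i + 1])
        i += 2
        steps += 1
        if steps > step_limit:
            return None
        if op == (0, 0):
            out.append('0')
        elif op == (0, 1):
            out.append('1')
        elif op == (1, 0):
            if not out:          # nothing to repeat yet: invalid program
                return None
            out.append(out[-1])
        else:                    # (1, 1): halt
            return ''.join(out), steps
    return None                  # ran off the end without halting

def logical_depth(x, s, max_len=20):
    """Brute-force depth of x at significance level s on the toy machine:
    min T(p) over programs p with |p| - |p*| < s and U(p) = x."""
    # Step 1: find |p*|, the length of a shortest program that prints x.
    shortest = None
    for n in range(2, max_len + 1, 2):
        for p in product((0, 1), repeat=n):
            r = run(p)
            if r is not None and r[0] == x:
                shortest = n
                break
        if shortest is not None:
            break
    if shortest is None:
        return None              # no program up to max_len prints x
    # Step 2: minimise running time over all programs within s bits of |p*|.
    best = None
    for n in range(2, min(shortest + s, max_len + 1), 2):
        for p in product((0, 1), repeat=n):
            r = run(p)
            if r is not None and r[0] == x and (best is None or r[1] < best):
                best = r[1]
    return best

print(logical_depth('0001', s=3))  # -> 5: four emit instructions plus the halt
```

Note that on this toy machine every executed instruction emits at most one output bit, so the depth of a string of length $n$ degenerates to $n + 1$ regardless of $s$; an instruction set with loops or block copies would be needed for depth to diverge from string length, which is the regime where Bennett's measure becomes informative.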
Notes and References
- Antunes, Luís; Bauwens, Bruno; Souto, André; Teixeira, Andreia (2017). "Sophistication vs Logical Depth". Theory of Computing Systems 60 (2): 280–298. doi:10.1007/s00224-016-9672-6. arXiv:1304.8046.