In applied mathematics, highly optimized tolerance (HOT) is a method of generating power law behavior in systems by including a global optimization principle. It was developed by Jean M. Carlson and John Doyle in the early 2000s.[1] For some systems that display a characteristic scale, a global optimization term can be added that then yields power law behavior. It has been used to generate and describe internet-like graphs and forest fire models, and may also apply to biological systems.
The following is taken from Sornette's book.
Consider a random variable $X$ that takes on values $x_i$ with probability $p_i$. Suppose further that each value is tied to another parameter $r_i$ (a resource allocated to event $i$) by

$$x_i = r_i^{-\beta}$$

for some fixed $\beta > 0$. The expected loss

$$L = \sum_{i=0}^{N-1} p_i x_i$$

is then minimized subject to the resource constraint

$$\sum_{i=0}^{N-1} r_i = \kappa.$$

Carrying out this constrained minimization (for example with a Lagrange multiplier) gives

$$p_i \propto x_i^{-(1+1/\beta)},$$

which is a power law relationship between $x_i$ and $p_i$.
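The derivation above can be checked numerically. The first-order (Lagrange) condition for minimizing $L = \sum_i p_i r_i^{-\beta}$ under $\sum_i r_i = \kappa$ gives $r_i \propto p_i^{1/(1+\beta)}$; the sketch below, with arbitrary illustrative probabilities $p_i$ and parameter values, allocates resources accordingly and fits the slope of $\log p_i$ against $\log x_i$, which should equal $-(1+1/\beta)$:

```python
import math

beta = 2.0    # sensitivity of event size to resources (illustrative value)
kappa = 10.0  # total resource budget (illustrative value)

# hypothetical event probabilities p_i (any positive weights summing to 1)
p = [0.05, 0.10, 0.15, 0.30, 0.40]

# Lagrange condition for minimizing L = sum(p_i * r_i**(-beta))
# subject to sum(r_i) = kappa: r_i is proportional to p_i**(1/(1+beta))
raw = [pi ** (1.0 / (1.0 + beta)) for pi in p]
scale = kappa / sum(raw)
r = [scale * v for v in raw]

# event sizes under the optimal allocation
x = [ri ** (-beta) for ri in r]

# fit the slope of log(p) vs log(x) by least squares;
# the derivation predicts slope = -(1 + 1/beta)
lx = [math.log(v) for v in x]
lp = [math.log(v) for v in p]
n = len(p)
mx, mp = sum(lx) / n, sum(lp) / n
slope = sum((a - mx) * (b - mp) for a, b in zip(lx, lp)) \
    / sum((a - mx) ** 2 for a in lx)
print(round(slope, 6))  # -(1 + 1/beta) = -1.5 for beta = 2
```

Because the relation $x_i = r_i^{-\beta}$ is exact here, the fitted slope matches $-(1+1/\beta)$ up to floating-point error, illustrating how the optimization produces the power law.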