Small control property explained

In applied mathematics, specifically in nonlinear control theory, a nonlinear system of the form

\dot{x} = f(x, u)

is said to satisfy the small control property if for every \varepsilon > 0 there exists a \delta > 0 such that for every x with 0 < \|x\| < \delta there exists a control u with \|u\| < \varepsilon for which the time derivative of the system's control Lyapunov function is negative at that point.

In other words, even if the control input is required to be arbitrarily small, every starting configuration close enough to the origin of the system can still be asymptotically stabilized by such an input.
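
As a simple illustrative sketch (using the scalar integrator as an assumed example system, not one named in the text), consider

\dot{x} = u

with the candidate control Lyapunov function V(x) = \tfrac{1}{2}x^{2}, so that \dot{V} = x\,u. Given any \varepsilon > 0, choose \delta = \varepsilon; then every x with 0 < \|x\| < \delta admits the input u = -x, which satisfies \|u\| = \|x\| < \varepsilon and gives \dot{V} = -x^{2} < 0. Hence this system satisfies the small control property.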