Heaviside step function
General definition: H(x) = \begin{cases} 0, & x < 0 \\ 1, & x > 0 \end{cases}
Fields of application: Operational calculus
The Heaviside step function, or the unit step function, usually denoted by H or θ (but sometimes u or 𝟙), is a step function named after Oliver Heaviside, the value of which is zero for negative arguments and one for positive arguments. Different conventions concerning the value H(0) are in use. It is an example of the general class of step functions, all of which can be represented as linear combinations of translations of this one.
The function was originally developed in operational calculus for the solution of differential equations, where it represents a signal that switches on at a specified time and stays switched on indefinitely. Oliver Heaviside, who developed the operational calculus as a tool in the analysis of telegraphic communications, represented the function as 1.
Taking the convention that H(0) = 1, the Heaviside function may be defined as:

H(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}
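Under the convention H(0) = 1, the piecewise definition can be sketched directly in a few lines of Python:

```python
def heaviside(x: float) -> float:
    """Heaviside step function with the convention H(0) = 1:
    returns 1.0 for x >= 0 and 0.0 for x < 0."""
    return 1.0 if x >= 0 else 0.0

print(heaviside(-2.0))  # 0.0
print(heaviside(0.0))   # 1.0 (by the chosen convention)
print(heaviside(3.5))   # 1.0
```

For array inputs, `numpy.heaviside(x, h0)` computes the same function, with the value at zero supplied explicitly as the second argument.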
The Dirac delta function is the derivative of the Heaviside function:

\delta(x) = \frac{d}{dx} H(x)

Hence the Heaviside function can be considered to be the integral of the Dirac delta function. This is sometimes written as

H(x) = \int_{-\infty}^{x} \delta(s) \, ds

although this expansion may not hold (or even make sense) for x = 0, depending on which formalism one uses to give meaning to integrals involving \delta. In this context, the Heaviside function is the cumulative distribution function of a random variable which is almost surely 0. (See Constant random variable.)
Approximations to the Heaviside step function are of use in biochemistry and neuroscience, where logistic approximations of step functions (such as the Hill and the Michaelis–Menten equations) may be used to approximate binary cellular switches in response to chemical signals.
For a smooth approximation to the step function, one can use the logistic function

H(x) \approx \tfrac{1}{2} + \tfrac{1}{2}\tanh(kx) = \frac{1}{1 + e^{-2kx}},

where a larger k corresponds to a sharper transition at x = 0. If we take H(0) = \tfrac{1}{2}, equality holds in the limit:

H(x) = \lim_{k \to \infty} \tfrac{1}{2}\bigl(1 + \tanh kx\bigr) = \lim_{k \to \infty} \frac{1}{1 + e^{-2kx}}.
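The behavior of the logistic approximation 1/(1 + e^{-2kx}) can be checked numerically; a minimal sketch, where k is the sharpness parameter from the text:

```python
import math

def logistic_step(x: float, k: float) -> float:
    """Logistic approximation 1/(1 + exp(-2kx)) to the step function;
    larger k gives a sharper transition at x = 0."""
    return 1.0 / (1.0 + math.exp(-2.0 * k * x))

# The approximation tends to the step values 0 and 1 as k grows.
for k in (1.0, 10.0, 100.0):
    print(k, logistic_step(-0.1, k), logistic_step(0.1, k))

# At x = 0 the logistic value is exactly 1/2 for every k,
# matching the convention H(0) = 1/2.
print(logistic_step(0.0, 100.0))  # 0.5
```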
There are many other smooth, analytic approximations to the step function. Among the possibilities are:

H(x) = \lim_{k \to \infty} \left( \tfrac{1}{2} + \tfrac{1}{\pi} \arctan kx \right)

H(x) = \lim_{k \to \infty} \left( \tfrac{1}{2} + \tfrac{1}{2} \operatorname{erf} kx \right)
These limits hold pointwise and in the sense of distributions. In general, however, pointwise convergence need not imply distributional convergence, nor does distributional convergence imply pointwise convergence. (However, if all members of a pointwise convergent sequence of functions are uniformly bounded by some "nice" function, then convergence holds in the sense of distributions too.)
In general, any cumulative distribution function of a continuous probability distribution that is peaked around zero and has a parameter that controls for variance can serve as an approximation, in the limit as the variance approaches zero. For example, all three of the above approximations are cumulative distribution functions of common probability distributions: the logistic, Cauchy and normal distributions, respectively.
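As a sketch, the three approximations can be evaluated as the cumulative distribution functions named above, using only the standard library (`math.erf` supplies the normal CDF; k plays the role of an inverse width):

```python
import math

def logistic_cdf(x: float, k: float) -> float:
    """CDF of a logistic distribution: 1/(1 + exp(-2kx))."""
    return 1.0 / (1.0 + math.exp(-2.0 * k * x))

def cauchy_cdf(x: float, k: float) -> float:
    """CDF of a Cauchy distribution: 1/2 + arctan(kx)/pi."""
    return 0.5 + math.atan(k * x) / math.pi

def normal_cdf(x: float, k: float) -> float:
    """CDF of a normal distribution: (1 + erf(kx))/2."""
    return 0.5 * (1.0 + math.erf(k * x))

# All three approach the step function as k grows, and all
# take the value 1/2 at x = 0.
for f in (logistic_cdf, cauchy_cdf, normal_cdf):
    print(f.__name__, f(-0.5, 50.0), f(0.5, 50.0))
```

Note that the Cauchy CDF approaches its limits only polynomially in k, while the logistic and normal CDFs converge exponentially fast away from zero.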
Often an integral representation of the Heaviside step function is useful:

H(x) = \lim_{\varepsilon \to 0^{+}} \frac{1}{2\pi i} \int_{-\infty}^{\infty} \frac{e^{i x \tau}}{\tau - i\varepsilon} \, d\tau = \lim_{\varepsilon \to 0^{+}} -\frac{1}{2\pi i} \int_{-\infty}^{\infty} \frac{e^{-i x \tau}}{\tau + i\varepsilon} \, d\tau,

where the second representation is easy to deduce from the first, given that the step function is real and thus is its own complex conjugate.
Since H is usually used in integration, and the value of a function at a single point does not affect its integral, it rarely matters what particular value is chosen for H(0). Indeed, when H is considered as a distribution or an element of L^\infty (see Lp space), it does not even make sense to talk of a value at zero, since such objects are only defined almost everywhere. If using some analytic approximation (as in the examples above), then often whatever happens to be the relevant limit at zero is used.
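The insensitivity of integrals to the choice of H(0) can be illustrated with a small Riemann-sum sketch (the interval [-1, 1] and the grid sizes are illustrative choices):

```python
def H0(x: float) -> float:
    """Heaviside with the convention H(0) = 0."""
    return 1.0 if x > 0 else 0.0

def H1(x: float) -> float:
    """Heaviside with the convention H(0) = 1."""
    return 1.0 if x >= 0 else 0.0

def riemann(f, a: float, b: float, n: int) -> float:
    """Left Riemann sum of f over [a, b] with n subintervals."""
    dx = (b - a) / n
    return sum(f(a + i * dx) for i in range(n)) * dx

# The two conventions differ in at most one sample, so the two
# integrals differ by at most one grid cell, dx = (b - a)/n,
# which vanishes as the grid is refined.
for n in (10, 1000, 100000):
    print(n, abs(riemann(H0, -1.0, 1.0, n) - riemann(H1, -1.0, 1.0, n)))
```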
There exist various reasons for choosing a particular value of H(0).