Ramp function

The ramp function is a unary real function whose graph is shaped like a ramp. It can be defined in several equivalent ways, for example "0 for negative inputs; output equals input for non-negative inputs". The term "ramp" is also used for other functions obtained by scaling and shifting this one; the function in this article is the unit ramp function (slope 1, starting at 0).

In mathematics, the ramp function is also known as the positive part.

In machine learning, it is commonly known as a ReLU activation function[1][2] or a rectifier, in analogy to half-wave rectification in electrical engineering. In statistics (when used as a likelihood function) it is known as a tobit model.

This function has numerous applications in mathematics and engineering, and goes by various names, depending on the context. There are differentiable variants of the ramp function.

Definitions

The ramp function may be defined analytically in several equivalent ways. Possible definitions include the piecewise form

R(x) := \begin{cases} x, & x \ge 0; \\ 0, & x < 0 \end{cases}

and the equivalent expressions R(x) = \max(x, 0) = \frac{x + |x|}{2} = x \, H(x), where H(x) is the Heaviside step function.

It can also be approximated as closely as desired by a smooth function, for example the softplus

\frac{\ln(1 + e^{ax})}{a},

which converges to R(x) by choosing an increasingly large positive value

a > 0

.
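These equivalent forms, and the smooth-approximation behaviour, can be checked numerically. The sketch below is plain Python; the softplus form \ln(1 + e^{ax})/a is used here as one possible smooth approximant:

```python
import math

def ramp(x):
    """Unit ramp (ReLU): 0 for negative inputs, the input itself otherwise."""
    return x if x >= 0 else 0.0

def ramp_max(x):
    # equivalent definition via the max function
    return max(x, 0.0)

def ramp_abs(x):
    # equivalent definition as the mean of x and |x|
    return (x + abs(x)) / 2.0

def softplus(x, a):
    # smooth approximation ln(1 + e^(a*x)) / a; approaches the ramp as a grows
    return math.log1p(math.exp(a * x)) / a

# the three exact definitions agree everywhere
for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    assert ramp(x) == ramp_max(x) == ramp_abs(x)

# with a = 50 the softplus already matches the ramp to within 1%
assert abs(softplus(1.0, 50.0) - 1.0) < 0.01
assert abs(softplus(-1.0, 50.0)) < 0.01
```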

Applications

The ramp function has numerous applications in engineering, such as in the theory of digital signal processing.

In finance, the payoff of a call option is a ramp (shifted by strike price). Horizontally flipping a ramp yields a put option, while vertically flipping (taking the negative) corresponds to selling or being "short" an option. In finance, the shape is widely called a "hockey stick", due to the shape being similar to an ice hockey stick.
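These payoff shapes can be sketched directly as shifted and flipped ramps (plain Python; the prices and strike below are hypothetical illustration values):

```python
def call_payoff(spot, strike):
    # long call: a ramp shifted right by the strike price
    return max(spot - strike, 0.0)

def put_payoff(spot, strike):
    # long put: the horizontally flipped ramp
    return max(strike - spot, 0.0)

def short_call_payoff(spot, strike):
    # selling ("shorting") the option flips the ramp vertically
    return -call_payoff(spot, strike)

# hypothetical strike of 100
assert call_payoff(120.0, 100.0) == 20.0   # in the money
assert call_payoff(80.0, 100.0) == 0.0     # out of the money
assert put_payoff(80.0, 100.0) == 20.0
assert short_call_payoff(120.0, 100.0) == -20.0
```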

In statistics, hinge functions of multivariate adaptive regression splines (MARS) are ramps, and are used to build regression models.

Analytic properties

Non-negativity

On its whole domain the function is non-negative, so its absolute value is itself, i.e.

\forall x \in \Reals: R(x) \geq 0

and

\left| R(x) \right| = R(x).

Derivative

Its derivative is the Heaviside step function:

R'(x) = H(x) \quad \text{for } x \ne 0.

Second derivative

The ramp function satisfies the differential equation

\frac{d^2}{dx^2} R(x - x_0) = \delta(x - x_0),

where \delta(x) is the Dirac delta. This means that R(x) is a Green's function for the second derivative operator. Thus, any function f(x) with an integrable second derivative f''(x) will satisfy the equation

f(x) = f(a) + (x - a) f'(a) + \int_a^b R(x - s) f''(s) \, ds \quad \text{for } a < x < b.
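This Green's-function identity can be verified numerically. The sketch below uses plain Python with a midpoint Riemann sum; the choices f = \sin, a = 0, b = 2 are arbitrary illustration values:

```python
import math

def ramp(x):
    return max(x, 0.0)

def reconstruct(x, a=0.0, b=2.0, n=20000):
    # f(x) = f(a) + (x - a) f'(a) + ∫_a^b R(x - s) f''(s) ds, here with f = sin,
    # so f' = cos and f'' = -sin; the integral is a midpoint Riemann sum
    h = (b - a) / n
    integral = 0.0
    for i in range(n):
        s = a + (i + 0.5) * h
        integral += ramp(x - s) * (-math.sin(s)) * h
    return math.sin(a) + (x - a) * math.cos(a) + integral

# agrees with sin(x) for a < x < b
assert abs(reconstruct(1.0) - math.sin(1.0)) < 1e-4
```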

Fourier transform

The Fourier transform of the ramp function is

\mathcal{F}\big\{R(x)\big\}(f) = \int_{-\infty}^{\infty} R(x) e^{-2\pi i f x} \, dx = \frac{i \delta'(f)}{4\pi} - \frac{1}{4\pi^2 f^2},

where \delta'(f) is the derivative of the Dirac delta (which appears in this formula).

Laplace transform

The single-sided Laplace transform of R(x) is given as follows,[3]

\mathcal{L}\big\{R(x)\big\}(s) = \int_0^{\infty} e^{-sx} R(x) \, dx = \frac{1}{s^2}.
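This transform can be checked numerically. The sketch below uses a plain-Python midpoint sum; the truncation point 60 and the sample values of s are arbitrary illustration choices:

```python
import math

def laplace_ramp(s, upper=60.0, n=200000):
    # midpoint Riemann sum of ∫_0^upper e^(-s x) R(x) dx; on x >= 0 the ramp
    # is just x, and for s around 1 the tail beyond `upper` is negligible
    h = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += math.exp(-s * x) * x * h
    return total

# L{R(x)}(s) = 1/s^2, e.g. 1/4 at s = 2
assert abs(laplace_ramp(2.0) - 0.25) < 1e-4
```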

Algebraic properties

Iteration invariance

Every iterate of the ramp mapping is the ramp itself, because the output is already non-negative and so is left unchanged by a further application: R \big(R(x) \big) = R(x).
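A quick sketch of this idempotence property in plain Python:

```python
def ramp(x):
    # unit ramp: output is always non-negative
    return max(x, 0.0)

# applying the ramp twice (or any number of times) changes nothing,
# because the first application already produces a non-negative value
for x in (-3.0, -0.5, 0.0, 2.0):
    assert ramp(ramp(x)) == ramp(x)
```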

Notes and References

  1. Brownlee, Jason (8 January 2019). "A Gentle Introduction to the Rectified Linear Unit (ReLU)". Machine Learning Mastery. Retrieved 8 April 2021.
  2. Liu, Danqing (30 November 2017). "A Practical Guide to ReLU". Medium. Retrieved 8 April 2021.
  3. "The Laplace Transform of Functions". lpsa.swarthmore.edu. Retrieved 5 April 2019.