In mathematics, the Dirichlet–Jordan test gives sufficient conditions for a real-valued, periodic function f to be equal to the sum of its Fourier series at a point of continuity. Moreover, the behavior of the Fourier series at points of discontinuity is determined as well: there the series converges to the midpoint of the left and right limits of the function. It is one of many conditions for the convergence of Fourier series.
The original test was established by Peter Gustav Lejeune Dirichlet in 1829, for piecewise monotone functions (functions with a finite number of sections per period each of which is monotonic). It was extended in the late 19th century by Camille Jordan to functions of bounded variation in each period (any function of bounded variation is the difference of two monotonically increasing functions).
The Dirichlet–Jordan test states that if a periodic function $f(x)$ is of bounded variation on a period, then its Fourier series partial sums $S_nf(x)$ converge, as $n\to\infty$, at each point of the domain to
$$\lim_{n\to\infty} S_nf(x) = \frac{1}{2}\bigl(f(x^+)+f(x^-)\bigr).$$
In particular, if $f$ is continuous at $x$, then the Fourier series converges to $f(x)$. Moreover, if $f$ is continuous everywhere, then the convergence is uniform.
Stated in terms of a periodic function of period 2π, the Fourier series coefficients are defined as
$$\hat{f}(k) = \frac{1}{2\pi}\int_{-\pi}^{\pi} f(x)\,e^{-ikx}\,dx,$$
and the partial sums of the Fourier series are
$$S_nf(x) = \sum_{|k|\le n} \hat{f}(k)\,e^{ikx}.$$
The analogous statement holds irrespective of what the period of f is, or which version of the Fourier series is chosen.
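As a numerical illustration of the statement above (an addition to this revision, not part of the source text), the following Python sketch approximates the coefficients by trapezoidal quadrature and evaluates the partial sums for the 2π-periodic extension of e^x, a function of bounded variation with a jump at x = π. The helper name fourier_partial_sum, the grid size, and the choice of example function are arbitrary illustrative choices.

```python
import numpy as np

def fourier_partial_sum(f, n, x, n_grid=20001):
    """Approximate S_n f(x) = sum_{|k| <= n} c_k e^{ikx} for a 2*pi-periodic f,
    with c_k = (1/(2*pi)) * int_{-pi}^{pi} f(t) e^{-ikt} dt computed by the
    trapezoidal rule on a uniform grid."""
    t, dt = np.linspace(-np.pi, np.pi, n_grid, retstep=True)
    w = np.full(n_grid, dt)
    w[0] = w[-1] = dt / 2.0                      # trapezoidal weights
    ft = f(t)
    total = 0.0
    for k in range(-n, n + 1):
        c_k = np.sum(w * ft * np.exp(-1j * k * t)) / (2.0 * np.pi)
        total += c_k * np.exp(1j * k * x)
    return total.real

# f(x) = exp(x) on (-pi, pi), extended 2*pi-periodically: it has bounded
# variation, with a jump at x = pi (left limit e^pi, right limit e^-pi).
print(fourier_partial_sum(np.exp, 200, 1.0))    # -> e = 2.718..., a point of continuity
print(fourier_partial_sum(np.exp, 200, np.pi))  # -> cosh(pi) = 11.59..., the jump midpoint
print(np.cosh(np.pi))                           # (e^pi + e^-pi) / 2, for comparison
```

At the point of continuity the partial sums approach the function value, while at the jump they approach the midpoint of the one-sided limits rather than either limit.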
There is also a pointwise version of the test:[1] if $f$ is a periodic function in $L^1$ and is of bounded variation in a neighborhood of the point $x$, then the Fourier series at $x$ converges to the limit as above,
$$\lim_{n\to\infty} S_nf(x) = \frac{1}{2}\bigl(f(x^+)+f(x^-)\bigr).$$
For the Fourier transform on the real line, there is a version of the test as well.[2] Suppose that $f(x)$ is in $L^1(-\infty,\infty)$ and of bounded variation in a neighborhood of the point $x$. Then
$$\frac{1}{\pi}\lim_{M\to\infty}\int_0^M d\omega \int_{-\infty}^{\infty} f(t)\cos\bigl(\omega(x-t)\bigr)\,dt = \frac{1}{2}\bigl(f(x^+)+f(x^-)\bigr).$$
If $f$ is continuous in an open interval, then the integral on the left-hand side converges uniformly in the interval, and the limit is $f(x)$.
This version of the test (although not satisfying modern demands for rigor) is historically prior to Dirichlet, being due to Joseph Fourier.
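A rough numerical check of this integral form (again an illustrative sketch added here, not taken from the source; the truncation M, the grid size, and the example pulse are arbitrary choices) can be made for the rectangular pulse f(t) = 1 on [−1, 1] and 0 elsewhere, for which the inner t-integral has a closed form:

```python
import numpy as np

def truncated_fourier_integral(x, M, n_grid=400001):
    """Approximate (1/pi) * int_0^M dw int f(t) cos(w(x - t)) dt for the
    rectangular pulse f(t) = 1 on [-1, 1] and 0 elsewhere; the inner
    t-integral equals (sin(w(x + 1)) - sin(w(x - 1))) / w."""
    w, dw = np.linspace(1e-9, M, n_grid, retstep=True)  # skip the removable point w = 0
    inner = (np.sin(w * (x + 1.0)) - np.sin(w * (x - 1.0))) / w
    weights = np.full(n_grid, dw)
    weights[0] = weights[-1] = dw / 2.0                 # trapezoidal rule in w
    return np.sum(weights * inner) / np.pi

print(truncated_fourier_integral(0.0, 400.0))  # -> 1.0 (point of continuity, f(0) = 1)
print(truncated_fourier_integral(1.0, 400.0))  # -> 0.5 (the jump: midpoint of 1 and 0)
print(truncated_fourier_integral(2.0, 400.0))  # -> 0.0 (point of continuity, f(2) = 0)
```

As M grows, the truncated integral approaches the function value at points of continuity and the midpoint 1/2 at the jump x = 1.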
In signal processing,[3] the test is often retained in the original form due to Dirichlet: a piecewise monotone bounded periodic function $f$ has a convergent Fourier series whose value at each point is the arithmetic mean of the left and right limits of the function.
As in the pointwise case of the Jordan test, the condition of boundedness can be relaxed if the function is assumed to be absolutely integrable (i.e., in $L^1$) in a neighborhood of each point $x$.
" A function that satisfies the Dirichlet conditions is also called piecewise monotone."