In information theory, the Cheung–Marks theorem,[1] named after K. F. Cheung and Robert J. Marks II, specifies conditions[2] under which restoration of a signal by the sampling theorem can become ill-posed. It identifies conditions whereby "reconstruction error with unbounded variance [results] when a bounded variance noise is added to the samples."[3]
In the sampling theorem, the uncertainty of the interpolation, as measured by noise variance, is the same as the uncertainty of the sample data when the noise is i.i.d.[4] In his classic 1948 paper founding information theory, Claude Shannon offered generalizations of the sampling theorem, including expansions in which a bandlimited signal is specified by samples of the signal and its derivatives taken at correspondingly reduced rates.[5]
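The variance-preservation property of standard sinc interpolation can be checked numerically. The sketch below (an illustration, not taken from the source) uses the fact that the reconstruction noise variance at time t equals the sample noise variance multiplied by the sum of the squared shifted sinc functions, and that this sum equals 1 at every t because the shifted sincs form an orthonormal basis:

```python
import numpy as np

# Illustrative sketch: for Shannon (sinc) interpolation with unit sample
# spacing, i.i.d. sample noise of variance sigma^2 produces interpolation
# noise variance sigma^2 * sum_n sinc(t - n)^2 at time t.  This sum is 1
# for every t, so the interpolation is exactly as uncertain as the data.

def sinc_noise_gain(t, n_terms=20000):
    """Partial sum of sinc(t - n)^2 over n = -n_terms .. n_terms."""
    n = np.arange(-n_terms, n_terms + 1)
    # np.sinc(x) is the normalized sinc, sin(pi x) / (pi x)
    return np.sum(np.sinc(t - n) ** 2)

for t in (0.0, 0.3, 0.5, 2.7):
    print(t, sinc_noise_gain(t))  # each value is close to 1
```

The partial sum converges because sinc²(t − n) decays like 1/n², so the truncation error with 20,000 terms is on the order of 10⁻⁵.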
Although exact in the absence of noise, many of the expansions proposed by Shannon become ill-posed: an arbitrarily small amount of noise on the data renders restoration unstable. Such sampling expansions are of little practical use, since ever-present sampling noise, such as quantization noise, rules out stable interpolation.
Shannon's suggestion of simultaneous sampling of the signal and its derivative at half the Nyquist rate results in well-behaved interpolation.[6] The Cheung–Marks theorem shows, counter-intuitively, that interlacing signal and derivative samples makes the restoration problem ill-posed.
The theorem also shows that sensitivity to noise increases with the order of the derivatives used in the sampling.[7]
Generally, the Cheung–Marks theorem shows that the sampling theorem becomes ill-posed when the interpolation function is not square integrable, i.e., when the area (integral) of its squared magnitude over all time is not finite: "While the generalized sampling concept is relatively straightforward, the reconstruction is not always feasible because of potential instabilities."[8]
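The role of square integrability can be illustrated with a toy calculation (my own sketch; the power-law tails below are hypothetical stand-ins, not the actual expansion kernels analyzed by Cheung and Marks). The reconstruction noise variance at a point is the sample noise variance times the sum of the squared interpolation-function values at the sample instants; when the squared tail decays like 1/n² (square integrable, as for sinc) that sum converges, but when it decays like 1/n (not square integrable) it grows without bound:

```python
import numpy as np

# Toy illustration of the square-integrability condition: partial sums of
# the squared interpolation-kernel tail.  A 1/n^2 tail (square-integrable
# kernel) gives a bounded variance; a 1/n tail (non-square-integrable
# kernel, hypothetical here) gives a variance that diverges like log N,
# so bounded sample noise produces unbounded reconstruction variance.

def variance_partial_sums(tail_power, n_max):
    """Cumulative sums of 1/n**tail_power for n = 1 .. n_max."""
    terms = 1.0 / np.arange(1, n_max + 1, dtype=float) ** tail_power
    return np.cumsum(terms)

square_integrable = variance_partial_sums(2.0, 1_000_000)
not_square_integrable = variance_partial_sums(1.0, 1_000_000)

print(square_integrable[-1])      # converges toward pi^2/6 ~ 1.6449
print(not_square_integrable[-1])  # ~14.4 after 10^6 terms, still growing
```

Doubling the number of terms barely changes the first sum but adds roughly ln 2 ≈ 0.69 to the second, which is the signature of the instability the theorem describes.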