Kell factor

The Kell factor, named after RCA engineer Raymond D. Kell,[1] is a parameter used to limit the bandwidth of a sampled image signal so that beat frequency patterns do not appear when the image is shown on a discrete display device; it is usually taken to be 0.7. The number was first measured in 1934 by Raymond D. Kell and his associates as 0.64, but it has undergone several revisions, since it is based on image perception and is therefore subjective, and it is not independent of the type of display.[2] It was later revised to 0.85; it can exceed 0.9 when fixed-pixel scanning (e.g., CCD or CMOS) and fixed-pixel displays (e.g., LCD or plasma) are used, or be as low as 0.7 for electron gun scanning.
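In practice the factor is applied as a simple multiplier on the scan-line count. A minimal sketch of that arithmetic, assuming an illustrative 480 visible scan lines (the line count and both factors below are examples, not figures from the cited sources):

    # Effective vertical resolution = Kell factor x visible scan lines.
    # 480 visible lines is an illustrative count, not from the references.
    def effective_lines(visible_lines, kell_factor=0.7):
        return visible_lines * kell_factor

    print(effective_lines(480))        # 336.0 lines, electron gun chain at 0.7
    print(effective_lines(480, 0.9))   # 432.0 lines, fixed-pixel chain at 0.9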

Seen from a different perspective, the Kell factor defines the effective resolution of a discrete display device, since the full resolution cannot be used without degrading the viewing experience. The actual sampled resolution depends on the spot size and its intensity distribution. In electron gun scanning systems the spot usually has a Gaussian intensity distribution, while for CCDs the distribution is somewhat rectangular and is also affected by the sampling grid and inter-pixel spacing.
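How much the spot profile matters can be sketched by comparing the Fourier transform magnitude of each profile. All numbers here are illustrative assumptions: a Gaussian spot with a standard deviation of 0.4 pixel pitches, and a rectangular aperture one pixel wide.

    import math

    def mtf_gaussian(f, sigma):
        # Fourier transform magnitude of a Gaussian spot profile
        return math.exp(-2 * (math.pi * sigma * f) ** 2)

    def mtf_rect(f, width):
        # Fourier transform magnitude of a rectangular (box) pixel aperture
        x = math.pi * width * f
        return 1.0 if x == 0 else abs(math.sin(x) / x)

    # Spatial frequency f as a fraction of the sampling rate; 0.5 is Nyquist.
    for f in (0.1, 0.2, 0.3, 0.4, 0.5):
        print(f"f = {f:.1f} fs   Gaussian {mtf_gaussian(f, 0.4):.2f}   "
              f"rect {mtf_rect(f, 1.0):.2f}")

At the Nyquist frequency the rectangular aperture in this sketch retains about 0.64 of the modulation while the Gaussian spot retains about 0.45, which is consistent with fixed-pixel systems supporting higher Kell factors than electron gun scanning.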

The Kell factor is sometimes incorrectly said to exist to account for the effects of interlacing. Interlacing itself does not affect the Kell factor, but because interlaced video must be low-pass filtered (i.e., blurred) in the vertical dimension to avoid spatio-temporal aliasing (i.e., flickering effects), the Kell factor of interlaced video is said to be about 70% that of progressive video with the same scan line resolution.
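A minimal numeric sketch of that rule of thumb, assuming an illustrative progressive Kell factor of 0.9 and a 1080-line raster (both values are assumptions for the example, not figures from the sources):

    # Rule of thumb from the text: the interlaced Kell factor is about
    # 70% of the progressive one. The 0.9 and 1080 values are illustrative.
    KELL_PROGRESSIVE = 0.9
    KELL_INTERLACED = 0.7 * KELL_PROGRESSIVE   # about 0.63

    LINES = 1080
    print(LINES * KELL_PROGRESSIVE)   # 972.0 effective lines, progressive
    print(LINES * KELL_INTERLACED)    # 680.4 effective lines, interlaced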

The beat frequency problem

To understand how the distortion comes about, consider an idealized linear process from sampling to display. When a signal is sampled at a rate of at least twice its highest frequency component (the Nyquist rate), it can be fully reconstructed by low-pass filtering, since the first repeat spectrum does not overlap the baseband spectrum. In discrete displays, however, the image signal is not low-pass filtered, because the display takes discrete values as input; the displayed signal therefore contains all the repeat spectra. The proximity of the highest frequency of the baseband signal to the lowest frequency of the first repeat spectrum induces the beat frequency pattern, which on screen can at times resemble a Moiré pattern. The Kell factor is the reduction in signal bandwidth necessary so that no beat frequency is perceived by the viewer.
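The beat can be reproduced numerically. In the sketch below (all parameter values are illustrative), a sinusoidal test pattern just below the Nyquist frequency is sampled and displayed as-is, the way a fixed-pixel display would show it; the pixel values alternate in sign, but their local amplitude swells and fades at the beat frequency fs - 2f:

    import math

    fs = 100.0   # samples (pixels) per unit length
    f = 48.0     # test pattern frequency, just below Nyquist (fs/2 = 50)

    # Sample the pattern; a discrete display shows these values directly,
    # with no reconstruction low-pass filter after them.
    samples = [math.sin(2 * math.pi * f * n / fs) for n in range(100)]

    # The local amplitude varies as |sin(2*pi*(fs/2 - f)*n/fs)|: a visible
    # beat at fs - 2f = 4 cycles per unit length, i.e. a 25-pixel period.
    for n in range(0, 100, 5):
        print(f"pixel {n:3d}   value {samples[n]:+.2f}   "
              f"envelope {abs(samples[n]):.2f}")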

History

Source                             Kell factor
Kell, Bedford & Trainer (1934)     0.64
Mertz & Gray (1934)                0.53
Wheeler & Loughren (1938)          0.71
Wilson (1938)                      0.82
Kell, Bedford & Fredendall (1940)  0.85
Baldwin (1940)                     0.70

References

  1. Kell, R. D.; Bedford, A. V.; Fredendall, G. L. (July 1940). "A Determination of Optimum Number of Lines in a Television System". RCA Review. 5 (1): 8–30.
  2. Kell, R. D.; Bedford, A. V.; Trainer, M. A. (November 1934). "An Experimental Television System". Proceedings of the Institute of Radio Engineers. 22 (11): 1246.