A computer monitor is an output device that displays information in pictorial or textual form. A discrete monitor comprises a visual display, support electronics, power supply, housing, electrical connectors, and external user controls.
The display in modern monitors is typically an LCD with an LED backlight, having by the 2010s replaced CCFL-backlit LCDs. Before the mid-2000s, most monitors used a cathode-ray tube (CRT) as the image output technology.[1] A monitor is typically connected to its host computer via DisplayPort, HDMI, USB-C, DVI, or VGA. Less commonly, monitors use proprietary connectors and signals to connect to a computer.
Originally computer monitors were used for data processing while television sets were used for video. From the 1980s onward, computers (and their monitors) have been used for both data processing and video, while televisions have implemented some computer functionality. In the 2000s, the typical display aspect ratio of both televisions and computer monitors changed from 4:3 to 16:9.[2] [3]
Modern computer monitors are often functionally interchangeable with television sets and vice versa. As most computer monitors do not include integrated speakers, TV tuners, or remote controls, external components such as a DTA box may be needed to use a computer monitor as a TV set.[4] [5]
Early electronic computer front panels were fitted with an array of light bulbs where the state of each particular bulb would indicate the on/off state of a particular register bit inside the computer. This allowed the engineers operating the computer to monitor the internal state of the machine, so this panel of lights came to be known as the 'monitor'. As early monitors were only capable of displaying a very limited amount of information and were very transient, they were rarely considered for program output. Instead, a line printer was the primary output device, while the monitor was limited to keeping track of the program's operation.[6]
Computer monitors were formerly known as visual display units (VDU), particularly in British English.[7] This term mostly fell out of use by the 1990s.
Multiple technologies have been used for computer monitors. Until the 21st century most used cathode-ray tubes, but these have since been largely superseded by LCD monitors.
See main article: Cathode-ray tube. The first computer monitors used cathode-ray tubes (CRTs). Prior to the advent of home computers in the late 1970s, it was common for a video display terminal (VDT) using a CRT to be physically integrated with a keyboard and other components of the workstation in a single large chassis, typically limiting them to emulation of a paper teletypewriter, thus the early epithet of 'glass TTY'. The display was monochromatic and far less sharp and detailed than on a modern monitor, necessitating the use of relatively large text and severely limiting the amount of information that could be displayed at one time. High-resolution CRT displays were developed for specialized military, industrial and scientific applications, but they were far too costly for general use; wider commercial use became possible after the release of the slow but affordable Tektronix 4010 terminal in 1972.
Some of the earliest home computers (such as the TRS-80 and Commodore PET) were limited to monochrome CRT displays, but color display capability was already a feature of a few MOS 6500 series-based machines (such as the Apple II computer and the Atari 2600 console, both introduced in 1977), and color output was a specialty of the more graphically sophisticated Atari 8-bit computers, introduced in 1979. These computers could be connected to the antenna terminals of an ordinary color TV set or used with a purpose-made CRT color monitor for optimum resolution and color quality. Lagging several years behind, in 1981 IBM introduced the Color Graphics Adapter, which could display four colors at a resolution of 320 × 200 pixels, or two colors at 640 × 200 pixels. In 1984 IBM introduced the Enhanced Graphics Adapter, which was capable of producing 16 colors at a resolution of 640 × 350.[8]
By the end of the 1980s, color progressive-scan CRT monitors were widely available and increasingly affordable, and the sharpest prosumer monitors could clearly display high-definition video. Meanwhile, repeated failures of HDTV standardization efforts from the 1970s through the 1980s left consumer SDTVs stagnating increasingly far behind the capabilities of computer CRT monitors well into the 2000s. During the following decade, maximum display resolutions gradually increased and prices continued to fall as CRT technology remained dominant in the PC monitor market into the new millennium, partly because it remained cheaper to produce.[9] CRTs still offer color, grayscale, motion, and latency advantages over today's LCDs, but improvements to the latter have made these advantages much less obvious. The dynamic range of early LCD panels was very poor, and although text and other motionless graphics were sharper than on a CRT, an LCD characteristic known as pixel lag caused moving graphics to appear noticeably smeared and blurry.
See main article: Liquid-crystal display and Thin-film-transistor liquid-crystal display. Multiple technologies have been used to implement liquid-crystal displays (LCD). Throughout the 1990s, the primary use of LCD technology in computer monitors was in laptops, where the lower power consumption, lighter weight, and smaller physical size of LCDs justified the higher price versus a CRT. Commonly, the same laptop would be offered with an assortment of display options at increasing price points: (active or passive) monochrome, passive color, or active-matrix color (TFT). As volume and manufacturing capability improved, the monochrome and passive-color technologies were dropped from most product lines.
TFT-LCD is a variant of LCD which is now the dominant technology used for computer monitors.[10]
The first standalone LCDs appeared in the mid-1990s, selling for high prices. As prices declined they became more popular, and by 1997 they were competing with CRT monitors. Among the first desktop LCD computer monitors were the Eizo FlexScan L66 in the mid-1990s, and the SGI 1600SW, Apple Studio Display, and ViewSonic VP140[11] in 1998. In 2003, LCDs outsold CRTs for the first time, becoming the primary technology used for computer monitors.[9] The physical advantages of LCDs over CRT monitors are that LCDs are lighter, smaller, and consume less power. In terms of performance, LCDs produce little or no flicker (reducing eyestrain),[12] a sharper image at native resolution, and better checkerboard contrast. On the other hand, CRT monitors have superior blacks, viewing angles, and response time, can use arbitrary lower resolutions without aliasing, and their flicker can be reduced with higher refresh rates,[13] though this flicker can also be used to reduce motion blur compared with less flickery displays such as most LCDs.[14] Many specialized fields such as vision science remain dependent on CRTs, the best LCD monitors having achieved only moderate temporal accuracy, and thus being usable only if their poor spatial accuracy is unimportant.[15]
High dynamic range (HDR)[13] has been implemented in high-end LCD monitors to improve grayscale accuracy. Since around the late 2000s, widescreen LCD monitors have become popular, in part because television series, motion pictures, and video games have transitioned to widescreen formats, which squarer monitors are ill-suited to display correctly.
See main article: Organic light-emitting diode.
Organic light-emitting diode (OLED) monitors provide most of the benefits of both LCD and CRT monitors with few of their drawbacks, though much like plasma panels or very early CRTs they suffer from burn-in, and remain very expensive.
The performance of a monitor is measured by the following parameters:
See main article: Display size. On two-dimensional display devices such as computer monitors the display size or viewable image size is the actual amount of screen space that is available to display a picture, video or working space, without obstruction from the bezel or other aspects of the unit's design. The main measurements for display devices are width, height, total area and the diagonal.
The size of a display is usually given by manufacturers diagonally, i.e. as the distance between two opposite screen corners. This method of measurement is inherited from the method used for the first generation of CRT television when picture tubes with circular faces were in common use. Being circular, it was the external diameter of the glass envelope that described their size. Since these circular tubes were used to display rectangular images, the diagonal measurement of the rectangular image was smaller than the diameter of the tube's face (due to the thickness of the glass). This method continued even when cathode-ray tubes were manufactured as rounded rectangles; it had the advantage of being a single number specifying the size and was not confusing when the aspect ratio was universally 4:3.
With the introduction of flat panel technology, the diagonal measurement became the actual diagonal of the visible display. This meant that an eighteen-inch LCD had a larger viewable area than an eighteen-inch cathode-ray tube.
Estimating monitor size by the distance between opposite corners does not take into account the display aspect ratio, so that, for example, a 21-inch 16:9 widescreen display has less area than a 21-inch 4:3 screen. The 4:3 screen has dimensions of about 16.8 in × 12.6 in and an area of about 211 sq in, while the widescreen is about 18.3 in × 10.3 in, or about 188 sq in.
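These dimensions follow directly from the Pythagorean theorem. A minimal Python sketch (the function name and example values are illustrative, not from the source) reproduces the figures quoted above for a 21-inch diagonal:

    import math

    def screen_dimensions(diagonal_in, aspect_w, aspect_h):
        """Return (width, height, area) in inches / square inches for a screen
        with the given diagonal and aspect ratio (e.g. 16:9 -> 16, 9)."""
        # The diagonal of a w:h rectangle scales with sqrt(w^2 + h^2),
        # so each side is the diagonal times its share of that hypotenuse.
        hyp = math.hypot(aspect_w, aspect_h)
        width = diagonal_in * aspect_w / hyp
        height = diagonal_in * aspect_h / hyp
        return width, height, width * height

    print(screen_dimensions(21, 4, 3))    # ~ (16.8, 12.6, 211.7)
    print(screen_dimensions(21, 16, 9))   # ~ (18.3, 10.3, 188.4)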
See main article: Display aspect ratio. Until about 2003, most computer monitors had a 4:3 aspect ratio and some had 5:4. Between 2003 and 2006, monitors with 16:9 and, mostly, 16:10 (8:5) aspect ratios became commonly available, first in laptops and later also in standalone monitors. Reasons for this transition included productive uses beyond the wider field of view in video games and movie viewing, such as the word-processor display of two standard letter pages side by side, as well as CAD displays of large drawings and their application menus at the same time.[17] [18] In 2008, 16:10 became the most commonly sold aspect ratio for LCD monitors, and in the same year 16:10 was the mainstream standard for laptops and notebook computers.[19]
In 2010, the computer industry started to move from 16:10 to 16:9 because 16:9 had been chosen as the standard high-definition television display format and because 16:9 panels were cheaper to manufacture.
In 2011, non-widescreen displays with 4:3 aspect ratios were only being manufactured in small quantities. According to Samsung, this was because "demand for the old 'Square monitors' has decreased rapidly over the last couple of years" and "I predict that by the end of 2011, production on all 4:3 or similar panels will be halted due to a lack of demand."[20]
See main article: Display resolution. The resolution of computer monitors has increased over time, from the low resolutions of late-1970s displays to far higher resolutions by the late 1990s. Since 2009, the most commonly sold resolution for computer monitors is 1920 × 1080, shared with the 1080p of HDTV.[21] Before 2013, mass-market LCD monitors were limited to 2560 × 1600 at 30 inches, excluding niche professional monitors. By 2015, most major display manufacturers had released 3840 × 2160 (4K UHD) displays, and the first 7680 × 4320 (8K) monitors had begun shipping.
See main article: Gamut. Every RGB monitor has its own color gamut, bounded in chromaticity by a color triangle. Some of these triangles are smaller than the sRGB triangle, some are larger. Colors are typically encoded with 8 bits per primary color. The RGB value [255, 0, 0] represents red, but slightly different colors in different color spaces such as Adobe RGB and sRGB, so displaying sRGB-encoded data on a wide-gamut device can give an unrealistic result.[22] The gamut is a property of the monitor; the image color space can be forwarded as Exif metadata in the picture. As long as the monitor gamut is wider than the color-space gamut, correct display is possible if the monitor is calibrated. A picture that uses colors outside the sRGB color space will display on an sRGB-gamut monitor with limitations.[23] Even today, many monitors that can display the sRGB color space are neither factory- nor user-calibrated to display it correctly. Color management is needed both in electronic publishing (via the Internet for display in browsers) and in desktop publishing targeted to print.
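As a rough illustration of why the same encoded triple means different colors in different spaces, the following Python sketch converts sRGB red through CIE XYZ into Adobe RGB coordinates. The matrix values are the commonly published rounded D65 approximations, and the helper names are illustrative rather than part of any particular library:

    import numpy as np

    # Rounded D65 RGB->XYZ matrices for sRGB and Adobe RGB (1998); the usual
    # four-decimal approximations, used here only for illustration.
    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])
    ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                             [0.2974, 0.6274, 0.0753],
                             [0.0270, 0.0707, 0.9911]])

    def srgb_decode(v8):
        """8-bit sRGB value -> linear light (piecewise sRGB transfer function)."""
        c = np.asarray(v8, dtype=float) / 255.0
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def adobe_encode(linear):
        """Linear light -> 8-bit Adobe RGB (simple 563/256 gamma)."""
        return np.round(255.0 * np.clip(linear, 0.0, 1.0) ** (256.0 / 563.0)).astype(int)

    srgb_red = [255, 0, 0]
    xyz = SRGB_TO_XYZ @ srgb_decode(srgb_red)          # the color sRGB red actually is
    adobe_linear = np.linalg.inv(ADOBE_TO_XYZ) @ xyz   # same color in Adobe RGB coordinates
    print(adobe_encode(adobe_linear))                  # ~ [219, 0, 0], not [255, 0, 0]

In other words, the same physical red that sRGB encodes as [255, 0, 0] is encoded as roughly [219, 0, 0] in Adobe RGB, which is why sending sRGB data unconverted to a wide-gamut monitor over-saturates the image.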
Power saving
Most modern monitors will switch to a power-saving mode if no video-input signal is received. This allows modern operating systems to turn off a monitor after a specified period of inactivity. This also extends the monitor's service life. Some monitors will also switch themselves off after a time period on standby.
Most modern laptops provide a method of screen dimming after periods of inactivity or when the battery is in use. This extends battery life and reduces wear.
Indicator light
Most modern monitors have two indicator light colors: when a video-input signal is detected, the light is green; when the monitor is in power-saving mode, the screen is black and the light is orange. Some monitors use other indicator colors, and some blink the light when in power-saving mode.
Integrated accessories
Many monitors have other accessories (or connections for them) integrated. This places standard ports within easy reach and eliminates the need for a separate hub, camera, microphone, or set of speakers. Such monitors carry microprocessors with codec data, Windows interface drivers, and other small pieces of software that help these accessories function properly.
Ultrawide screens
See main article: 21:9 aspect ratio. Ultrawide monitors feature an aspect ratio greater than 2:1 (for instance, 21:9 or 32:9, as opposed to the more common 16:9, which reduces to about 1.78:1). Monitors with an aspect ratio greater than 3:1 are marketed as super ultrawide monitors. These are typically massive curved screens intended to replace a multi-monitor deployment.
Touch screen
See main article: Touchscreen. These monitors use touching of the screen as an input method. Items can be selected or moved with a finger, and finger gestures may be used to convey commands. The screen will need frequent cleaning due to image degradation from fingerprints.
Glossy screen
See main article: Glossy display. Some displays, especially newer flat panel monitors, replace the traditional anti-glare matte finish with a glossy one. This increases color saturation and sharpness but reflections from lights and windows are more visible. Anti-reflective coatings are sometimes applied to help reduce reflections, although this only partly mitigates the problem.
Curved designs
See main article: Curved screen. Curved monitors most often use nominally flat-panel display technology such as LCD or OLED, with a concave rather than convex curve imparted to reduce geometric distortion, especially in extremely large and wide seamless desktop monitors intended for a close viewing range.
3D
See main article: Stereo display.
See also: Active shutter 3D system, Polarized 3D system and Autostereoscopy. Newer monitors are able to display a different image for each eye, often with the help of special glasses and polarizers, giving the perception of depth. An autostereoscopic screen can generate 3D images without headgear.
Anti-glare and anti-reflection screens
Features for medical use or for outdoor placement.
Directional screen
Narrow viewing-angle screens are used in some security-conscious applications.
Integrated professional accessories
Integrated screen calibration tools, screen hoods, signal transmitters, and protective screens.
Tablet screens
See main article: Graphics tablet/screen hybrid. A combination of a monitor with a graphics tablet. Such devices are typically unresponsive to touch without the use of one or more special tools that apply pressure. Newer models, however, are able to detect touch from any pressure and often detect tool tilt and rotation as well.
Touch and tablet sensors are often used on sample and hold displays such as LCDs to substitute for the light pen, which can only work on CRTs.
Integrated display LUT and 3D LUT tables
An option for using the display as a reference monitor; these calibration features can provide advanced color-management control for a near-perfect image. A conceptual sketch of a 1-D LUT appears after this list of features.
Local dimming backlight
An option for professional LCD monitors (inherent to OLED and CRT); a professional feature that is moving toward the mainstream.
Backlight brightness/color uniformity compensation
A near-mainstream professional feature: an advanced hardware driver for the backlight module with local zones of uniformity correction.
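As a rough conceptual sketch of the 1-D LUT calibration mentioned above: each 8-bit input value is remapped through a per-channel lookup table before display. The gamma exponent and image below are made up for demonstration and do not come from any real monitor profile.

    import numpy as np

    # Hypothetical per-channel 1-D LUT: here a simple gamma tweak generated on the
    # fly; a real display LUT would be loaded from a measured calibration profile.
    correction = 2.2 / 2.4                   # illustrative exponent, not a real profile
    lut = np.round(255.0 * (np.arange(256) / 255.0) ** correction).astype(np.uint8)

    image = np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8)  # dummy RGB image
    calibrated = lut[image]                  # NumPy indexing applies the LUT per pixel
    print(image[0, 0], "->", calibrated[0, 0])

A 3D LUT generalizes this by indexing all three channels at once, which allows cross-channel corrections that a per-channel table cannot express.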
Computer monitors are provided with a variety of methods for mounting them depending on the application and environment.
Raw monitors are LCD monitors with only a bare frame and no outer housing,[24] used to install a display in an unusual location such as a car door or trunk. They are usually paired with a power adapter to serve as a versatile monitor for home or commercial use.
A desktop monitor is typically provided with a stand from the manufacturer which lifts the monitor up to a more ergonomic viewing height. The stand may be attached to the monitor using a proprietary method or may use, or be adaptable to, a VESA mount. A VESA standard mount allows the monitor to be used with more after-market stands if the original stand is removed. Stands may be fixed or offer a variety of features such as height adjustment, horizontal swivel, and landscape or portrait screen orientation.
See main article: Flat Display Mounting Interface. The Flat Display Mounting Interface (FDMI), also known as VESA Mounting Interface Standard (MIS) or colloquially as a VESA mount, is a family of standards defined by the Video Electronics Standards Association for mounting flat panel displays to stands or wall mounts.[25] It is implemented on most modern flat-panel monitors and TVs.
For computer monitors, the VESA Mount typically consists of four threaded holes on the rear of the display that will mate with an adapter bracket.
Rack-mount computer monitors are available in two styles and are intended to be mounted into a 19-inch rack.
A panel mount computer monitor is intended for mounting into a flat surface with the front of the display unit protruding just slightly. They may also be mounted to the rear of the panel. A flange is provided around the screen, sides, top and bottom, to allow mounting. This contrasts with a rack mount display where the flanges are only on the sides. The flanges will be provided with holes for thru-bolts or may have studs welded to the rear surface to secure the unit in the hole in the panel. Often a gasket is provided to provide a water-tight seal to the panel and the front of the screen will be sealed to the back of the front panel to prevent water and dirt contamination.
An open frame monitor provides the display and enough supporting structure to hold associated electronics and to minimally support the display. Provision will be made for attaching the unit to some external structure for support and protection. Open frame monitors are intended to be built into some other piece of equipment providing its own case. An arcade video game would be a good example with the display mounted inside the cabinet. There is usually an open frame display inside all end-use displays with the end-use display simply providing an attractive protective enclosure. Some rack mount monitor manufacturers will purchase desktop displays, take them apart, and discard the outer plastic parts, keeping the inner open-frame display for inclusion into their product.
According to a leaked NSA document, the NSA sometimes swaps the monitor cables on targeted computers with a bugged monitor cable to allow the NSA to remotely see what is being displayed on the targeted computer's monitor.[26]
Van Eck phreaking is the process of remotely displaying the contents of a CRT or LCD by detecting its electromagnetic emissions. It is named after Dutch computer researcher Wim van Eck, who in 1985 published the first paper on it, including proof of concept. Phreaking more generally is the process of exploiting telephone networks.[27]