Video Graphics Array should not be confused with Variable gauge.
Video Graphics Array (VGA) is a video display controller and accompanying de facto graphics standard, first introduced with the IBM PS/2 line of computers in 1987,[1] [2] [3] which became ubiquitous in the IBM PC compatible industry within three years.[4] The term can now refer to the computer display standard, the 15-pin D-subminiature VGA connector, or the resolution characteristic of the VGA hardware.[5]
VGA was the last IBM graphics standard to which the majority of IBM PC compatible computer manufacturers conformed, making it the lowest common denominator that virtually all post-1990 PC graphics hardware can be expected to implement.[6]
VGA was adapted into many extended forms by third parties, collectively known as Super VGA,[7] then gave way to custom graphics processing units which, in addition to their proprietary interfaces and capabilities, continue to implement common VGA graphics modes and interfaces to the present day.
The VGA analog interface standard has been extended to support resolutions of up to 2048 × 1536 for general usage, with specialized applications improving it further still.[8]
The color palette random access memory (RAM) and its corresponding digital-to-analog converter (DAC) were integrated into one chip (the RAMDAC) and the cathode-ray tube controller (CRTC) was integrated into a main VGA chip, which eliminated several other chips in previous graphics adapters, so VGA only additionally required external video RAM and timing crystals.[9] [10]
This small part count allowed IBM to include VGA directly on the PS/2 motherboard, in contrast to prior IBM PC models (PC, PC/XT, and PC AT), which required a separate display adapter installed in a slot in order to connect a monitor. The term "array" rather than "adapter" in the name denoted that it was not a complete independent expansion device, but a single component that could be integrated into a system.
Unlike the graphics adapters that preceded it (MDA, CGA, EGA and many third-party options) there was initially no discrete VGA card released by IBM. The first commercial implementation of VGA was a built-in component of the IBM PS/2, in which it was accompanied by 256 KB of video RAM, and a new DE-15 connector replacing the DE-9 used by previous graphics adapters. IBM later released the standalone IBM PS/2 Display Adapter, which utilized the VGA but could be added to machines that did not have it built in.[11] [12]
The VGA supports all graphics modes supported by the MDA, CGA and EGA cards, as well as multiple new modes.
The 16-color and 256-color modes had fully redefinable palettes, with each entry selected from an 18-bit (262,144-color) gamut.[15] [16] [17] [18]
The other modes defaulted to standard EGA or CGA compatible palettes and instructions, but still permitted remapping of the palette with VGA-specific commands.
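The 18-bit gamut amounts to 64 intensity levels per channel (64³ = 262,144 colors). As an illustrative sketch of the arithmetic only (the helper names are assumptions, not IBM's code), converting between modern 8-bit-per-channel RGB and VGA's 6-bit DAC values can be done like this:

```python
# VGA DAC palette entries hold 6 bits per channel (0-63), for a
# 64**3 = 262,144-color gamut.

def rgb8_to_dac(r, g, b):
    """Truncate 8-bit channel values (0-255) to 6-bit DAC values (0-63)."""
    return (r >> 2, g >> 2, b >> 2)

def dac_to_rgb8(r, g, b):
    """Expand 6-bit DAC values to 8 bits, replicating the top bits
    so that full-scale 63 maps back to full-scale 255."""
    return tuple((v << 2) | (v >> 4) for v in (r, g, b))

print(rgb8_to_dac(255, 128, 0))  # (63, 32, 0)
print(dac_to_rgb8(63, 32, 0))    # (255, 130, 0)
```

The bit-replication on expansion is a common convention for widening color channels; simple zero-padding would make white come out slightly dim (252 instead of 255).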
The 640 × 480 resolution (at 256 colors rather than 16) was originally used by IBM in PGC graphics (for which VGA offers no backward compatibility) but did not see wide adoption until VGA was introduced. As the VGA began to be cloned in great quantities by manufacturers who added ever-increasing capabilities, its 640 × 480 16-color mode became the de facto lowest common denominator of graphics cards. By the mid-1990s, a 640 × 480 × 16 graphics mode using the VGA memory and register specifications was expected by operating systems such as Windows 95 and OS/2 Warp 3.0, which provided no support for lower resolutions or bit depths, or for other memory or register layouts, without additional drivers. Well into the 2000s, even after the VESA standard for graphics cards became commonplace, the "VGA" graphics mode remained a compatibility option for PC operating systems.
Nonstandard display modes can be implemented, with horizontal resolutions of:
And heights of:
For example, high resolution modes with square pixels are available at or in 16 colors, or medium-low resolution at with 256 colors. Alternatively, extended resolution is available with "fat" pixels and 256 colors using, e.g. (50 Hz) or (60 Hz), and "thin" pixels, 16 colors and the 70 Hz refresh rate with e.g. mode.
"Narrow" modes such as tend to preserve the same pixel ratio as in e.g. mode unless the monitor is adjusted to stretch the image out to fill the screen, as they are derived simply by masking down the wider mode instead of altering pixel or line timings, but can be useful for reducing memory requirements and pixel addressing calculations for arcade game conversions or console emulators.
The PC version of Pinball Fantasies has the option to use non-standard "high res" modes, allowing it to display a larger portion of the pinball table on screen.[19]
See also: VGA text mode. VGA also implements several text modes:
As with the pixel-based graphics modes, additional text modes are possible by programming the VGA correctly, with an overall maximum of about cells and an active area spanning about cells.
One variant that is sometimes seen is or, using an or font and an effective pixel display, which trades use of the more flickery 60 Hz mode for an additional 5 or 10 lines of text and square character blocks (or, at, square half-blocks).
Unlike the cards that preceded it, which used binary TTL signals to interface with a monitor (and also composite, in the case of the CGA), the VGA introduced a video interface using pure analog RGB signals, with a maximum range of 0.7 volts peak-to-peak. In conjunction with an 18-bit RAMDAC (6 bits per RGB channel), this produced a color gamut of 262,144 colors.
The original VGA specifications follow:
The intended standard value for the horizontal frequency of VGA's 640 × 480 mode is exactly double the value used in the NTSC-M video system, as this made it much easier to offer optional TV-out solutions or external VGA-to-TV converter boxes at the time of VGA's development. It is also at least nominally twice that of CGA, which also supported composite monitors.
All derived VGA timings (i.e. those which use the master 25.175 and 28.322 MHz crystals and, to a lesser extent, the nominal 31.469 kHz line rate) can be varied by software that bypasses the VGA firmware interface and communicates directly with the VGA hardware, as many MS-DOS based games did. However, only the standard modes, or modes that at least use almost exactly the same H-sync and V-sync timings as one of the standard modes, can be expected to work with the original late-1980s and early-1990s VGA monitors. The use of other timings may in fact damage such monitors and thus was usually avoided by software publishers.
Third-party "multisync" CRT monitors were more flexible, and in combination with "super EGA", VGA, and later SVGA graphics cards using extended modes, could display a much wider range of resolutions and refresh rates at arbitrary sync frequencies and pixel clock rates.
For the most common VGA mode (60 Hz, non-interlaced), the horizontal timings can be found in the HP Super VGA Display Installation Guide and in other places.[24] [25]
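The figures in those published tables are internally consistent: with the 25.175 MHz pixel clock and the standard totals of 800 clocks per line and 525 lines per frame, the line and frame rates fall out directly. A small sketch of that arithmetic, treating the values as nominal:

```python
# Nominal 640x480@60 timing arithmetic from the published totals.
PIXEL_CLOCK_HZ = 25_175_000  # 25.175 MHz master crystal
H_TOTAL = 800                # 640 active pixels + 160 of blanking/sync
V_TOTAL = 525                # 480 active lines + 45 of blanking/sync

h_freq_hz = PIXEL_CLOCK_HZ / H_TOTAL  # line rate, ~31.469 kHz
v_freq_hz = h_freq_hz / V_TOTAL       # frame rate, ~59.94 Hz

print(round(h_freq_hz))      # 31469
print(round(v_freq_hz, 2))   # 59.94
```

The ~59.94 Hz result is why "60 Hz" VGA is, strictly speaking, slightly slower than 60 Hz, mirroring NTSC's field rate.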
See also: Extended Display Identification Data. 640 × 400 @ 70 Hz is traditionally the video mode used for booting VGA-compatible x86 personal computers[26] that show a graphical boot screen, while text-mode boot uses 720 × 400 @ 70 Hz.
This convention has been eroded in recent years, however, with POST and BIOS screens moving to higher resolutions, taking advantage of EDID data to match the resolution to a connected monitor.
640 × 480 @ 60 Hz is the default Windows graphics mode (usually with 16 colors), up to Windows 2000. It remains an option in XP and later versions via the boot menu "low resolution video" option and per-application compatibility-mode settings, despite newer versions of Windows defaulting to higher resolutions and generally not allowing anything lower to be set.
The need for such a low-quality, universally compatible fallback has diminished since the turn of the millennium, as VGA-signalling-standard screens or adaptors unable to show anything beyond the original resolutions have become increasingly rare.
320 × 200 at 70 Hz was the most common mode for VGA-era PC games, with pixel-doubling and line-doubling performed in hardware to present a 640 × 400 at 70 Hz signal to the monitor.
The Windows 95/98/Me LOGO.SYS boot-up image was 320 × 400 resolution, displayed with pixel-doubling to present a 640 × 400 at 70 Hz signal to the monitor. The 400-line signal was the same as the standard text mode, which meant that switching back to text mode didn't change the frequency of the video signal, and thus the monitor did not have to resynchronize (which could otherwise have taken several seconds).
See also: VGA connector.
The standard VGA monitor interface is a 15-pin D-subminiature connector in the "E" shell, variously referred to as "DE-15", "HD-15" and erroneously "DB-15(HD)".
Because VGA uses low-voltage analog signals, signal degradation becomes a factor with low-quality or overly long cables. Solutions include shielded cables, cables that include a separate internal coaxial cable for each color signal, and "broken out" cables utilizing a separate coaxial cable with a BNC connector for each color signal.
BNC breakout cables typically use five connectors, one each for Red, Green, Blue, Horizontal Sync, and Vertical Sync, and do not include the other signal lines of the VGA interface. With BNC, the coaxial wires are fully shielded end-to-end and through the interconnect so that virtually no crosstalk and very little external interference can occur. The use of BNC RGB video cables predates VGA in other markets and industries.
See also: List of monochrome and RGB color formats and List of 16-bit computer color palettes.
The VGA color system uses register-based palettes to map colors in various bit depths to its 18-bit output gamut. It is backward compatible with the EGA and CGA adapters, but supports extra bit depth for the palette when in these modes.
For instance, when in EGA 16-color modes, VGA offers 16 palette registers, and in 256-color modes, it offers 256 registers.[27] Each palette register contains a 3 × 6-bit RGB value, selecting a color from the 18-bit gamut of the DAC.
These color registers are initialized to default values IBM expected to be most useful for each mode. For instance, EGA 16-color modes initialize to the default CGA 16-color palette, and the 256-color mode initializes to a palette consisting of 16 CGA colors, 16 grey shades, and then 216 colors chosen by IBM to fit expected use cases.[28] [29] After initialization they can be redefined at any time without altering the contents of video RAM, permitting palette cycling.
In the 256-color modes, the DAC is set to combine four 2-bit color values, one from each plane, into an 8-bit-value representing an index into the 256-color palette. The CPU interface combines the 4 planes in the same way, a feature called "chain-4", so that each pixel appears to the CPU as a packed 8-bit value representing the palette index.[30]
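A consequence of chain-4 addressing is that the two low bits of the CPU's linear offset select the plane, so consecutive pixels cycle through planes 0-3. A minimal sketch of that mapping (the helper name is an assumption for illustration):

```python
def chain4_plane(linear_offset):
    """Plane (0-3) that holds the byte at a chained linear offset:
    the hardware routes each CPU byte by the two low address bits."""
    return linear_offset & 3

# Consecutive pixels land on planes 0, 1, 2, 3 in turn:
print([chain4_plane(i) for i in range(8)])  # [0, 1, 2, 3, 0, 1, 2, 3]
```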
The video memory of the VGA is mapped into the PC's real-mode address space via a window in the range 0xA0000–0xBFFFF (A000:0000 to B000:FFFF in segment:offset notation). Typically, these starting segments are:
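For illustration, the window bounds above follow from the standard x86 real-mode shift-and-add rule (this is general x86 addressing, not anything VGA-specific):

```python
def linear_address(segment, offset):
    """x86 real-mode segment:offset -> 20-bit linear address."""
    return ((segment << 4) + offset) & 0xFFFFF

print(hex(linear_address(0xA000, 0x0000)))  # 0xa0000
print(hex(linear_address(0xB000, 0xFFFF)))  # 0xbffff
```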
A typical VGA card also provides this port-mapped I/O range:
Due to the use of different address mappings for different modes, it is possible to have a monochrome adapter (i.e. MDA or Hercules) and a color adapter such as the VGA, EGA, or CGA installed in the same machine.
At the beginning of the 1980s, this was typically used to display Lotus 1-2-3 spreadsheets in high-resolution text on a monochrome display and associated graphics on a low-resolution CGA display simultaneously. Many programmers also used such a setup with the monochrome card displaying debugging information while a program ran in graphics mode on the other card. Several debuggers, like Borland's Turbo Debugger, D86 and Microsoft's CodeView could work in a dual monitor setup. Either Turbo Debugger or CodeView could be used to debug Windows.
There were also device drivers such as ox.sys, which implemented a serial interface simulation on the monochrome display and, for example, allowed the user to receive crash messages from debugging versions of Windows without using an actual serial terminal.
It is also possible to use the "MODE MONO" command at the command prompt to redirect the output to the monochrome display. When a monochrome adapter was not present, it was possible to use the 0xB000–0xB7FF address space as additional memory for other programs.
A VGA-capable PCI/PCIe graphics card can provide legacy VGA registers in its PCI configuration space, which may be remapped by the BIOS or operating system.[31]
"Unchaining" the 256 KB VGA memory into four separate "planes" makes VGA's 256 KB of RAM available in 256-color modes. There is a trade-off for extra complexity and performance loss in some types of graphics operations, but this is mitigated by other operations becoming faster in certain situations:
Software such as Fractint, Xlib and ColoRIX also supported tweaked 256-color modes on standard adaptors using freely combinable widths of 256, 320, and 360 pixels and heights of 200, 240 and 256 (or 400, 480 and 512) lines, extending still further to 384 or 400 pixel columns and 576 or 600 (or 288, 300) lines. However, 320 × 240 was the best known and most frequently used, as it offered a standard 40-column resolution and a 4:3 aspect ratio with square pixels. The 320 × 240 × 8 resolution was commonly called Mode X, the name used by Michael Abrash when he presented it in Dr. Dobb's Journal.
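In unchained modes the CPU must compute the plane and per-plane byte offset itself. A sketch of that pixel-address calculation for a 320-pixel-wide unchained mode (hypothetical helper; 80 bytes per plane per row is assumed, since each plane stores every fourth pixel):

```python
def unchained_address(x, y, row_bytes=80):
    """(plane, per-plane byte offset) for pixel (x, y) in an
    unchained 320-pixel-wide mode: each plane stores every fourth
    pixel, so a 320-pixel row occupies 80 bytes per plane."""
    return x & 3, y * row_bytes + (x >> 2)

print(unchained_address(0, 0))  # (0, 0)
print(unchained_address(7, 1))  # (3, 81)
```

At 320 × 240 this needs only 80 × 240 = 19,200 bytes per plane, so the frame fits comfortably in the four 64 KB planes with room for multiple video pages, whereas a chained 320 × 240 layout would need 76,800 consecutive bytes, exceeding the 64 KB chained window.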
The highest resolution modes were only used in special, opt-in cases rather than as standard, especially where high line counts were involved. Standard VGA monitors had a fixed line-scan (H-scan) rate ("multisync" monitors being, at the time, expensive rarities), and so the vertical/frame (V-scan) refresh rate had to be reduced in order to accommodate higher line counts, which increased visible flicker and thus eye strain. For example, the highest such mode, being otherwise based on the matching SVGA resolution (with 628 total lines), reduced the refresh rate from 60 Hz to about 50 Hz (and the theoretical maximum resolution achievable with 256 KB at 16 colors would have reduced it to about 48 Hz, barely higher than the rate at which XGA monitors employed a double-frequency interlacing technique to mitigate full-frame flicker).
These modes were also outright incompatible with some monitors, producing display problems such as picture detail disappearing into overscan (especially in the horizontal dimension), vertical roll, poor horizontal sync or even a complete lack of picture depending on the exact mode attempted. Due to these potential issues, most VGA tweaks used in commercial products were limited to more standards-compliant, "monitor-safe" combinations, such as (square pixels, three video pages, 60 Hz), (double resolution, two video pages, 70 Hz), and (highest resolution compatible with both standard VGA monitors and cards, one video page, 60 Hz) in 256 colors, or double the horizontal resolution in 16-color mode.
Several companies produced VGA-compatible graphics boards.[32]
Graphics Solution Plus, Wonder series, Mach series
S3 911, 911A, 924, 801, 805, 805i, 928, 805p, 928p, S3 Vision series, S3 Trio series
MAGIC RGB
Colorplus
PEGA 1, PEGA 1a, PEGA 2a
ET3000, ET4000, ET6000
CL-GD400, CL-GD500 and CL-GD5000 series
TVGA 8000 series, TVGA 9000 series, TGUI9000 series
See main article: Super VGA. Super VGA (SVGA) is a display standard developed in 1988, when NEC Home Electronics announced its creation of the Video Electronics Standards Association (VESA). The development of SVGA was led by NEC, along with other VESA members including ATI Technologies and Western Digital. SVGA enabled graphics display resolutions up to 800 × 600 pixels, 56% more pixels than VGA's maximum resolution of 640 × 480.[33]
See main article: Extended Graphics Array.
Extended Graphics Array (XGA) is an IBM display standard introduced in 1990. Later it became the most common appellation of the 1024 × 768 pixels display resolution.