First-person shooter engine

A first-person shooter engine is a video game engine specialized for simulating 3D environments for use in a first-person shooter video game. First-person refers to the view in which players see the world through the eyes of their characters. Shooter refers to games that revolve primarily around wielding firearms and killing other entities in the game world, either non-player characters or other players.

The development of FPS graphics engines is characterized by a steady advance in technology, punctuated by occasional breakthroughs. Attempts to define distinct generations lead to arbitrary choices about what constitutes a heavily modified version of an 'old engine' and what is a brand-new engine.

The classification is further complicated because game engines blend old and new technologies. Features considered advanced in a new game one year become the expected standard the next year, and games with a combination of both older and newer features are the norm. For example, Trespasser (1998) introduced physics to the FPS genre, which did not become common until around 2002. Red Faction (2001) featured a destructible environment, something still not common in engines years later.

Timeline

1970s and 1980s: Early FPS graphics engines

Games of this early generation already rendered the world from a first-person perspective and required the player to shoot things; however, they were mostly drawn using vector graphics.

There are two possible claimants for the first FPS: Maze War and Spasim.[1] Maze War was developed in 1973 and involved a single player making his way through a maze of corridors rendered using a fixed perspective. Multiplayer capabilities, where players attempted to shoot each other, were added later, and the game was networked in 1974. Spasim was originally developed in 1974 and involved players moving through a wire-frame 3D universe. Spasim could be played by up to 32 players on the PLATO network.[2]

Developed in-house by Incentive Software, the Freescape engine is considered to be one of the first proprietary 3D engines to be used for computer games, although the engine was not used commercially outside of Incentive's own titles. The first game to use this engine was the puzzle game Driller in 1987.[3]

Early 1990s: Wireframes to 2.5D worlds and textures

Games of this generation often had "3D" in their names but were not capable of full 3D rendering. Instead, they used ray casting 2.5D techniques to create a seemingly 3D environment from a 2D map, and flat sprites to draw enemies instead of 3D models. These games also began to use textures for environmental geometry instead of simple wire-frame models or solid colors.
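
A minimal sketch of the core ray-casting loop such engines used, written here in C++ with hypothetical map and screen dimensions: for every screen column a ray is stepped through a 2D grid (a DDA walk) until it hits a wall cell, and the on-screen height of that wall slice is derived from the hit distance. Real engines of the period added texture lookups and sprite drawing on top of this loop.

```cpp
#include <cmath>

// Hypothetical 2D map for illustration: 1 = wall, 0 = empty floor.
const int MAP_W = 8, MAP_H = 8;
const int worldMap[MAP_H][MAP_W] = {
    {1,1,1,1,1,1,1,1},
    {1,0,0,0,0,0,0,1},
    {1,0,0,1,0,0,0,1},
    {1,0,0,1,0,0,0,1},
    {1,0,0,0,0,1,0,1},
    {1,0,0,0,0,0,0,1},
    {1,0,0,0,0,0,0,1},
    {1,1,1,1,1,1,1,1},
};

const int SCREEN_W = 320, SCREEN_H = 200;   // assumed screen size

// Cast one ray per screen column from position (posX, posY), facing (dirX, dirY),
// with camera plane (planeX, planeY); write the height of each wall slice.
void castColumns(double posX, double posY, double dirX, double dirY,
                 double planeX, double planeY, int sliceHeight[SCREEN_W]) {
    for (int x = 0; x < SCREEN_W; ++x) {
        double cameraX = 2.0 * x / SCREEN_W - 1.0;       // -1 .. +1 across the screen
        double rayX = dirX + planeX * cameraX;
        double rayY = dirY + planeY * cameraX;

        int mapX = (int)posX, mapY = (int)posY;
        double deltaX = std::fabs(1.0 / rayX), deltaY = std::fabs(1.0 / rayY);
        int stepX = rayX < 0 ? -1 : 1, stepY = rayY < 0 ? -1 : 1;
        double sideX = (rayX < 0 ? posX - mapX : mapX + 1.0 - posX) * deltaX;
        double sideY = (rayY < 0 ? posY - mapY : mapY + 1.0 - posY) * deltaY;

        // DDA walk: advance one grid cell at a time until a wall cell is hit.
        int side = 0;
        while (worldMap[mapY][mapX] == 0) {
            if (sideX < sideY) { sideX += deltaX; mapX += stepX; side = 0; }
            else               { sideY += deltaY; mapY += stepY; side = 1; }
        }
        // Perpendicular distance avoids the classic fish-eye distortion.
        double dist = (side == 0) ? sideX - deltaX : sideY - deltaY;
        sliceHeight[x] = (int)(SCREEN_H / dist);         // nearer walls draw taller
    }
}
```

Casting one ray per screen column, rather than per pixel, is why such renderers are described as 2.5D: the third dimension is reconstructed only as a column height.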

Hovertank 3D, from id Software, was the first game to use this technique in 1990, but it did not yet use textures, a capability added shortly afterwards in Catacomb 3D (1991) and then in the Wolfenstein 3D engine, which was later used for several other games. Catacomb 3D was also the first game to show the player character's hand on-screen, furthering the player's immersion in the character's role.

The Wolfenstein 3D engine was still very primitive: it did not apply textures to the floor or ceiling, its ray casting restricted walls to a fixed height, and levels were all built on a single plane.

id Tech 1, again from id Software and first used in Doom (1993), removed these limitations even though it still did not use true 3D. It also introduced the concept of binary space partitioning (BSP). Another breakthrough was the introduction of multiplayer capabilities in the engine. However, because the engine was still 2.5D, it was impossible to look up and down properly in Doom, and all Doom levels were actually two-dimensional.[4] Due to the lack of a z-axis, the engine did not allow for room-over-room support.
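
Binary space partitioning pre-splits a level along its wall lines so that, at run time, surfaces can be visited in strict front-to-back (or back-to-front) order from any viewpoint without sorting. The sketch below illustrates that traversal with a hypothetical 2D node structure; it is not Doom's actual code.

```cpp
#include <functional>
#include <vector>

// Hypothetical 2D BSP node: each node splits space with the line a*x + b*y + c = 0.
struct BspNode {
    double a, b, c;                 // splitting line (a plane in a full 3D engine)
    std::vector<int> surfaceIds;    // walls lying on the splitter
    BspNode* front = nullptr;       // subspace where a*x + b*y + c > 0
    BspNode* back  = nullptr;       // subspace where a*x + b*y + c < 0
};

// Visit surfaces strictly front-to-back relative to the viewer at (vx, vy):
// recurse first into the child containing the viewer, then draw the surfaces on
// the splitter, then recurse into the far child.
void walkFrontToBack(const BspNode* node, double vx, double vy,
                     const std::function<void(int)>& drawSurface) {
    if (!node) return;
    bool viewerInFront = node->a * vx + node->b * vy + node->c > 0.0;
    walkFrontToBack(viewerInFront ? node->front : node->back, vx, vy, drawSurface);
    for (int id : node->surfaceIds) drawSurface(id);
    walkFrontToBack(viewerInFront ? node->back : node->front, vx, vy, drawSurface);
}
```

Because the subspace containing the viewer is always visited first, a Doom-style renderer can stop as soon as every screen column has been filled.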

Doom's success spawned several games using the same engine or similar techniques, giving rise to the term Doom clones. The Build engine, used in Duke Nukem 3D (1996), later removed some of the limitations of id Tech 1, such as adding support for room-over-room layouts by stacking sectors on top of one another, but the underlying techniques remained the same.

Mid 1990s: 3D models, beginnings of hardware acceleration

In the mid-1990s, game engines began to recreate true 3D worlds with arbitrary level geometry. Instead of sprites, the engines used simply textured polygonal objects (single-pass texturing, no lighting detail).

FromSoftware released King's Field, a full polygon free roaming first-person real-time action title for the Sony PlayStation in December 1994. Sega's 32X release Metal Head was a first-person shooter mecha simulation game that used fully texture-mapped, 3D polygonal graphics. A year prior, Exact released the Sharp X68000 computer game Geograph Seal, a fully 3D polygonal first-person shooter that employed platform game mechanics and had most of the action take place in free-roaming outdoor environments rather than the corridor labyrinths of Wolfenstein 3D. The following year, Exact released its successor for the PlayStation console, Jumping Flash!, which used the same game engine but adapted it to place more emphasis on the platforming rather than the shooting. The Jumping Flash! series continued to use the same engine.[5] [6]

Dark Forces, released in 1995 by LucasArts, has been regarded as one of the first "true 3-D" first-person shooter games. Its engine, the Jedi Engine, was one of the first to support an environment in three dimensions: areas can exist next to each other in all three planes, including on top of each other (such as stories in a building). Though most of the objects in Dark Forces are sprites, the game does include support for textured 3D-rendered objects. Another game regarded as one of the first true 3D first-person shooters is Parallax Software's 1994 title Descent.

The Quake engine (Quake, 1996) used fewer animated sprites and relied on true 3D geometry and lighting, using elaborate techniques such as z-buffering to speed up rendering. Levels in Quake and some subsequent engines are made of geometry objects called brushes, which allow for map construction in three dimensions, rather than 2D maps projected into 3D as Doom had done. Quake was also the first true-3D game to use a special map design system to preprocess and pre-render the 3D environment: the 3D environment in which the game took place (referred to as a map) was simplified during its creation to reduce the processing required when playing the game.
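
A depth (z-) buffer resolves visibility per pixel: a candidate pixel is written only if it is nearer to the camera than what the buffer already holds. The sketch below is a minimal illustration with an assumed software framebuffer; in Quake's software renderer the z-buffer was reportedly used chiefly for models and entities, with span-based rendering handling the world.

```cpp
#include <cstdint>
#include <limits>
#include <vector>

// A minimal depth-buffered framebuffer for illustration only.
struct Framebuffer {
    int width, height;
    std::vector<uint32_t> color;
    std::vector<float> depth;

    Framebuffer(int w, int h)
        : width(w), height(h), color(w * h, 0),
          depth(w * h, std::numeric_limits<float>::infinity()) {}

    // Write the pixel only if it is closer to the camera than what is stored.
    void plot(int x, int y, float z, uint32_t rgba) {
        int i = y * width + x;
        if (z < depth[i]) {
            depth[i] = z;
            color[i] = rgba;
        }
    }
};
```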

Static lightmaps and 3D light sources were also "baked" at map-compile time and added to the BSP files storing the levels. These features allowed for more realistic lighting than had previously been possible.
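
A sketch of the idea behind baked lightmaps, under simplified assumptions (direct lighting only, no shadow rays, hypothetical data types): at map-compile time each surface is covered by a coarse grid of texels, every texel accumulates the contribution of the static lights placed by the mapper, and the result is stored next to the geometry so the renderer only has to multiply it with the wall texture.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

struct PointLight { Vec3 pos; float intensity; };   // static light placed by the mapper

// Bake one surface's lightmap: 'texelPos' gives the world-space position of each
// lightmap texel, 'normal' is the surface normal. Real map compilers also trace
// rays for shadows; this sketch only does the N.L term and distance falloff.
std::vector<float> bakeLightmap(const std::vector<Vec3>& texelPos, Vec3 normal,
                                const std::vector<PointLight>& lights) {
    std::vector<float> lightmap(texelPos.size(), 0.0f);
    for (size_t i = 0; i < texelPos.size(); ++i) {
        for (const PointLight& L : lights) {
            Vec3 d = { L.pos.x - texelPos[i].x,
                       L.pos.y - texelPos[i].y,
                       L.pos.z - texelPos[i].z };
            float dist = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
            if (dist < 1e-4f) continue;
            float ndotl = (normal.x*d.x + normal.y*d.y + normal.z*d.z) / dist;
            if (ndotl <= 0.0f) continue;                        // light is behind the surface
            lightmap[i] += L.intensity * ndotl / (dist * dist); // inverse-square falloff
        }
        lightmap[i] = std::min(lightmap[i], 1.0f);              // clamp for 8-bit storage
    }
    return lightmap;   // stored alongside the surface it belongs to
}
```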

The first graphics processing units (GPUs)[7] appeared in the late 1990s, but many games still supported software rendering at that time. id Tech 2 (Quake II, 1997) was one of the first engines to take advantage of hardware-accelerated graphics[8] (id Software later reworked Quake to add OpenGL support to the game).

GoldSrc, the engine Valve derived from the Quake engine for Half-Life (1998), added Direct3D support and a skeletal animation framework to better render NPCs,[9][10] and also greatly improved the NPCs' artificial intelligence (AI) compared to the Quake engine.[9]
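
The sketch below illustrates skeletal animation in general terms, with assumed data structures rather than GoldSrc's actual formats: each vertex records which bones it follows, so posing a small set of bone matrices animates the whole mesh instead of storing a separate vertex snapshot per animation frame as the Quake engine's models did.

```cpp
#include <array>
#include <vector>

// Minimal 3x4 bone transform (rotation + translation) and a skinned vertex.
struct Mat3x4 { float m[3][4]; };
struct Vec3   { float x, y, z; };

struct SkinnedVertex {
    Vec3 restPos;                       // position in the model's bind pose
    std::array<int, 2> bone;            // indices of the bones it follows
    std::array<float, 2> weight;        // blend weights, summing to 1
};

static Vec3 transform(const Mat3x4& b, const Vec3& p) {
    return { b.m[0][0]*p.x + b.m[0][1]*p.y + b.m[0][2]*p.z + b.m[0][3],
             b.m[1][0]*p.x + b.m[1][1]*p.y + b.m[1][2]*p.z + b.m[1][3],
             b.m[2][0]*p.x + b.m[2][1]*p.y + b.m[2][2]*p.z + b.m[2][3] };
}

// Pose the mesh for the current frame: every vertex is a blend of its bones'
// transforms, so one small set of bone matrices animates the whole model.
std::vector<Vec3> skin(const std::vector<SkinnedVertex>& verts,
                       const std::vector<Mat3x4>& bonePose) {
    std::vector<Vec3> out(verts.size());
    for (size_t i = 0; i < verts.size(); ++i) {
        Vec3 acc = {0, 0, 0};
        for (int k = 0; k < 2; ++k) {
            Vec3 p = transform(bonePose[verts[i].bone[k]], verts[i].restPos);
            acc.x += verts[i].weight[k] * p.x;
            acc.y += verts[i].weight[k] * p.y;
            acc.z += verts[i].weight[k] * p.z;
        }
        out[i] = acc;
    }
    return out;
}
```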

Late 1990s: Full 32-bit color, and GPUs become standard

This period saw the introduction of the first video cards with hardware transform, clipping, and lighting (T&L). The first card with this innovative technology was the GeForce 256. It was superior to what 3dfx had to offer at the time, namely the Voodoo3, which fell short chiefly because of its lack of T&L. Companies such as Matrox, with its G400, and S3, with its Savage4, were forced to withdraw from the 3D gaming market during this period. One year later, ATI released the Radeon 7200, a truly competitive graphics card line.

While all games of this period supported 16-bit color, many were adopting 32-bit color (really 24-bit color with an 8-bit alpha channel) as well, and benchmark sites soon began touting 32-bit as a standard. The Unreal Engine, used in a large number of FPS games since its release, was an important milestone at the time.[11] It used the Glide API, specifically developed for 3dfx GPUs,[10] instead of OpenGL. Probably the biggest reason for its popularity was that the engine architecture and the inclusion of a scripting language made it easy to mod.[12][13] Another improvement of Unreal over the previous generation of engines was its networking technology, which greatly improved the scalability of the engine in multiplayer.[14]

id Tech 3, first used for Quake III Arena, improved on its predecessor by allowing much more complex and smoother animations to be stored. It also featured improved lighting and shadowing and introduced shaders and curved surfaces.[15]
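
id Tech 3's curved surfaces are built from biquadratic Bézier patches (3x3 grids of control points) that the engine tessellates into triangles. The sketch below shows the evaluation and a straightforward tessellation, with a hypothetical vertex type; it is an illustration of the technique, not the engine's code.

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

// Evaluate a biquadratic Bezier patch (3x3 control points) at (u, v) in [0,1]x[0,1].
Vec3 evalPatch(const Vec3 ctrl[3][3], float u, float v) {
    // Quadratic Bernstein basis weights for u and v.
    float bu[3] = { (1-u)*(1-u), 2*u*(1-u), u*u };
    float bv[3] = { (1-v)*(1-v), 2*v*(1-v), v*v };
    Vec3 p = {0, 0, 0};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) {
            float w = bu[i] * bv[j];
            p.x += w * ctrl[i][j].x;
            p.y += w * ctrl[i][j].y;
            p.z += w * ctrl[i][j].z;
        }
    return p;
}

// Tessellate the patch into an (n+1)x(n+1) grid of vertices (n >= 1); the engine
// picks the subdivision level from a detail setting and strips the grid into triangles.
std::vector<Vec3> tessellate(const Vec3 ctrl[3][3], int n) {
    std::vector<Vec3> grid;
    for (int i = 0; i <= n; ++i)
        for (int j = 0; j <= n; ++j)
            grid.push_back(evalPatch(ctrl, (float)i / n, (float)j / n));
    return grid;
}
```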

Early 2000s: Increasing detail, outdoor environments, and rag-doll physics

New graphics hardware provided additional capabilities, allowing engines to add effects such as particle systems and fog, as well as increased texture and polygon detail. Many games featured large outdoor environments, vehicles, and rag-doll physics.
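
Rag-doll physics of this era could be implemented with surprisingly simple numerics. The sketch below is a generic illustration, not any particular engine's code: joints are integrated with Verlet integration and then repeatedly pulled back to fixed "bone" lengths, which is enough to make a character collapse plausibly.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

struct Particle { Vec3 pos, prevPos; };                 // one joint of the rag doll
struct Constraint { int a, b; float restLength; };      // a rigid "bone" between joints

// One simulation step: Verlet integration under gravity, then a few relaxation
// passes that push joint pairs back to their bone lengths.
void stepRagdoll(std::vector<Particle>& joints,
                 const std::vector<Constraint>& bones,
                 float dt, int relaxPasses = 4) {
    const Vec3 g = {0.0f, -9.81f, 0.0f};
    for (Particle& p : joints) {
        Vec3 cur = p.pos;
        p.pos.x += (cur.x - p.prevPos.x) + g.x * dt * dt;
        p.pos.y += (cur.y - p.prevPos.y) + g.y * dt * dt;
        p.pos.z += (cur.z - p.prevPos.z) + g.z * dt * dt;
        p.prevPos = cur;
    }
    for (int pass = 0; pass < relaxPasses; ++pass) {
        for (const Constraint& c : bones) {
            Vec3& pa = joints[c.a].pos;
            Vec3& pb = joints[c.b].pos;
            Vec3 d = { pb.x - pa.x, pb.y - pa.y, pb.z - pa.z };
            float len = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
            if (len < 1e-6f) continue;
            float correction = 0.5f * (len - c.restLength) / len;
            pa.x += d.x * correction;  pb.x -= d.x * correction;
            pa.y += d.y * correction;  pb.y -= d.y * correction;
            pa.z += d.z * correction;  pb.z -= d.z * correction;
        }
    }
}
```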

Average video hardware requirements: a GPU with hardware T&L, such as the DirectX 7.0 GeForce 2 or Radeon 7200, was typically required. The next-generation GeForce 3 or Radeon 8500 were recommended due to their more efficient architecture, though their DirectX 8.0 vertex and pixel shaders were of little use. A handful of games still supported DirectX 6.0 chipsets such as the RIVA TNT2 and Rage 128, as well as software rendering (for example on systems with an integrated Intel GMA), though it was apparent that even a powerful CPU could not compensate for the lack of hardware T&L.

Game engines originally developed for the PC platform, such as Unreal Engine 2, started to be adapted for sixth-generation consoles like the PlayStation 2 and GameCube, which now had the computing power to handle graphics-intensive video games.

Mid 2000s: Lighting and pixel shaders, physics

The new generation of graphics chips allowed pixel shader-based textures, bump mapping, and lighting and shadowing technologies to become common. Shader technologies included HLSL (for DirectX), GLSL (for OpenGL), and Cg.
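
To illustrate what such a pixel shader typically evaluated, the function below reproduces per-pixel bump-mapped (Blinn-Phong) lighting in plain C++ rather than HLSL/GLSL/Cg, with hypothetical parameter names: the per-pixel normal comes from a normal or bump map instead of being interpolated from the vertices.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x/len, v.y/len, v.z/len };
}

// A per-pixel lighting computation of the kind a Shader Model 1/2 pixel shader ran:
// a normal fetched from a bump/normal map drives a diffuse and a specular
// (Blinn-Phong) term for one light.
Vec3 shadePixel(Vec3 albedo, Vec3 normalFromMap, Vec3 toLight, Vec3 toEye,
                Vec3 lightColor, float shininess) {
    Vec3 n = normalize(normalFromMap);
    Vec3 l = normalize(toLight);
    Vec3 v = normalize(toEye);
    Vec3 h = normalize({ l.x + v.x, l.y + v.y, l.z + v.z });   // half vector

    float diff = std::max(dot(n, l), 0.0f);
    float spec = std::pow(std::max(dot(n, h), 0.0f), shininess);

    return { albedo.x * lightColor.x * diff + lightColor.x * spec,
             albedo.y * lightColor.y * diff + lightColor.y * spec,
             albedo.z * lightColor.z * diff + lightColor.z * spec };
}
```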

This resulted in the obsolescence of DirectX 7.0 graphics chips such as the widespread GeForce 2 and Radeon 7200, of DirectX 6.0 chipsets such as the RIVA TNT2 and Rage 128, and of integrated on-board graphics accelerators. Until this generation of games, a powerful CPU was able to somewhat compensate for an older video card. Average video hardware requirements: the minimum was a GeForce 3 or Radeon 8500; strongly recommended was a GeForce FX or Radeon 9700 (or another card with Pixel Shader 2.x support). The Radeon 9700 demonstrated that anti-aliasing (AA) and anisotropic filtering (AF) could be fully usable options, even in the newest and most demanding titles of the time, and it led to the widespread acceptance of AA and AF as standard features. Both had been supported by many earlier graphics chips but carried a heavy performance hit, so most gamers opted not to enable them.

With these new technologies, game engines featured seamlessly integrated indoor/outdoor environments, used shaders for more realistic animation (characters, water, weather effects, etc.), and generally increased realism. The fact that the GPU now performed some tasks previously handled by the CPU, and more generally the increase in available processing power, allowed developers to add realistic physics effects to games, for example through the inclusion of the Havok physics engine in many titles.[16] Physics had already been added to a video game in 1998 with Trespasser, but the limited hardware capabilities of the time, and the absence of middleware like Havok to handle the physics, made it a technical and commercial failure.[17]

id Tech 4, first used for Doom 3 (2004), used entirely dynamic per-pixel lighting, whereas previous 3D engines had relied primarily on pre-calculated per-vertex lighting or lightmaps and Gouraud shading. The shadow volume approach used in Doom 3 permitted more realistic lighting and shadows,[18] but this came at a price: it could not render soft shadows, and the engine was primarily suited to indoor environments. This was later rectified to handle vast outdoor spaces with the introduction of MegaTexture technology in the id Tech 4 engine.
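
The heart of the shadow volume technique is finding the mesh's silhouette as seen from the light and extruding it away from the light; the resulting volume is then rendered into the stencil buffer to mask shadowed pixels. The sketch below is a simplified illustration with hypothetical data structures and a finite extrusion distance, not id Tech 4's implementation.

```cpp
#include <array>
#include <vector>

struct Vec3 { float x, y, z; };
struct Edge { int v0, v1; int triA, triB; };    // an edge shared by exactly two triangles

static Vec3 sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Does the triangle face the point light at lightPos?
static bool facesLight(const std::vector<Vec3>& verts, const std::array<int,3>& tri,
                       Vec3 lightPos) {
    Vec3 n = cross(sub(verts[tri[1]], verts[tri[0]]), sub(verts[tri[2]], verts[tri[0]]));
    return dot(n, sub(lightPos, verts[tri[0]])) > 0.0f;
}

// Build the sides of a shadow volume: every silhouette edge (shared by one lit and
// one unlit triangle) is extruded away from the light, producing quads that are
// later rendered into the stencil buffer to mark shadowed pixels.
std::vector<Vec3> extrudeSilhouette(const std::vector<Vec3>& verts,
                                    const std::vector<std::array<int,3>>& tris,
                                    const std::vector<Edge>& edges,
                                    Vec3 lightPos, float extrudeDist) {
    std::vector<Vec3> quads;   // 4 vertices per extruded silhouette edge
    for (const Edge& e : edges) {
        bool litA = facesLight(verts, tris[e.triA], lightPos);
        bool litB = facesLight(verts, tris[e.triB], lightPos);
        if (litA == litB) continue;                // not on the silhouette
        Vec3 a = verts[e.v0], b = verts[e.v1];
        Vec3 da = sub(a, lightPos), db = sub(b, lightPos);
        Vec3 aFar = { a.x + da.x * extrudeDist, a.y + da.y * extrudeDist, a.z + da.z * extrudeDist };
        Vec3 bFar = { b.x + db.x * extrudeDist, b.y + db.y * extrudeDist, b.z + db.z * extrudeDist };
        quads.insert(quads.end(), { a, b, bFar, aFar });
    }
    return quads;
}
```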

The same year, Valve released Half-Life 2, powered by their new Source engine. This new engine was notable in that, among other things, it had very realistic facial animations for NPCs, including what was described as an impressive lip-syncing technology.[19]

Late 2000s: The approach to photorealism

Further improvements in GPU capabilities, such as Shader Model 3 and Shader Model 4, made possible by new graphics chipsets such as the GeForce 7 and Radeon X1000 series, allowed for improvements in graphics effects.

Developers of this era of 3D engines often touted the increasingly photorealistic quality of their engines. Around the same time, esports were beginning to gain attention. These engines include realistic shader-based materials with predefined physics, environments with procedural and vertex shader-based objects (vegetation, debris, human-made objects such as books or tools), procedural animation, cinematographic effects (depth of field, motion blur, etc.), high-dynamic-range rendering, and unified lighting models with soft shadowing and volumetric lighting.
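
High-dynamic-range rendering computes lighting in a range far wider than a display can show and compresses it afterwards. As a small illustration of that final step, the sketch below applies one common operator (Reinhard tone mapping with an exposure factor); engines of the period each used their own curves.

```cpp
#include <cmath>
#include <vector>

struct Color { float r, g, b; };    // linear HDR values, may exceed 1.0

// Map an HDR frame into the displayable [0,1] range: scale by exposure, apply the
// Reinhard operator x / (1 + x), then gamma-encode for the monitor.
void toneMap(std::vector<Color>& frame, float exposure, float gamma = 2.2f) {
    for (Color& c : frame) {
        float* ch[3] = { &c.r, &c.g, &c.b };
        for (float* x : ch) {
            float v = *x * exposure;
            v = v / (1.0f + v);                    // Reinhard: compresses highlights
            *x = std::pow(v, 1.0f / gamma);        // gamma encoding
        }
    }
}
```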

However, most engines capable of these effects evolved from engines of the previous generation, such as Unreal Engine 3, the Dunia Engine, CryEngine 2, and id Tech 5 (which was used for Rage and makes use of the new virtual texturing technology[20]).
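
Virtual texturing (MegaTexture) treats one enormous texture as a grid of small pages, only a working set of which is resident in video memory, with an indirection structure mapping virtual page coordinates to the resident cache. The sketch below is a much-simplified illustration with hypothetical structure names, not id Tech 5's implementation.

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// A virtual texture is addressed in pages (e.g. 128x128 texels each). Only pages
// recently needed by the camera are resident in a physical cache texture.
struct PageId {
    uint32_t x, y, mip;
    bool operator==(const PageId& o) const { return x == o.x && y == o.y && mip == o.mip; }
};
struct PageIdHash {
    size_t operator()(const PageId& p) const {
        return (size_t)p.x * 73856093u ^ (size_t)p.y * 19349663u ^ (size_t)p.mip * 83492791u;
    }
};
struct CacheSlot { uint32_t slotX, slotY; };   // where the page lives in the physical cache

struct VirtualTexture {
    uint32_t pageSize = 128;
    std::unordered_map<PageId, CacheSlot, PageIdHash> residentPages;

    // Resolve a texel request: if the page is resident, return its cache slot;
    // otherwise record a miss so the streaming system loads it from disk.
    bool resolve(uint32_t texelX, uint32_t texelY, uint32_t mip,
                 CacheSlot& out, std::vector<PageId>& missedPages) const {
        PageId id { (texelX >> mip) / pageSize, (texelY >> mip) / pageSize, mip };
        auto it = residentPages.find(id);
        if (it != residentPages.end()) { out = it->second; return true; }
        missedPages.push_back(id);     // streamed in over the next frames
        return false;
    }
};
```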

The first games using Unreal Engine 3 were released in November 2006, and the first game to use CryEngine 2 (Crysis) was released in 2007.

Early 2010s: Graphic technique mixes

Further improvements in GPU capabilities, such as Shader Model 5, made possible by new graphics chipsets such as the GeForce 400 series and Radeon HD 5000 series and later, allowed for improvements in graphics effects such as dynamic displacement mapping and tessellation.
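
Tessellation with displacement mapping subdivides coarse triangles on the GPU and pushes the newly created vertices along the surface normal by a height sampled from a texture. The sketch below performs the equivalent computation on the CPU for illustration, with a user-supplied height-map callback standing in for the texture fetch.

```cpp
#include <cmath>
#include <functional>
#include <vector>

struct Vec3 { float x, y, z; };

// Subdivide a triangle into a grid of barycentric sample points and displace each
// point along the interpolated normal by a height read from a height map
// (here a callback taking UV coordinates). 'levels' must be at least 1.
std::vector<Vec3> tessellateAndDisplace(
        const Vec3 p[3], const Vec3 n[3], const float uv[3][2], int levels,
        const std::function<float(float, float)>& sampleHeight, float scale) {
    std::vector<Vec3> out;
    for (int i = 0; i <= levels; ++i) {
        for (int j = 0; j <= levels - i; ++j) {
            // Barycentric weights of this new vertex inside the coarse triangle.
            float a = (float)i / levels, b = (float)j / levels, c = 1.0f - a - b;
            Vec3 pos = { a*p[0].x + b*p[1].x + c*p[2].x,
                         a*p[0].y + b*p[1].y + c*p[2].y,
                         a*p[0].z + b*p[1].z + c*p[2].z };
            Vec3 nrm = { a*n[0].x + b*n[1].x + c*n[2].x,
                         a*n[0].y + b*n[1].y + c*n[2].y,
                         a*n[0].z + b*n[1].z + c*n[2].z };
            float len = std::sqrt(nrm.x*nrm.x + nrm.y*nrm.y + nrm.z*nrm.z);
            float u = a*uv[0][0] + b*uv[1][0] + c*uv[2][0];
            float v = a*uv[0][1] + b*uv[1][1] + c*uv[2][1];
            float h = sampleHeight(u, v) * scale;             // displacement amount
            out.push_back({ pos.x + nrm.x/len*h, pos.y + nrm.y/len*h, pos.z + nrm.z/len*h });
        }
    }
    return out;   // a real implementation would also emit triangle indices
}
```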

As of 2010, two evolutions of major existing engines were upcoming: a DirectX 11 version of Unreal Engine 3, which powered the Samaritan demo[21] and several DirectX 11-based UE3 games, and CryEngine 3, which powers Crysis 2 and 3.

Few companies had discussed future plans for their engines; id Tech 6, the eventual successor to id Tech 5, was an exception. Preliminary information about the engine, which was still in the early phases of development, suggested that id Software was looking toward a direction in which ray tracing and classic raster graphics would be mixed.[22] However, according to John Carmack, hardware capable of running id Tech 6 did not yet exist.[23] The first title using the engine, Doom, was released in mid-2016.

In September 2015, Valve released Source 2 in an update to Dota 2.[24]

Notes and References

  1. Dharamjit Rihal. "The History of First-Person Shooters". Retrieved 2009-07-04.
  2. "The history of the FPS. A pictorial". 2007-04-11. Retrieved 2009-07-04.
  3. "Exploring the Freescape". IGN, 2008-10-22. Retrieved 2009-07-04.
  4. Paul Lily. "Doom to Dunia: A Visual History of 3D Game Engines". Maximum PC (via PC Gamer), 2009-07-21. Retrieved 2009-07-05.
  5. "Geograph Seal (X68000)". http://www.the-nextlevel.com/review/retro/geograph-seal-x68000/
  6. Travis Fahs. "Jumping Flashback". ign.com, 4 November 2008. Retrieved 20 April 2018.
  7. Such as the Voodoo, Voodoo 2, or Riva TNT, or later the more powerful DirectX 6.0 chipsets such as the Voodoo3, RIVA TNT2, and Rage 128.
  8. "id Tech 2". Retrieved 2009-07-05. Dead link; archived 2009-11-08 at https://web.archive.org/web/20091108191715/http://www.idsoftware.com/business/idtech2/
  9. "Half-Life: Improved Technology". GameSpot. Retrieved 2009-07-08. Dead link; archived 2011-02-25 at https://web.archive.org/web/20110225015808/http://uk.gamespot.com/features/halflife_final/part32.html
  10. Paul Lily. "Doom to Dunia: A Visual History of 3D Game Engines". Maximum PC (via PC Gamer), 2009-07-21. Retrieved 2009-07-05.
  11. "History of Unreal - Part 1". beyondunreal.com, 2005-05-31. Retrieved 2009-08-05.
  12. "History of Unreal - Part 1". beyondunreal.com, 2005-05-31. Retrieved 2009-07-05. "Probably the biggest draw to Unreal was the ability to mod it. Tim Sweeney (Founder of Epic) wrote a simple scripting engine into the game called UnrealScript."
  13. "Introduction to Unreal Technology". InformIT, 2009-07-21. Retrieved 2009-08-08.
  14. "Network". Epic Games, 1999-07-21. Retrieved 2009-08-08. Dead link; archived 2010-07-28 at https://web.archive.org/web/20100728233924/http://unreal.epicgames.com/Network.htm
  15. Paul Lily. "Doom to Dunia: A Visual History of 3D Game Engines". Maximum PC (via PC Gamer), 2009-07-21. Retrieved 2009-07-05.
  16. "Playing Dead: Physics in Pop Games". hlhmod.com, 2007. Retrieved 2009-08-09. Dead link; archived 2009-04-01 at https://web.archive.org/web/20090401042848/http://www.hlhmod.com/physics.html
  17. "Postmortem: DreamWorks Interactive's Trespasser". Gamasutra, 1999-05-14. Retrieved 2009-08-09.
  18. "Doom 3". ixbtlabs.com. Retrieved 2009-08-09. "The main advantage of the new system of lighting (besides the mentioned direct control of an artist over its masterpiece) is the capacity to render shadows in real time for every frame (...) Secondly, it's very hard to render muzzy, "soft" shadows prevailing in reality using shadow volumes. (...) Thirdly, summing up the two previous paragraphs we draw a conclusion that shadow volumes do not fit well for rendering shadows at vast open spaces."
  19. "Half-Life 2". Eurogamer, 2004-11-14. Retrieved 2009-08-09. "But yet the incredibly lifelike detail and unparalleled attention to detail in the facial and body animation bring the characters to life like no game has ever even come close to doing. Six years ago there were a handful of facial models, bags of imagination and some great voice work; now we've got a huge cast list who all have plenty to say (with impressively accurate dynamic lip synching) and do so with such an impressive array of visible emotions that infuse the game with a head-turning credibility that will change the way people view games forever."
  20. "From Texture Virtualization to Massive Parallelization". id Software, August 2009. Retrieved 2009-07-07. Dead link; archived 2009-10-07 at https://web.archive.org/web/20091007031619/http://s09.idav.ucdavis.edu/talks/05-JP_id_Tech_5_Challenges.pdf
  21. "Unreal Engine 3: Official Samaritan Demo". IGN via YouTube, 8 March 2011. Retrieved 20 April 2018. Archived 2021-12-22 at https://ghostarchive.org/varchive/youtube/20211222/RSXyztq_0uM
  22. "John Carmack on id Tech 6, Ray Tracing, Consoles, Physics and more". PC Perspective, 2008-03-12. Retrieved 2010-03-27. Dead link; archived 2010-03-14 at https://web.archive.org/web/20100314045408/http://www.pcper.com/article.php?aid=532. "What John does see ray tracing useful for is a very specific data model he has created called "sparse voxel octrees" that allow him to store immense amounts of data in a fashion that is easily accessed using ray tracing methods (...) This new data model and algorithm being worked on for id Tech 6 would allow, according to John, nearly infinite amounts of geometric detail in the world without the problems seen with tessellation engines or trying to store gigabytes of data locally."
  23. "QuakeCon 08 Carmack interview: Rage, id Tech 6, Doom 4 Details, and More!". Maximum PC (via PC Gamer), 2008-07-15. "I still think there's one more generation to be had where we virtualize geometry with id Tech 6 and do some things that are truly revolutionary. (...) I know we can deliver a next-gen kick, if we can virtualize the geometry like we virtualized the textures; we can do things that no one's ever seen in games before."
  24. "Dota 2 - Reborn". Dota2.com. Retrieved 2016-06-23.