Computer simulation explained

Computer simulation is the process of mathematical modeling, performed on a computer, designed to predict the behavior or the outcome of a real-world or physical system. The reliability of some mathematical models can be determined by comparing their results to the real-world outcomes they aim to predict. Computer simulations have become a useful tool for the mathematical modeling of many natural systems in physics (computational physics), astrophysics, climatology, chemistry, biology and manufacturing, as well as human systems in economics, psychology, social science, health care and engineering. Simulation of a system is represented as the running of the system's model. It can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions.[1]

Computer simulations are realized by running computer programs that can be either small, running almost instantly on small devices, or large-scale programs that run for hours or days on network-based groups of computers. The scale of events being simulated by computer simulations has far exceeded anything possible (or perhaps even imaginable) using traditional paper-and-pencil mathematical modeling. In 1997, a desert-battle simulation of one force invading another involved the modeling of 66,239 tanks, trucks and other vehicles on simulated terrain around Kuwait, using multiple supercomputers in the DoD High Performance Computer Modernization Program.[2] Other examples include a 1-billion-atom model of material deformation;[3] a 2.64-million-atom model of the complex protein-producing organelle of all living organisms, the ribosome, in 2005;[4] a complete simulation of the life cycle of Mycoplasma genitalium in 2012; and the Blue Brain project at EPFL (Switzerland), begun in May 2005 to create the first computer simulation of the entire human brain, right down to the molecular level.[5]

Because of the computational cost of simulation, computer experiments are used to perform inference such as uncertainty quantification.[6]

Simulation versus model

A model consists of the equations used to capture the behavior of a system. By contrast, computer simulation is the actual running of a program that performs algorithms to solve those equations, often in an approximate manner. Simulation, therefore, is the process of running a model. Thus one would not "build a simulation"; instead, one would "build a model (or a simulator)", and then either "run the model" or, equivalently, "run a simulation".
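The distinction can be made concrete with a minimal sketch (all names and values here are illustrative): the model is a single equation, and the simulation is the program that steps through an approximate numerical solution of it.

```python
import math

# The *model*: a differential equation, dy/dt = -k * y, describing
# exponential decay (k and y0 are illustrative values).
K = 0.5     # decay rate
Y0 = 100.0  # initial value

def run_simulation(t_end, dt):
    """The *simulation*: running a program that solves the model's
    equation approximately, here with the explicit Euler method."""
    y, t = Y0, 0.0
    while t < t_end:
        y += dt * (-K * y)  # one Euler step of the model equation
        t += dt
    return y

approx = run_simulation(t_end=4.0, dt=0.001)
exact = Y0 * math.exp(-K * 4.0)  # closed-form solution, for comparison
print(f"simulated: {approx:.3f}  analytic: {exact:.3f}")
```

Here the simulated value tracks the analytic solution closely, but not exactly: the approximation error shrinks as the step size `dt` is reduced.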

History

Computer simulation developed hand-in-hand with the rapid growth of the computer, following its first large-scale deployment during the Manhattan Project in World War II to model the process of nuclear detonation; that first simulation modeled 12 hard spheres using a Monte Carlo algorithm. Computer simulation is often used as an adjunct to, or substitute for, modeling systems for which simple closed-form analytic solutions are not possible. There are many types of computer simulations; their common feature is the attempt to generate a sample of representative scenarios for a model in which a complete enumeration of all possible states of the model would be prohibitive or impossible.[7]
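The Monte Carlo idea mentioned above, sampling random configurations instead of enumerating every state, can be illustrated with a classic toy example (not the original hard-sphere code): estimating pi from random points in the unit square.

```python
import random

def estimate_pi(n_samples, seed=0):
    """Monte Carlo estimate of pi: sample random points in the unit
    square and count the fraction landing inside the quarter circle
    of radius 1, whose area is pi/4."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # approaches 3.14159... as samples grow
```

No enumeration of all possible point configurations is needed; a representative random sample suffices, which is exactly what makes the approach tractable for large state spaces.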

Data preparation

The external data requirements of simulations and models vary widely. For some, the input might be just a few numbers (for example, simulation of a waveform of AC electricity on a wire), while others might require terabytes of information (such as weather and climate models).

Input sources also vary widely:

Lastly, the time at which data is available varies:

Because of this variety, and because diverse simulation systems have many common elements, there are a large number of specialized simulation languages. The best-known may be Simula. There are now many others.

Systems that accept data from external sources must be very careful in knowing what they are receiving. While it is easy for computers to read in values from text or binary files, what is much harder is knowing what the accuracy of the values is (compared to measurement resolution and precision). Often such uncertainty is expressed as "error bars": a minimum and maximum deviation defining the range within which the true value is expected to lie. Because digital computer arithmetic is not exact, rounding and truncation errors compound this error, so it is useful to perform an "error analysis"[8] to confirm that values output by the simulation are still usefully accurate.
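A minimal sketch of such an error analysis, with illustrative quantities only: propagate an input's error bars through a calculation and report the induced range of outputs.

```python
def interval_eval(fn, lo, hi, samples=1001):
    """Crude error propagation: evaluate fn across the input's
    error-bar range [lo, hi] and report the induced output range.
    (Adequate here because fn varies smoothly over a small interval.)"""
    ys = [fn(lo + (hi - lo) * i / (samples - 1)) for i in range(samples)]
    return min(ys), max(ys)

# A measured resistance of 100 ohms with +/- 2 ohm error bars
# (illustrative values), fed into the power calculation P = V^2 / R.
V = 12.0
p_lo, p_hi = interval_eval(lambda r: V * V / r, 98.0, 102.0)
print(f"P lies in [{p_lo:.3f}, {p_hi:.3f}] W")
```

The width of the output interval tells the analyst how many digits of the simulated result are actually meaningful, before any rounding or truncation error is even considered.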

Types

Models used for computer simulations can be classified according to several independent pairs of attributes, including:

Another way of categorizing models is to look at the underlying data structures. For time-stepped simulations, there are two main classes:

For steady-state simulations, equations define the relationships between elements of the modeled system and attempt to find a state in which the system is in equilibrium. Such models are often used in simulating physical systems, as a simpler modeling case before dynamic simulation is attempted.
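A minimal sketch of a steady-state solver (the physical setup is illustrative): the discrete equilibrium condition for one-dimensional heat conduction, iterated until the state stops changing.

```python
def steady_state_temperatures(left, right, n_interior, tol=1e-10):
    """Steady-state 1D heat conduction: repeat Jacobi updates until
    every interior point equals the average of its two neighbours,
    which is the discrete equilibrium condition."""
    t = [0.0] * n_interior
    while True:
        new = [
            ((left if i == 0 else t[i - 1]) +
             (right if i == n_interior - 1 else t[i + 1])) / 2.0
            for i in range(n_interior)
        ]
        if max(abs(a - b) for a, b in zip(new, t)) < tol:
            return new
        t = new

# A rod held at 0 degrees on the left and 100 on the right
# (illustrative boundary values): the equilibrium profile is linear.
temps = steady_state_temperatures(0.0, 100.0, 3)
print([round(v, 2) for v in temps])  # -> [25.0, 50.0, 75.0]
```

Note that no time appears in the solved state: the iteration is a numerical device for finding the equilibrium, not a model of the system's transient behavior.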

Visualization

Formerly, the output data from a computer simulation was sometimes presented in a table or a matrix showing how data were affected by numerous changes in the simulation parameters. The use of the matrix format was related to traditional use of the matrix concept in mathematical models. However, psychologists and others noted that humans could quickly perceive trends by looking at graphs or even moving images or motion pictures generated from the data, as displayed by computer-generated imagery (CGI) animation. Although observers could not necessarily read out numbers or quote math formulas, from observing a moving weather chart they might be able to predict events (and "see that rain was headed their way") much faster than by scanning tables of rain-cloud coordinates. Such intense graphical displays, which transcended the world of numbers and formulae, sometimes also led to output that lacked a coordinate grid or omitted timestamps, as if straying too far from numeric data displays. Today, weather forecasting models tend to balance the view of moving rain/snow clouds against a map that uses numeric coordinates and numeric timestamps of events.

Similarly, CGI computer simulations of CAT scans can simulate how a tumor might shrink or change during an extended period of medical treatment, presenting the passage of time as a spinning view of the visible human head, as the tumor changes.

Other applications of CGI computer simulations are being developed to graphically display large amounts of data, in motion, as changes occur during a simulation run.

In science

Generic examples of types of computer simulations in science, which are derived from an underlying mathematical description:

Specific examples of computer simulations include:

Notable, and sometimes controversial, computer simulations used in science include Donella Meadows' World3, used in The Limits to Growth; James Lovelock's Daisyworld; and Thomas Ray's Tierra.

In the social sciences, computer simulation is an integral component of the five angles of analysis fostered by the data percolation methodology,[12] which also includes qualitative and quantitative methods, reviews of the literature (including scholarly literature), and interviews with experts, and which forms an extension of data triangulation. As with any other scientific method, replication is an important part of computational modeling.[13]

In practical contexts

Computer simulations are used in a wide variety of practical contexts, such as:

The reliability of computer simulations, and the trust people put in them, depend on the validity of the simulation model; therefore, verification and validation are of crucial importance in the development of computer simulations. Another important aspect of computer simulations is reproducibility of the results, meaning that a simulation model should not provide a different answer for each execution. Although this might seem obvious, it is a special point of attention in stochastic simulations, where the random numbers should actually be pseudo-random numbers generated deterministically from a fixed seed. An exception to reproducibility is human-in-the-loop simulation, such as flight simulations and computer games: here a human is part of the simulation and thus influences the outcome in a way that is hard, if not impossible, to reproduce exactly.
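The usual way to make a stochastic simulation reproducible is to seed its pseudo-random number generator; a minimal sketch (the random walk is an illustrative stand-in for any stochastic model):

```python
import random

def stochastic_run(seed, n_steps=1000):
    """A toy stochastic simulation (a random walk): with a fixed seed,
    the pseudo-random sequence, and hence the result, is the same on
    every execution."""
    rng = random.Random(seed)  # private generator, isolated from global state
    position = 0
    for _ in range(n_steps):
        position += rng.choice((-1, 1))
    return position

# Same seed, same answer, on every run of the program:
print(stochastic_run(42) == stochastic_run(42))  # -> True
```

Using a private, explicitly seeded generator (rather than the shared global one) also keeps the simulation reproducible when other code in the same process consumes random numbers.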

Vehicle manufacturers make use of computer simulation to test safety features in new designs. By building a copy of the car in a physics simulation environment, they can save the hundreds of thousands of dollars that would otherwise be required to build and test a unique prototype. Engineers can step through the simulation milliseconds at a time to determine the exact stresses being put upon each section of the prototype.[15]

Computer graphics can be used to display the results of a computer simulation. Animations can be used to experience a simulation in real-time, e.g., in training simulations. In some cases animations may also be useful in faster than real-time or even slower than real-time modes. For example, faster than real-time animations can be useful in visualizing the buildup of queues in the simulation of humans evacuating a building. Furthermore, simulation results are often aggregated into static images using various ways of scientific visualization.

In debugging, simulating the execution of the program under test (rather than executing it natively) can detect far more errors than the hardware itself can detect and, at the same time, log useful debugging information such as an instruction trace, memory alterations and instruction counts. This technique can also detect buffer overflows and similar "hard to detect" errors, as well as produce performance information and tuning data.
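A minimal sketch of the idea, using a toy instruction set rather than real machine code: an interpreter that records a trace, guards memory accesses, and counts instructions as it runs.

```python
def simulate(program, memory):
    """Interpret a toy instruction set while logging debug information a
    native run would not provide: an instruction trace, bounds-checked
    memory writes, and per-opcode counts. Instructions are tuples."""
    trace, counts = [], {}
    for pc, (op, *args) in enumerate(program):
        trace.append((pc, op, tuple(args)))
        counts[op] = counts.get(op, 0) + 1
        if op == "store":
            addr, value = args
            if not 0 <= addr < len(memory):  # caught here, not a silent overwrite
                raise IndexError(f"buffer overflow at pc={pc}: addr {addr}")
            memory[addr] = value
        elif op == "add":
            addr, delta = args
            memory[addr] += delta
    return trace, counts

mem = [0] * 4
trace, counts = simulate(
    [("store", 0, 7), ("add", 0, 5), ("store", 3, 1)], mem)
print(mem, counts)  # -> [12, 0, 0, 1] {'store': 2, 'add': 1}
```

An out-of-bounds `store` is reported with the exact program counter at which it occurred, whereas native execution might corrupt adjacent memory and fail much later; the opcode counts double as crude tuning data.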

Pitfalls

Although sometimes ignored in computer simulations, it is very important to perform a sensitivity analysis to ensure that the accuracy of the results is properly understood. For example, the probabilistic risk analysis of factors determining the success of an oilfield exploration program involves combining samples from a variety of statistical distributions using the Monte Carlo method. If, for instance, one of the key parameters (e.g., the net ratio of oil-bearing strata) is known to only one significant figure, then the result of the simulation might not be more precise than one significant figure, although it might (misleadingly) be presented as having four significant figures.
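The pitfall can be sketched numerically (the distributions and values below are illustrative, not a real oilfield model): when one Monte Carlo input is known only to one significant figure, the spread of the output makes a four-figure point estimate misleading.

```python
import random
import statistics

rng = random.Random(0)

def recoverable_volume():
    """Toy probabilistic risk model: volume * net ratio * recovery.
    The net ratio is known only to one significant figure (~0.3),
    modelled here as uniform on [0.25, 0.35)."""
    volume = rng.gauss(1000.0, 50.0)     # well-constrained input
    net_ratio = rng.uniform(0.25, 0.35)  # known to 1 significant figure
    recovery = rng.gauss(0.20, 0.01)     # well-constrained input
    return volume * net_ratio * recovery

samples = [recoverable_volume() for _ in range(100_000)]
mean = statistics.mean(samples)
spread = statistics.stdev(samples)
# Quoting the mean to four significant figures would be misleading:
# the standard deviation is a sizeable fraction of the mean.
print(f"mean = {mean:.4g}, stdev = {spread:.3g}")
```

A sensitivity analysis would repeat the run while tightening each input in turn, revealing that the poorly known net ratio dominates the output uncertainty.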

See also

Further reading

External links

Notes and References

  1. Strogatz, Steven. "The End of Insight." In Brockman, John (ed.), What Is Your Dangerous Idea? HarperCollins, 2007. ISBN 9780061214950.
  2. "Researchers stage largest Military Simulation ever." December 4, 1997. Archived at https://web.archive.org/web/20080122123958/http://www.jpl.nasa.gov/releases/97/military.html (2008-01-22).
  3. "Molecular Simulation of Macroscopic Phenomena." IBM Research - Almaden. Archived at https://web.archive.org/web/20130522082737/http://www.almaden.ibm.com/st/past_projects/fractures/ (2013-05-22).
  4. Ambrosiano, Nancy. "Largest computational biology simulation mimics life's most essential nanomachine." Los Alamos, NM, October 19, 2005. Archived at https://web.archive.org/web/20070704061957/http://www.lanl.gov/news/index.php/fuseaction/home.story/story_id/7428 (2007-07-04).
  5. Graham-Rowe, Duncan. "Mission to build a simulated brain begins." June 6, 2005. Archived at https://web.archive.org/web/20150209125048/http://www.newscientist.com/article/dn7470.html (2015-02-09).
  6. Santner, Thomas J.; Williams, Brian J.; Notz, William I. The Design and Analysis of Computer Experiments. Springer Verlag, 2003.
  7. Bratley, Paul; Fox, Bennet L.; Schrage, Linus E. A Guide to Simulation. Springer Science & Business Media, 2011. ISBN 9781441987242.
  8. Taylor, John Robert. An Introduction to Error Analysis: The Study of Uncertainties in Physical Measurements. University Science Books, 1999, pp. 128–129. ISBN 978-0-935702-75-0. Archived at https://web.archive.org/web/20150316103343/http://books.google.com/books?id=giFQcZub80oC&pg=PA128 (2015-03-16).
  9. Gupta, Ankur; Rawlings, James B. "Comparison of Parameter Estimation Methods in Stochastic Chemical Kinetic Models: Examples in Systems Biology." AIChE Journal 60(4), April 2014, pp. 1253–1268. doi:10.1002/aic.14409.
  10. Atanasov, AG; Waltenberger, B; Pferschy-Wenzig, EM; Linder, T; Wawrosch, C; Uhrin, P; Temml, V; Wang, L; Schwaiger, S; Heiss, EH; Rollinger, JM; Schuster, D; Breuss, JM; Bochkov, V; Mihovilovic, MD; Kopp, B; Bauer, R; Dirsch, VM; Stuppner, H. "Discovery and resupply of pharmacologically active plant-derived natural products: A review." Biotechnology Advances 33(8), 2015, pp. 1582–1614. doi:10.1016/j.biotechadv.2015.08.001.
  11. Mizukami, Koichi; Saito, Fumio; Baron, Michel. "Study on grinding of pharmaceutical products with an aid of computer simulation."
  12. Mesly, Olivier (2015). Creating Models in Psychological Research. United States: Springer Psychology. 126 pages.
  13. Wilensky, Uri; Rand, William (2007). "Making Models Match: Replicating an Agent-Based Model." Journal of Artificial Societies and Social Simulation 10(4), 2.
  14. Wescott, Bob. The Every Computer Performance Book, Chapter 7: Modeling Computer Performance. 2013. ISBN 978-1482657753.
  15. Baase, Sara. A Gift of Fire: Social, Legal, and Ethical Issues for Computing and the Internet. 3rd ed. Upper Saddle River: Prentice Hall, 2007, pp. 363–364.