Nuclear winter is a severe and prolonged global climatic cooling effect that is hypothesized[1] to occur after widespread firestorms following a large-scale nuclear war.[2] The hypothesis is based on the fact that such fires can inject soot into the stratosphere, where it can block some direct sunlight from reaching the surface of the Earth. It is speculated that the resulting cooling would lead to widespread crop failure and famine.[3] [4] When developing computer models of nuclear-winter scenarios, researchers use the conventional bombing of Hamburg and the Hiroshima firestorm in World War II as example cases where soot might have been injected into the stratosphere, alongside modern observations of natural, large-area wildfire-firestorms.[2] [5] [6]
"Nuclear winter", or as it was initially termed, "nuclear twilight", began to be considered as a scientific concept in the 1980s after it became clear that an earlier hypothesis predicting that fireball generated NOx emissions would devastate the ozone layer was losing credibility. It was within this context that the climatic effects of soot from fires became the new focus of the climatic effects of nuclear war.[7] In these model scenarios, various soot clouds containing uncertain quantities of soot were assumed to form over cities, oil refineries, and more rural missile silos. Once the quantity of soot is decided upon by the researchers, the climate effects of these soot clouds are then modeled.[8] The term "nuclear winter" was a neologism coined in 1983 by Richard P. Turco in reference to a one-dimensional computer model created to examine the "nuclear twilight" idea. This model projected that massive quantities of soot and smoke would remain aloft in the air for on the order of years, causing a severe planet-wide drop in temperature.
After the failure of the predictions made for the 1991 Kuwait oil fires by the primary team of climatologists advocating the hypothesis, over a decade passed without new published papers on the topic. More recently, the same team of prominent modellers from the 1980s began publishing the outputs of new computer models. These newer models produce the same general findings as the old ones: the ignition of 100 firestorms, each comparable in intensity to the one observed in Hiroshima in 1945, could produce a "small" nuclear winter.[9] These firestorms would inject soot (specifically black carbon) into the Earth's stratosphere, producing an anti-greenhouse effect that would lower the Earth's surface temperature. The severity of this cooling in Alan Robock's model suggests that the cumulative products of 100 of these firestorms could cool the global climate by approximately 1 °C (1.8 °F), largely offsetting anthropogenic global warming for roughly two or three years.[10] Robock and his collaborators have modeled the effect on global food production and project that the injection of more than 5 Tg of soot into the stratosphere would lead to mass food shortages persisting for several years. According to their model, livestock and aquatic food production would be unable to compensate for reduced crop output in almost all countries, and adaptation measures such as food waste reduction would do little to increase available calories.[11] [12]
As nuclear devices need not be detonated to ignite a firestorm, the term "nuclear winter" is something of a misnomer. The majority of papers published on the subject assume, without detailed justification, that nuclear explosions would be the cause of the modeled firestorms. The only phenomenon that is modeled by computer in the nuclear winter papers is the climate forcing agent of firestorm soot, a product which can be ignited and formed by a variety of means. Although rarely discussed, the proponents of the hypothesis state that the same "nuclear winter" effect would occur if 100 large-scale conventional firestorms were ignited.[13]
A much larger number of firestorms, in the thousands, was the initial assumption of the computer modelers who coined the term in the 1980s. These were speculated to be a possible result of any large-scale counter-value use of airburst nuclear weapons during an American-Soviet total war. This larger number of firestorms, which are not themselves modeled, is presented as causing nuclear winter conditions as a result of the smoke fed into various climate models, with severe cooling lasting for as long as a decade. During this period, summer drops in average temperature could be up to 20 °C (36 °F) in core agricultural regions of the US, Europe, and China, and as much as 35 °C (63 °F) in Russia.[14] This cooling would result from a 99% reduction in the natural solar radiation reaching the surface of the planet in the first few years, gradually clearing over the course of several decades.
At the fundamental level, it has been known since photographic evidence of tall fire-generated clouds was first captured[15] that firestorms can inject soot aerosols into the stratosphere, but the longevity of these aerosols was a major unknown. Independently of the team that continues to publish theoretical models on nuclear winter, Mike Fromm of the Naval Research Laboratory found experimentally in 2006 that each natural occurrence of a massive wildfire firestorm, much larger than that observed at Hiroshima, can produce minor "nuclear winter" effects: a short-lived, approximately month-long, nearly immeasurable drop in surface temperatures, confined to the hemisphere in which the fire burned.[16] [17] [18] This is somewhat analogous to the frequent volcanic eruptions that inject sulfates into the stratosphere and thereby produce minor, even negligible, volcanic winter effects.
A suite of satellite and aircraft-based firestorm-soot-monitoring instruments is at the forefront of attempts to accurately determine the lifespan, quantity, injection height, and optical properties of this smoke.[19] [20] [21] [22] [23] Information on all of these properties is necessary to truly ascertain the length and severity of the cooling effect of firestorms, independent of the nuclear winter computer model projections.
Currently, satellite tracking data suggest that stratospheric smoke aerosols dissipate in a time span of under approximately two months. Whether a tipping point exists into a new stratospheric condition in which the aerosols would not be removed within this time frame remains to be determined.
See main article: Pyrocumulonimbus cloud. The nuclear winter scenario assumes that 100 or more city firestorms[24] [25] are ignited by nuclear explosions, and that the firestorms lift large amounts of sooty smoke into the upper troposphere and lower stratosphere via the pyrocumulonimbus clouds that form during a firestorm. At roughly 10 km (6 mi) or more above the Earth's surface, the absorption of sunlight could further heat the soot in the smoke, lifting some or all of it into the stratosphere, where the smoke could persist for years if there is no rain to wash it out. This aerosol of particles could heat the stratosphere and prevent a portion of the sun's light from reaching the surface, causing surface temperatures to drop drastically. In this scenario, surface air temperatures are predicted to be the same as, or colder than, a given region's winter for months to years on end.
The modeled stable inversion layer of hot soot between the troposphere and high stratosphere that produces the anti-greenhouse effect was dubbed the "Smokeosphere" by Stephen Schneider et al. in their 1988 paper.[1] [26]
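The basic direction of the anti-greenhouse effect can be illustrated with a zero-dimensional energy balance, given below as a rough sketch rather than as any of the published models; the solar constant and planetary albedo are standard textbook values, and the relation ignores greenhouse feedbacks, heat transport, and ocean thermal inertia, all of which the full models include.

```latex
% Zero-dimensional energy balance (illustrative only): if a stratospheric
% soot layer transmits a fraction (1 - f) of incoming sunlight, the planet's
% effective emission temperature scales as the fourth root of (1 - f).
\begin{align*}
T_{\mathrm{eff}}(f) &= \left[\frac{(1-f)\,S_0\,(1-\alpha)}{4\sigma}\right]^{1/4}
                     = T_{\mathrm{eff}}(0)\,(1-f)^{1/4}
\end{align*}
% With S_0 \approx 1361\ \mathrm{W\,m^{-2}} and albedo \alpha \approx 0.3,
% T_eff(0) is about 255 K, so blocking 10% of sunlight (f = 0.1) lowers it
% by roughly 7 K. Real models add greenhouse feedbacks, atmospheric heat
% transport, and ocean thermal inertia on top of this simple scaling.
```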
Although it is common in the climate models to consider city firestorms, these need not be ignited by nuclear devices; more conventional ignition sources can serve instead. Prior to the previously mentioned solar heating effect, the soot's injection height is controlled by the rate of energy release from the firestorm's fuel, not by the size of the initial nuclear explosion. For example, the mushroom cloud from the bomb dropped on Hiroshima reached a height of six kilometers (middle troposphere) within a few minutes and then dissipated due to winds, while the individual fires within the city took almost three hours to form into a firestorm and produce a pyrocumulus cloud. That cloud is assumed to have reached upper-tropospheric heights, as over its multiple hours of burning the firestorm released an estimated 1,000 times the energy of the bomb.
As the incendiary effects of a nuclear explosion do not present any especially characteristic features, it was estimated by those with strategic bombing experience that, because the city was a firestorm hazard, the same fire ferocity and building damage produced at Hiroshima by one 16-kiloton nuclear bomb from a single B-29 bomber could have been produced instead by about 1.2 kilotons of incendiary bombs delivered by 220 B-29s distributed over the city.[27] [28]
While the firestorms of Dresden and Hiroshima and the mass fires of Tokyo and Nagasaki all occurred within a few months of each other in 1945, the more intense and conventionally ignited Hamburg firestorm occurred in 1943. Despite the separation in time, ferocity, and area burned, leading modelers of the hypothesis state that these five fires potentially placed five percent as much smoke into the stratosphere as the hypothetical 100 nuclear-ignited fires discussed in modern models.[13] While the modeled climate-cooling effects of the mass of soot injected into the stratosphere by 100 firestorms (one to five million metric tons) are believed to have been detectable with the technical instruments of WWII, five percent of that amount would not have been observable at the time.[13]
How long this smoke remains aloft, and thus how severely it affects the climate once it reaches the stratosphere, depends on both chemical and physical removal processes.
The most important physical removal mechanism is "rainout", both during the "fire-driven convective column" phase, which produces "black rain" near the fire site, and rainout after the convective plume's dispersal, where the smoke is no longer concentrated and thus "wet removal" is believed to be very efficient. However, these efficient removal mechanisms in the troposphere are avoided in the Robock 2007 study, where solar heating is modeled to quickly loft the soot into the stratosphere, "detraining" or separating the darker soot particles from the fire clouds' whiter water condensation.
Once in the stratosphere, the physical mechanisms that set the soot particles' residence time are the rate at which the soot aerosol collides and coagulates with other particles via Brownian motion[29] and falls out of the atmosphere via gravity-driven dry deposition, and the time it takes for the "phoretic effect" to move coagulated particles to a lower level in the atmosphere.[8] Whether by coagulation or by the phoretic effect, once the smoke particles reach this lower atmospheric level, cloud seeding can begin, permitting precipitation to wash the smoke aerosol out of the atmosphere by the wet deposition mechanism.
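The gravitational settling term, at least, can be sketched with textbook aerosol physics. The following illustration, with assumed (not source-derived) values for particle size, density, air viscosity, and mean free path, shows why an individual sub-micrometer soot particle falls out of the stratosphere only very slowly on its own:

```python
# Illustrative sketch of gravity-driven dry deposition: the Stokes settling
# speed of a small soot particle in the mid-stratosphere, with the Cunningham
# slip correction (important because the air's mean free path there exceeds
# the particle size). All numbers are representative assumptions, not values
# taken from any nuclear winter model.
import math

r = 0.1e-6      # particle radius, m (assumed ~0.1 micrometer soot sphere)
rho_p = 1800.0  # assumed particle density, kg/m^3
mu = 1.4e-5     # dynamic viscosity of air near 220 K, Pa*s
g = 9.81        # gravitational acceleration, m/s^2
mfp = 2.7e-6    # assumed air mean free path near 25 km altitude, m

kn = mfp / r                                       # Knudsen number
cc = 1 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))  # Cunningham slip correction
v = (2.0 / 9.0) * r**2 * rho_p * g * cc / mu       # settling speed, m/s

print(f"settling speed: {v:.2e} m/s (~{v * 86400 * 365 / 1000:.1f} km/yr)")
# Under these assumptions the particle falls only a few kilometres per year,
# which is why coagulation into larger, faster-falling particles and the
# phoretic effect matter so much for the stratospheric residence time.
```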
The chemical processes that affect the removal are dependent on the ability of atmospheric chemistry to oxidize the carbonaceous component of the smoke, via reactions with oxidative species such as ozone and nitrogen oxides, both of which are found at all levels of the atmosphere,[30] [31] and which also occur at greater concentrations when air is heated to high temperatures.
Historical data on the residence times of aerosols, albeit a different mixture of aerosols, in this case stratospheric sulfur aerosols and volcanic ash from megavolcano eruptions, suggest a one-to-two-year time scale;[32] however, aerosol–atmosphere interactions are still poorly understood.[33] [34]
See also: Tihomir Novakov and Aethalometer. Sooty aerosols can have a wide range of properties as well as complex shapes, making it difficult to determine their evolving atmospheric optical depth. The conditions present during the soot's creation are believed to be considerably important to its final properties: soot generated at the more efficient end of the burning spectrum is considered almost "elemental carbon black", while at the more inefficient end of the spectrum, greater quantities of partially burnt/oxidized fuel are present. These partially burnt "organics", as they are known, often form tar balls and brown carbon during common lower-intensity wildfires, and can also coat the purer black carbon particles.[35] [36] [37] However, as the soot of greatest importance is that injected to the highest altitudes by the pyroconvection of the firestorm (a fire fed with storm-force winds of air), it is estimated that the majority of the soot under these conditions is the more oxidized black carbon.[38]
A study presented at the annual meeting of the American Geophysical Union in December 2006 found that even a small-scale, regional nuclear war could disrupt the global climate for a decade or more. In a regional nuclear conflict scenario in which two opposing nations in the subtropics each used 50 Hiroshima-sized nuclear weapons (about 15 kilotons each) on major population centers, the researchers estimated that as much as five million tons of soot would be released, producing a cooling of several degrees over large areas of North America and Eurasia, including most of the grain-growing regions. The cooling would last for years and, according to the research, could be "catastrophic",[39] disrupting agricultural production and food supplies, particularly in higher-latitude countries.[40] [11]
Nuclear detonations produce large amounts of nitrogen oxides by breaking down the air around them. These are then lifted upwards by thermal convection. As they reach the stratosphere, these nitrogen oxides are capable of catalytically breaking down the ozone present in this part of the atmosphere. Ozone depletion would allow a much greater intensity of harmful ultraviolet radiation from the sun to reach the ground.[41]
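The catalysis proceeds through the standard NOx cycle of stratospheric chemistry, sketched below; because the nitric oxide is regenerated at the end of each pass, a single molecule can destroy many ozone molecules:

```latex
% The textbook NOx catalytic ozone-destruction cycle (standard stratospheric
% chemistry, not specific to any study cited here):
\begin{align*}
\mathrm{NO} + \mathrm{O_3} &\longrightarrow \mathrm{NO_2} + \mathrm{O_2} \\
\mathrm{NO_2} + \mathrm{O} &\longrightarrow \mathrm{NO} + \mathrm{O_2} \\
\text{net:}\qquad \mathrm{O_3} + \mathrm{O} &\longrightarrow 2\,\mathrm{O_2}
\end{align*}
```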
A 2008 study by Michael J. Mills et al., published in the Proceedings of the National Academy of Sciences, found that a nuclear weapons exchange between Pakistan and India using their current arsenals could create a near-global ozone hole, triggering human health problems and causing environmental damage for at least a decade. The computer-modeled study looked at a nuclear war between the two countries involving 50 Hiroshima-sized nuclear devices on each side, producing massive urban fires and lofting as much as five million metric tons of soot about 50 miles (80 km) into the stratosphere. The soot would absorb enough solar radiation to heat surrounding gases, increasing the breakdown of the stratospheric ozone layer that protects Earth from harmful ultraviolet radiation, with up to 70% ozone loss at northern high latitudes.[42]
A "nuclear summer" is a hypothesized scenario in which, after a nuclear winter caused by aerosols inserted into the atmosphere that would prevent sunlight from reaching lower levels or the surface,[43] has abated, a greenhouse effect then occurs due to carbon dioxide released by combustion and methane released from the decay of the organic matter such as corpses that froze during the nuclear winter.[43] [44]
In another, more sequential hypothetical scenario, following the settling out of most of the aerosols in 1–3 years, the cooling effect would be overcome by a heating effect from greenhouse warming, which would raise surface temperatures rapidly by many degrees, enough to cause the death of much if not most of the life that had survived the cooling, much of which is more vulnerable to higher-than-normal temperatures than to lower-than-normal temperatures. The nuclear detonations would release CO2 and other greenhouse gases from burning, followed by more released from the decay of dead organic matter. The detonations would also insert nitrogen oxides into the stratosphere that would then deplete the ozone layer around the Earth.[43]
Other, more straightforward versions of the hypothesis that nuclear winter might give way to a nuclear summer also exist: for example, the high temperatures of the nuclear fireballs could destroy the ozone gas of the middle stratosphere.[44]
In 1952, a few weeks prior to the Ivy Mike (10.4 megaton) bomb test on Elugelab Island, there were concerns that the aerosols lifted by the explosion might cool the Earth. Major Norair Lulejian, USAF, and astronomer Natarajan Visvanathan studied this possibility, reporting their findings in Effects of Superweapons Upon the Climate of the World, the distribution of which was tightly controlled. This report is described in a 2013 report by the Defense Threat Reduction Agency as the initial study of the "nuclear winter" concept. It indicated no appreciable chance of explosion-induced climate change.
The implications for civil defense of numerous surface bursts of high yield hydrogen bomb explosions on Pacific Proving Ground islands such as those of Ivy Mike in 1952 and Castle Bravo (15 Mt) in 1954 were described in a 1957 report on The Effects of Nuclear Weapons, edited by Samuel Glasstone. A section in that book entitled "Nuclear Bombs and the Weather" states: "The dust raised in severe volcanic eruptions, such as that at Krakatoa in 1883, is known to cause a noticeable reduction in the sunlight reaching the earth ... The amount of [soil or other surface] debris remaining in the atmosphere after the explosion of even the largest nuclear weapons is probably not more than about one percent or so of that raised by the Krakatoa eruption. Further, solar radiation records reveal that none of the nuclear explosions to date has resulted in any detectable change in the direct sunlight recorded on the ground."[45] The US Weather Bureau in 1956 regarded it as conceivable that a large enough nuclear war with megaton-range surface detonations could lift enough soil to cause a new ice age.[46]
The 1966 RAND corporation memorandum The Effects of Nuclear War on the Weather and Climate by E. S. Batten, while primarily analysing potential dust effects from surface bursts, notes that "in addition to the effects of the debris, extensive fires ignited by nuclear detonations might change the surface characteristics of the area and modify local weather patterns ... however, a more thorough knowledge of the atmosphere is necessary to determine their exact nature, extent, and magnitude."[47]
The United States National Research Council (NRC) book Long-Term Worldwide Effects of Multiple Nuclear-Weapons Detonations, published in 1975, states that a nuclear war involving 4,000 Mt from the then-present arsenals would probably deposit much less dust in the stratosphere than the Krakatoa eruption, judging that the effect of dust and oxides of nitrogen would probably be slight climatic cooling which "would probably lie within normal global climatic variability, but the possibility of climatic changes of a more dramatic nature cannot be ruled out".[48]
In the 1985 report The Effects on the Atmosphere of a Major Nuclear Exchange, the Committee on the Atmospheric Effects of Nuclear Explosions argues that a "plausible" estimate of the amount of stratospheric dust injected following a surface burst of 1 Mt is 0.3 teragrams, of which 8 percent would be in the micrometer range. The potential cooling from soil dust was examined again in 1992, in a US National Academy of Sciences (NAS)[49] report on geoengineering, which estimated that about 10¹⁰ kg (10 teragrams) of stratospherically injected soil dust with particulate grain dimensions of 0.1 to 1 micrometer would be required to mitigate the warming from a doubling of atmospheric carbon dioxide, that is, to produce ~2 °C of cooling.[50]
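Taken at face value, the 1985 and 1992 figures can be combined in a back-of-the-envelope way. The sketch below is illustrative arithmetic only and assumes, among other simplifications, that the 8 percent of bomb-lofted dust in the micrometer range is directly comparable to the 0.1–1 micrometer geoengineering dust:

```python
# Back-of-the-envelope combination of the 1985 and 1992 report figures.
# Assumes the two kinds of dust are directly comparable, which is a
# simplification; illustrative arithmetic only.
dust_per_mt = 0.3          # Tg of stratospheric dust per 1 Mt surface burst (1985 report)
fine_fraction = 0.08       # fraction in the micrometer range (1985 report)
target = 10.0              # Tg of 0.1-1 um dust for ~2 C of cooling (1992 NAS estimate)

effective_per_mt = dust_per_mt * fine_fraction  # ~0.024 Tg of fine dust per Mt
yield_needed = target / effective_per_mt        # ~420 Mt of surface bursts

print(f"{effective_per_mt:.3f} Tg of fine dust per Mt; "
      f"~{yield_needed:.0f} Mt of surface bursts to loft {target} Tg")
```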
In 1969, Paul Crutzen discovered that oxides of nitrogen (NOx) could be an efficient catalyst for the destruction of the ozone layer/stratospheric ozone. Following 1970s studies on the potential effects of NOx generated by the engine heat of stratosphere-flying supersonic transport (SST) airplanes, John Hampson suggested in the journal Nature in 1974 that, owing to the creation of atmospheric NOx by nuclear fireballs, a full-scale nuclear exchange could result in depletion of the ozone shield, possibly subjecting the Earth to increased ultraviolet radiation for a year or more.[51] In 1975, Hampson's hypothesis "led directly" to the United States National Research Council (NRC) reporting on models of ozone depletion following nuclear war in the book Long-Term Worldwide Effects of Multiple Nuclear-Weapons Detonations.
In the section of this 1975 NRC book pertaining to the issue of fireball-generated NOx and the resulting ozone layer loss, the NRC presented model calculations from the early-to-mid 1970s on the effects of a nuclear war involving large numbers of multi-megaton-yield detonations, which concluded that this could reduce ozone levels by 50 percent or more in the northern hemisphere.
However, independent of the computer models presented in the 1975 NRC work, a 1973 paper in the journal Nature depicted worldwide stratospheric ozone levels overlaid on the number of nuclear detonations during the era of atmospheric testing. The authors concluded that neither the data nor their models showed any correlation between the approximately 500 Mt of historical atmospheric testing and an increase or decrease in ozone concentration.[52] In 1976, a study of experimental measurements of an earlier atmospheric nuclear test as it affected the ozone layer likewise found that nuclear detonations were exonerated of depleting ozone, contradicting the initially alarming model calculations of the time.[53] Similarly, a 1981 paper found that the models of ozone destruction from one test disagreed with the physical measurements taken, as no destruction was observed.[54]
In total, about 500 Mt were atmospherically detonated between 1945 and 1971,[55] peaking in 1961–1962, when 340 Mt were detonated in the atmosphere by the United States and the Soviet Union.[56] Examining only the multi-megaton-range detonations of the two nations' test series during this peak, an estimated total yield of 300 Mt was released. As a result, 3 × 10³⁴ additional molecules of nitric oxide (about 5,000 tons per Mt, or 5 × 10⁹ grams per megaton)[52] [57] are believed to have entered the stratosphere, and while an ozone depletion of 2.2 percent was noted in 1963, the decline had started prior to 1961 and is believed to have been caused by other meteorological effects.[52]
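The quoted molecule count is consistent with simple stoichiometry. The following check is illustrative arithmetic only, using the stated emission factor of 5 × 10⁹ g of NO per megaton and the molar mass of NO, about 30 g/mol:

```python
# Consistency check on the quoted figure of ~3e34 NO molecules from ~300 Mt
# of atmospheric testing, using the stated 5e9 g of NO per megaton.
AVOGADRO = 6.022e23  # molecules per mole
M_NO = 30.0          # g/mol, molar mass of nitric oxide (14 + 16)

total_yield_mt = 300.0                  # Mt detonated during the 1961-62 peak
no_grams = 5e9 * total_yield_mt         # 1.5e12 g of NO in total
no_molecules = no_grams / M_NO * AVOGADRO

print(f"{no_molecules:.1e} molecules")  # ~3.0e34, matching the quoted value
```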
In 1982, journalist Jonathan Schell, in his popular and influential book The Fate of the Earth, introduced the public to the belief that fireball-generated NOx would destroy the ozone layer to such an extent that crops would fail from solar UV radiation, and painted the fate of the Earth accordingly, with plant and aquatic life going extinct. In the same year, Australian physicist Brian Martin, who frequently corresponded with John Hampson (who had been greatly responsible for much of the examination of NOx generation), penned a short historical synopsis of the interest in the effects of the NOx generated directly by nuclear fireballs. In doing so, he also outlined Hampson's other non-mainstream viewpoints, particularly those relating to greater ozone destruction from upper-atmospheric detonations resulting from any widely used anti-ballistic missile (ABM-1 Galosh) system.[58] However, Martin ultimately concluded that it is "unlikely that in the context of a major nuclear war" ozone degradation would be of serious concern. Martin described views that potential ozone loss, and the resulting increase in ultraviolet light, would lead to the widespread destruction of crops, as advocated by Jonathan Schell in The Fate of the Earth, as highly unlikely.
More recent assessments place the ozone-destruction potential of NOx species much lower than earlier simplistic calculations assumed, as "about 1.2 million tons" of natural and anthropogenic stratospheric NOx is believed to form each year, according to Robert P. Parson in the 1990s.[59]
The first published suggestion that cooling of the climate could be an effect of a nuclear war appears to have been put forth by Poul Anderson and F. N. Waldrop in their story "Tomorrow's Children", in the March 1947 issue of Astounding Science Fiction magazine. The story, primarily about a team of scientists hunting down mutants,[60] warns of a "Fimbulwinter" caused by dust that blocked sunlight after a recent nuclear war, and speculates that it might even trigger a new Ice Age.[61] [62] Anderson went on to publish a novel based partly on this story in 1961, titling it Twilight World. Similarly, in 1985 T. G. Parsons noted that the story "Torch" by C. Anvil, which appeared in the April 1957 edition of the same magazine, contains the essence of the "Twilight at Noon"/"nuclear winter" hypothesis. In the story, a nuclear warhead ignites an oil field, and the soot produced "screens out part of the sun's radiation", resulting in Arctic temperatures for much of the population of North America and the Soviet Union.
The 1988 Air Force Geophysics Laboratory publication An assessment of global atmospheric effects of a major nuclear war, by H. S. Muench et al., contains a chronology and review of the major reports on the nuclear winter hypothesis from 1983 to 1986. In general, these reports arrive at similar conclusions because they are based on "the same assumptions, the same basic data", with only minor model-code differences. They skip the modeling steps of assessing the possibility of fire and the initial fire plumes, and instead start the modeling process with a "spatially uniform soot cloud" that has found its way into the atmosphere.
Although never openly acknowledged by the multi-disciplinary team who authored the most popular 1980s TTAPS model, the American Institute of Physics stated in 2011 that the TTAPS team (named for its participants, who had all previously worked on the phenomenon of dust storms on Mars or in the area of asteroid impact events: Richard P. Turco, Owen Toon, Thomas P. Ackerman, James B. Pollack and Carl Sagan) made its 1983 announcement of results "with the explicit aim of promoting international arms control".[63] However, "the computer models were so simplified, and the data on smoke and other aerosols were still so poor, that the scientists could say nothing for certain".[63]
In 1981, William J. Moran began discussions and research in the National Research Council (NRC) on the airborne soil/dust effects of a large exchange of nuclear warheads, having seen a possible parallel between the dust effects of such a war and those of the asteroid impact marking the K-T boundary, which Luis Alvarez's analysis had popularized in 1980. An NRC study panel on the topic met in December 1981 and April 1982 in preparation for the release of the NRC's The Effects on the Atmosphere of a Major Nuclear Exchange, published in 1985.
As part of a study on the creation of oxidizing species such as NOx and ozone in the troposphere after a nuclear war,[64] launched in 1980 by Ambio, a journal of the Royal Swedish Academy of Sciences, Paul J. Crutzen and John W. Birks began preparing for the 1982 publication of a calculation of the effects of nuclear war on stratospheric ozone, using the latest models of the time. However, they found that, as a result of the trend towards more numerous but less energetic, sub-megaton-range nuclear warheads (made possible by steady increases in ICBM warhead accuracy), the ozone layer danger was "not very significant".[7]
It was after being confronted with these results that they "chanced" upon the notion, as "an afterthought", of nuclear detonations igniting massive fires everywhere and, crucially, the smoke from these conventional fires then going on to absorb sunlight, causing surface temperatures to plummet.[7] In early 1982, the two circulated a draft paper with the first suggestions of alterations in short-term climate from fires presumed to occur following a nuclear war. Later in the same year, Crutzen and Birks' paper in the special issue of Ambio devoted to the possible environmental consequences of nuclear war was titled "The Atmosphere after a Nuclear War: Twilight at Noon", and it largely anticipated the nuclear winter hypothesis. The paper looked into fires and their climatic effect, and discussed particulate matter from large fires, nitrogen oxide, ozone depletion and the effect of nuclear twilight on agriculture. Crutzen and Birks' calculations suggested that smoke particulates injected into the atmosphere by fires in cities, forests and petroleum reserves could prevent up to 99 percent of sunlight from reaching the Earth's surface. This darkness, they said, could exist "for as long as the fires burned", which was assumed to be many weeks, with effects such as: "The normal dynamic and temperature structure of the atmosphere would...change considerably over a large fraction of the Northern Hemisphere, which will probably lead to important changes in land surface temperatures and wind systems." An implication of their work was that a successful nuclear decapitation strike could have severe climatic consequences for the perpetrator.
After reading a paper by N. P. Bochkov and E. I. Chazov,[65] published in the same edition of Ambio that carried Crutzen and Birks's paper "Twilight at Noon", Soviet atmospheric scientist Georgy Golitsyn applied his research on Mars dust storms to soot in the Earth's atmosphere. The use of these influential Martian dust storm models in nuclear winter research began in 1971,[66] when the Soviet spacecraft Mars 2 arrived at the red planet and observed a global dust cloud. The orbiting instruments together with the 1971 Mars 3 lander determined that temperatures on the surface of the red planet were considerably colder than temperatures at the top of the dust cloud. Following these observations, Golitsyn received two telegrams from astronomer Carl Sagan, in which Sagan asked Golitsyn to "explore the understanding and assessment of this phenomenon". Golitsyn recounts that it was around this time that he had "proposed a theory to explain how Martian dust may be formed and how it may reach global proportions."[66]
In the same year, Alexander Ginzburg,[67] an employee at Golitsyn's institute, developed a model of dust storms to describe the cooling phenomenon on Mars. Golitsyn felt that this model would be applicable to soot after he read a 1982 Swedish magazine issue dedicated to the effects of a hypothetical nuclear war between the USSR and the US.[66] Golitsyn used Ginzburg's largely unmodified dust-cloud model with soot assumed as the aerosol in place of soil dust, and, just as in the Martian dust-cloud computations, the cloud high above the planet would be heated while the planet below cooled drastically. Golitsyn presented his intent to publish this Martian-derived Earth-analog model in May 1983 to the Andropov-instigated Committee of Soviet Scientists in Defence of Peace Against the Nuclear Threat, an organization of which Golitsyn would later be appointed vice-chairman. The committee had been established with the express approval of the Soviet leadership with the intent "to expand controlled contacts with Western 'nuclear freeze' activists".[68] Having gained the committee's approval, in September 1983 Golitsyn published the first computer model of the nascent "nuclear winter" effect in the widely read Herald of the Russian Academy of Sciences.[69]
On 31 October 1983, Golitsyn and Ginzburg's model and results were presented at the conference on "The World after Nuclear War", hosted in Washington, D.C.[67]
Both Golitsyn[69] and Sagan[70] had been interested in the cooling caused by dust storms on the planet Mars in the years preceding their focus on "nuclear winter". Sagan had also worked on Project A119 in the 1950s–1960s, in which he attempted to model the movement and longevity of a plume of lunar soil.
After the publication of "Twilight at Noon" in 1982, the TTAPS team have said that they began the process of doing a 1-dimensional computational modeling study of the atmospheric consequences of nuclear war/soot in the stratosphere, though they would not publish a paper in Science magazine until late-December 1983.[71] The phrase "nuclear winter" had been coined by Turco just prior to publication.[72] In this early paper, TTAPS used assumption-based estimates on the total smoke and dust emissions that would result from a major nuclear exchange, and with that, began analyzing the subsequent effects on the atmospheric radiation balance and temperature structure as a result of this quantity of assumed smoke. To compute dust and smoke effects, they employed a one-dimensional microphysics/radiative-transfer model of the Earth's lower atmosphere (up to the mesopause), which defined only the vertical characteristics of the global climate perturbation.
Interest in the environmental effects of nuclear war, however, had continued in the Soviet Union after Golitsyn's September paper, with Vladimir Alexandrov and G. I. Stenchikov also publishing a paper in December 1983 on the climatic consequences, although in contrast to the contemporary TTAPS paper, this paper was based on simulations with a three-dimensional global circulation model. (Two years later Alexandrov disappeared under mysterious circumstances). Richard Turco and Starley L. Thompson were both critical of the Soviet research. Turco called it "primitive" and Thompson said it used obsolete US computer models. Later they were to rescind these criticisms and instead applauded Alexandrov's pioneering work, saying that the Soviet model shared the weaknesses of all the others.[8]
In 1984, the World Meteorological Organization (WMO) commissioned Golitsyn and N. A. Phillips to review the state of the science. They found that studies generally assumed a scenario in which half of the world's nuclear weapons would be used, ~5,000 Mt, destroying approximately 1,000 cities and creating large quantities of carbonaceous smoke, with best-estimate and range figures that differed between the NAS and TTAPS assumptions. The resulting smoke would be largely opaque to solar radiation but transparent to infrared, thus cooling the Earth by blocking sunlight without creating warming by enhancing the greenhouse effect. The optical depth of the smoke can be much greater than unity. Forest fires resulting from non-urban targets could increase aerosol production further. Dust from near-surface explosions against hardened targets also contributes; each megaton-equivalent explosion could release up to five million tons of dust, but most would quickly fall out, with high-altitude dust estimated at 0.1–1 million tons per megaton-equivalent of explosion. Burning of crude oil could also contribute substantially.[73]
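An optical depth "much greater than unity" implies a nearly opaque smoke pall, because the direct solar beam is attenuated exponentially with optical depth; the Beer-Lambert relation below is added for illustration and assumes an overhead sun:

```latex
% Beer-Lambert attenuation of the direct solar beam through a layer of
% optical depth \tau (overhead sun assumed for simplicity):
I/I_0 = e^{-\tau}
% \tau = 1 transmits ~37% of the direct beam, \tau = 3 about 5%, and
% \tau = 5 well under 1%, which is why optical depths much greater than
% unity imply a nearly opaque smoke pall.
```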
The 1-D radiative-convective models used in these studies produced a range of results, with cooling of 15–42 °C between 14 and 35 days after the war and a "baseline" of about 20 °C. Somewhat more sophisticated calculations using 3-D GCMs produced similar results: temperature drops of about 20 °C, though with regional variations.[74]
All calculations show large heating (up to 80 °C) at the top of the smoke layer; this implies a substantial modification of the circulation there and the possibility of advection of the cloud into low latitudes and the southern hemisphere.
In a 1990 paper entitled "Climate and Smoke: An Appraisal of Nuclear Winter", TTAPS gave a more detailed description of the short- and long-term atmospheric effects of a nuclear war using a three-dimensional model:
First one to three months:
Following one to three years:
See main article: Kuwaiti oil fires.
One of the major results of TTAPS' 1990 paper was the reiteration of the team's 1983 finding that 100 oil refinery fires would be sufficient to bring about a small-scale, but still globally deleterious, nuclear winter.[75]
Following Iraq's invasion of Kuwait and Iraqi threats to ignite the country's approximately 800 oil wells, speculation on the cumulative climatic effect of this, presented at the World Climate Conference in Geneva in November 1990, ranged from a nuclear winter type scenario to heavy acid rain and even short-term immediate global warming.[76]
In articles printed in the Wilmington Morning Star and the Baltimore Sun in January 1991, prominent authors of nuclear winter papers (Richard P. Turco, John W. Birks, Carl Sagan, Alan Robock and Paul Crutzen) collectively stated that they expected catastrophic nuclear-winter-like effects, with continental-scale sub-freezing temperatures, if the Iraqis went through with their threats to ignite 300 to 500 pressurized oil wells, which could subsequently burn for several months.[77] [78]
As threatened, the wells were set on fire by the retreating Iraqis in March 1991, and the 600 or so burning oil wells were not fully extinguished until November 6, 1991, eight months after the end of the war.[79] At their peak intensity, the fires consumed an estimated six million barrels of oil per day.
When Operation Desert Storm began in January 1991, coinciding with the first few oil fires being lit, Dr. S. Fred Singer and Carl Sagan discussed the possible environmental effects of the Kuwaiti petroleum fires on the ABC News program Nightline. Sagan again argued that some of the effects of the smoke could be similar to the effects of a nuclear winter, with smoke lofting into the stratosphere, beginning around 48,000 feet (about 14.6 km) above sea level in Kuwait, resulting in global effects. He also argued that he believed the net effects would be very similar to the explosion of the Indonesian volcano Tambora in 1815, which resulted in the year 1816 being known as the "Year Without a Summer".
Sagan listed modeling outcomes that forecast effects extending to South Asia, and perhaps to the Northern Hemisphere as well, stressing that this outcome was so likely that "It should affect the war plans."[80] Singer, on the other hand, anticipated that the smoke would rise to an altitude of about 3,000 feet (910 m) and then be rained out after about three to five days, limiting its lifetime. Both height estimates turned out to be wrong, although Singer's narrative was closer to what transpired: the comparatively minimal atmospheric effects remained limited to the Persian Gulf region, with smoke plumes in general lofting to about 10,000 feet (3,000 m) and a few as high as 20,000 feet (6,100 m).[81] [82]
Sagan and his colleagues expected that a "self-lofting" of the sooty smoke would occur as it absorbed the sun's heat radiation, with little to no scavenging: the black particles of soot would be heated by the sun and lofted higher and higher into the air, injecting the soot into the stratosphere, where they argued it would take years for the sun-blocking aerosol to fall out of the air, bringing catastrophic ground-level cooling and agricultural effects in Asia and possibly the Northern Hemisphere as a whole.[83] In a 1992 follow-up, Peter Hobbs and others observed no appreciable evidence for the massive "self-lofting" effect the nuclear winter team had predicted, and the oil-fire smoke clouds contained less soot than the nuclear winter modelling team had assumed.[84]
The atmospheric scientist tasked with studying the atmospheric effect of the Kuwaiti fires by the National Science Foundation, Peter Hobbs, stated that the fires' modest impact suggested that "some numbers [used to support the Nuclear Winter hypothesis]... were probably a little overblown."[85]
Hobbs found that at the peak of the fires, the smoke absorbed 75 to 80% of the sun's radiation. The particles rose to a maximum of 20,000 feet (6,100 m), and, combined with scavenging by clouds, the smoke had a short residence time in the atmosphere of at most a few days.[86]
Pre-war claims of wide-scale, long-lasting and significant global environmental effects were thus not borne out, and were found to have been significantly exaggerated by the media and speculators,[87] with climate models by those not supporting the nuclear winter hypothesis at the time of the fires predicting only more localized effects, such as a daytime temperature drop of ~10 °C within 200 km of the source.[88]
Sagan later conceded in his book The Demon-Haunted World that his predictions obviously did not turn out to be correct: "it was pitch black at noon and temperatures dropped 4–6 °C over the Persian Gulf, but not much smoke reached stratospheric altitudes and Asia was spared."[89]
The idea of smoke from oil wells and oil reserves pluming into the stratosphere and serving as a main contributor to the soot of a nuclear winter was a central idea of the early climatology papers on the hypothesis; it was considered a more likely contributor than smoke from cities, as smoke from oil has a higher ratio of black soot and thus absorbs more sunlight.[90] [71] Hobbs compared the papers' assumed "emission factor", or soot-generating efficiency, of ignited oil pools and found, upon comparison with the values measured from oil pools at Kuwait, which were the greatest soot producers, that the soot emissions assumed in the nuclear winter calculations were still "too high".[86] After the results of the Kuwaiti oil fires disagreed with the predictions of the core nuclear-winter-promoting scientists, 1990s nuclear winter papers generally attempted to distance themselves from suggesting that oil well and reserve smoke would reach the stratosphere.
In 2007, a nuclear winter study noted that modern computer models have been applied to the Kuwait oil fires, finding that individual smoke plumes are not able to loft smoke into the stratosphere, but that smoke from fires covering a large area, like some forest fires, can lift smoke into the stratosphere, and recent evidence suggests that this occurs far more often than previously thought.[5] [16] [91] [92] The study also suggested that the burning of comparably smaller cities, as would be expected to follow a nuclear strike, would loft significant amounts of smoke into the stratosphere.
However, the above simulation notably contained the assumption that no dry or wet deposition would occur.
Between 1990 and 2003, commentators noted that no peer-reviewed papers on "nuclear winter" were published.[75]
Based on new work published in 2007 and 2008 by some of the authors of the original studies, several new hypotheses have been put forth, primarily the assessment that as few as 100 firestorms would result in a nuclear winter.[2] However, far from the hypothesis being "new", it drew the same conclusion as earlier 1980s models, which similarly regarded 100 or so city firestorms as a threat.[93]
Compared to climate change for the past millennium, even the smallest exchange modeled would plunge the planet into temperatures colder than the Little Ice Age (the period of history between approximately 1600 and 1850 AD). This would take effect instantly, and agriculture would be severely threatened. Larger amounts of smoke would produce larger climate changes, making agriculture impossible for years. In both cases, new climate model simulations show that the effects would last for more than a decade.
A study published in the Journal of Geophysical Research in July 2007, titled "Nuclear winter revisited with a modern climate model and current nuclear arsenals: Still catastrophic consequences",[14] used current climate models to look at the consequences of a global nuclear war involving most or all of the world's current nuclear arsenals (which the authors judged to be similar in size to the world's arsenals twenty years earlier). The authors used a global circulation model, ModelE from the NASA Goddard Institute for Space Studies, which they noted "has been tested extensively in global warming experiments and to examine the effects of volcanic eruptions on climate". The model was used to investigate the effects of a war involving the entire current global nuclear arsenal, projected to release about 150 Tg of smoke into the atmosphere, as well as a war involving about one third of the current nuclear arsenal, projected to release about 50 Tg of smoke. In the 150 Tg case, they found that the resulting cooling was severe and long-lasting.
In addition, they found that this cooling caused a weakening of the global hydrological cycle, reducing global precipitation by about 45%. As for the 50 Tg case involving one third of current nuclear arsenals, they said that the simulation "produced climate responses very similar to those for the 150 Tg case, but with about half the amplitude," but that "the time scale of response is about the same". They did not discuss the implications for agriculture in depth, but noted that a 1986 study which assumed no food production for a year projected that "most of the people on the planet would run out of food and starve to death by then" and commented that their own results show that, "This period of no food production needs to be extended by many years, making the impacts of nuclear winter even worse than previously thought."
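One way to see why burdens of this size are climatically significant is to convert them into a global-mean soot optical depth. The sketch below assumes a mass absorption cross-section of about 5 m² per gram of black carbon, an order-of-magnitude value adopted here for illustration and not taken from the study itself:

```python
# Convert a global soot burden to a rough global-mean absorption optical
# depth. The mass absorption cross-section (m^2 per gram of black carbon)
# is an assumed order-of-magnitude value, not a number from the study.
EARTH_AREA = 5.1e14  # Earth's surface area, m^2
MAC = 5.0            # assumed mass absorption cross-section, m^2/g

for soot_tg in (150.0, 50.0):
    grams = soot_tg * 1e12            # 1 Tg = 1e12 g
    loading = grams / EARTH_AREA      # global-mean column loading, g/m^2
    tau = loading * MAC               # dimensionless optical depth
    print(f"{soot_tg:>5.0f} Tg -> {loading:.2f} g/m^2, tau ~ {tau:.1f}")
# An optical depth of order 1 means most direct sunlight is absorbed
# before reaching the surface, hence the severe modeled cooling.
```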
In 2014, Michael J. Mills (of the US National Center for Atmospheric Research, NCAR) et al. published "Multi-decadal global cooling and unprecedented ozone loss following a regional nuclear conflict" in the journal Earth's Future.[94] The authors used computational models developed by NCAR to simulate the climatic effects of a soot cloud that they suggest would result from a regional nuclear war in which 100 "small" (15 kt) weapons are detonated over cities. Due to the interaction of the soot cloud with the atmosphere, the model produced the following outputs:
...global ozone losses of 20–50% over populated areas, levels unprecedented in human history, would accompany the coldest average surface temperatures in the last 1000 years. We calculate summer enhancements in UV indices of 30–80% over Mid-Latitudes, suggesting widespread damage to human health, agriculture, and terrestrial and aquatic ecosystems. Killing frosts would reduce growing seasons by 10–40 days per year for 5 years. Surface temperatures would be reduced for more than 25 years, due to thermal inertia and albedo effects in the ocean and expanded sea ice. The combined cooling and enhanced UV would put significant pressures on global food supplies and could trigger a global nuclear famine.
In 2018, researchers at Los Alamos National Laboratory published the results of a multi-scale study of the climate impact of a regional nuclear exchange, the same scenario considered by Robock et al. and by Toon et al. in 2007. Unlike previous studies, this study simulated the processes by which black carbon would be lofted into the atmosphere and found that very little would be lofted into the stratosphere; as a result, the long-term climate impacts were much lower than those studies had concluded. In particular, "none of the simulations produced a nuclear winter effect", and "the probability of significant global cooling from a limited exchange scenario as envisioned in previous studies is highly unlikely".[95] This study has been contradicted by results in several subsequent studies claiming the 2018 study to be flawed.[96] [97] [98] [99]
Research published in the peer-reviewed journal Safety suggested that no nation should possess more than 100 nuclear warheads, because of the blowback effect of a "nuclear autumn" on the aggressor nation's own population.[100] [101]
2019 saw the publication of two studies on nuclear winter that build on previous modeling and describe new scenarios of nuclear winter from smaller exchanges of nuclear weapons than have been previously simulated.
As in the 2007 study by Robock et al.,[14] a 2019 study by Coupe et al. models a scenario in which 150 Tg of black carbon is released into the atmosphere following an exchange of nuclear weapons between the United States and Russia in which both countries expend all of the nuclear weapons that treaties permit them to hold.[102] This amount of black carbon far exceeds that emitted into the atmosphere by all volcanic eruptions in the past 1,200 years, but is less than that of the asteroid impact that caused a mass extinction event 66 million years ago. Coupe et al. used the Whole Atmosphere Community Climate Model version 4 (WACCM4), which has a higher resolution and is more effective at simulating aerosols and stratospheric chemistry than the ModelE simulation used by Robock et al.
The WACCM4 model simulates black carbon particles growing to ten times their normal size when they reach the stratosphere, an effect ModelE did not account for. This difference in particle size results in a greater optical depth in the WACCM4 model across the world for the first two years after the initial injection, due to greater absorption of sunlight in the stratosphere. This has the effect of increasing stratospheric temperatures by 100 K and results in ozone depletion slightly greater than ModelE predicted. Another consequence of the larger particle size is an accelerated rate at which the black carbon particles fall out of the atmosphere: ten years after the injection of black carbon into the atmosphere, WACCM4 predicts 2 Tg would remain, while ModelE predicted 19 Tg.
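If the decay of the stratospheric burden is idealized as a single exponential, an assumption made here purely for illustration rather than anything stated in either paper, the two models' ten-year figures imply quite different e-folding residence times:

```python
# Implied e-folding residence times if the decay of the 150 Tg injection to
# the stated ten-year burdens were a single exponential (an illustrative
# simplification; real removal is not exponential throughout).
import math

initial = 150.0  # Tg injected
years = 10.0
for model, remaining in (("WACCM4", 2.0), ("ModelE", 19.0)):
    tau = years / math.log(initial / remaining)
    print(f"{model}: ~{tau:.1f} yr e-folding time")
# WACCM4's larger, faster-falling particles imply roughly a 2.3-year
# e-folding time versus roughly 4.8 years for ModelE.
```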
The 2019 model and the 2007 model both predict significant temperature decreases across the globe; however, the increased resolution and particle simulation of the 2019 model predict a greater temperature anomaly in the first six years after injection but a faster return to normal temperatures. From a few months after the injection until the sixth year of the anomaly, WACCM4 predicts cooler global temperatures than ModelE, with temperatures more than 20 K below normal leading to freezing summer temperatures over much of the northern hemisphere and a 90% reduction in the agricultural growing season in the midlatitudes, including the midwestern United States. WACCM4 simulations also predict a 58% reduction in global annual precipitation from normal levels in years three and four after injection, a 10% greater reduction than predicted by ModelE.
Toon et al. simulated a nuclear scenario in 2025 in which India and Pakistan engage in a nuclear exchange in which 100 urban areas in Pakistan and 150 urban areas in India are attacked with nuclear weapons ranging from 15 kt to 100 kt, and examined the effects of the black carbon released into the atmosphere from airburst-only detonations.[4] The researchers modeled the atmospheric effects if all weapons were 15 kt, 50 kt, or 100 kt, providing a range into which a nuclear exchange would likely fall, given the recent nuclear tests performed by both nations. The ranges provided are large because neither India nor Pakistan is obligated to provide information on its nuclear arsenal, so the arsenals' extent remains largely unknown.
Toon et al. assume that either a firestorm or a conflagration will occur after each detonation, and that the amount of black carbon inserted into the atmosphere by the two outcomes will be equivalent and substantial; in Hiroshima in 1945, it is estimated that the firestorm released 1,000 times more energy than was released by the nuclear explosion itself. Such large burned areas would release large amounts of black carbon into the atmosphere, ranging from 16.1 Tg if all weapons were 15 kt or less to 36.6 Tg if all were 100 kt. For the 15 kt and 100 kt ranges of weapons, the researchers modeled global precipitation reductions of 15% to 30%, temperature reductions of 4 K to 8 K, and ocean temperature decreases of 1 K to 3 K. If all weapons used were 50 kt or more, Hadley cell circulation would be disrupted, causing a 50% decrease in precipitation in the American midwest. Net primary productivity (NPP) for oceans decreases by 10% to 20% for the 15 kt and 100 kt scenarios, respectively, while land NPP decreases by between 15% and 30%; particularly affected are midlatitude agricultural regions in the United States and Europe, which experience 25–50% reductions in NPP. As predicted by other literature, once the black carbon is removed from the atmosphere after ten years, temperatures and NPP would return to normal.
Coupe et al. report the simulation of an El Niño-like effect lasting several years after each of six nuclear scenarios, ranging from 5 to 150 Tg of soot, under the CESM-WACCM4 model. They term the change a "Nuclear Niño" and describe various changes in ocean currents.[103]
According to a peer-reviewed study published in the journal Nature Food in August 2022,[11] a full-scale nuclear war between the United States and Russia, which together hold more than 90% of the world's nuclear weapons, would kill 360 million people directly and more than 5 billion indirectly by starvation during a nuclear winter.[104] [105]
Another paper published that year, by the Tohoku University Earth science scholar Kunio Kaiho, compared the impact of nuclear winter scenarios on marine and terrestrial animal life with that of historical extinction events. Kaiho estimated that a minor nuclear war (which he defined as a nuclear exchange between India and Pakistan, or an event of equivalent magnitude) would on its own cause the extinction of 10–20% of species, while a major nuclear war (defined as a nuclear exchange between the United States and Russia) would cause the extinction of 40–50% of animal species, which is comparable to some of the "Big Five" mass extinction events. For comparison, what he considered the most likely scenario of anthropogenic climate change, with 3 °C of warming by 2100 and 3.8 °C by 2500, would drive around 12–14% of animal species extinct under the same methodology.[106]
In 2023, the U.S. National Academies of Sciences, Engineering, and Medicine established an Independent Study on Potential Environmental Effects of Nuclear War. Its aim is to evaluate all research on nuclear winter, with the final report to be issued in 2024.[107]
The nuclear winter concept has received, and continues to receive, criticism over five major and largely independent underpinnings.[95] [108]
While the highly popularized initial 1983 TTAPS 1-dimensional model forecasts were widely reported and criticized in the media, in part because every later model predicts far less than its "apocalyptic" level of cooling,[109] most models continue to suggest that some deleterious global cooling would still result, under the assumption that a large number of fires occurred in the spring or summer.[75] [110] Starley L. Thompson's less primitive mid-1980s 3-dimensional model, which notably contained the very same general assumptions, led him to coin the term "nuclear autumn" to more accurately describe the climate results of the soot in that model; in an on-camera interview he dismissed the earlier "apocalyptic" models.[111]
A major criticism of the assumptions that continue to make these model results possible appeared in the 1987 book Nuclear War Survival Skills (NWSS), a civil defense manual by Cresson Kearny for the Oak Ridge National Laboratory.[112] According to the 1988 publication An assessment of global atmospheric effects of a major nuclear war, Kearny's criticisms were directed at the excessive amount of soot that the modelers assumed would reach the stratosphere. Kearny cited a Soviet study finding that modern cities would not burn as firestorms, as most flammable city items would be buried under non-combustible rubble, and argued that the TTAPS study included a massive overestimate of the size and extent of the non-urban wildfires that would result from a nuclear war. The TTAPS authors responded that, among other things, they did not believe target planners would intentionally blast cities into rubble, but instead argued that fires would begin in relatively undamaged suburbs when nearby sites were hit, and they partially conceded his point about non-urban wildfires. Dr. Richard D. Small, director of thermal sciences at the Pacific-Sierra Research Corporation, similarly disagreed strongly with the model assumptions, in particular the 1990 update by TTAPS arguing that some 5,075 Tg of material would burn in a total US-Soviet nuclear war; Small's analysis of blueprints and real buildings returned a maximum of 1,475 Tg of material that could burn, "assuming that all the available combustible material was actually ignited".[108]
Although Kearny was of the opinion that future, more accurate models would "indicate there will be even smaller reductions in temperature", including potential models that did not so readily accept that firestorms would occur as dependably as nuclear winter modellers assume, in NWSS he summarized the comparatively moderate estimate of cooling lasting no more than a few days, from the 1986 Nuclear Winter Reappraised model by Starley Thompson and Stephen Schneider.[113] He did so in an effort to convey to his readers that, contrary to popular opinion at the time, these two climate scientists had concluded that "on scientific grounds the global apocalyptic conclusions of the initial nuclear winter hypothesis can now be relegated to a vanishing low level of probability".
However, a 1988 article by Brian Martin in Science and Public Policy states that although Nuclear Winter Reappraised concluded the US-Soviet "nuclear winter" would be much less severe than originally thought, with the authors describing the effects more as a "nuclear autumn", other statements by Thompson and Schneider[114] [115] show that they "resisted the interpretation that this means a rejection of the basic points made about nuclear winter". In their 2007 paper, Alan Robock et al. write: "because of the use of the term 'nuclear autumn' by Thompson and Schneider [1986], even though the authors made clear that the climatic consequences would be large, in policy circles the theory of nuclear winter is considered by some to have been exaggerated and disproved [e.g., Martin, 1988]."[14] In 2007 Schneider expressed his tentative support for the cooling results of the limited nuclear war (Pakistan and India) analyzed in the 2006 model, saying, "The sun is much stronger in the tropics than it is in mid-latitudes. Therefore, a much more limited war [there] could have a much larger effect, because you are putting the smoke in the worst possible place", and "anything that you can do to discourage people from thinking that there is any way to win anything with a nuclear exchange is a good idea".[116]
Smoke from the ignition of live non-desert vegetation (living forests, grasses, and so on) near many missile silos was a source originally assumed to be very large in the initial "Twilight at Noon" paper, an assumption also found in the popular TTAPS publication. However, when Bush and Small examined this assumption in 1987, they found that the burning of live vegetation could conceivably contribute only very slightly to the estimated total "nonurban smoke production": vegetation would probably sustain burning only within a fireball radius or two of the detonation point, a distance that would also experience extreme blast winds capable of disrupting any such fires.[117] This reduction in the estimate of the non-urban smoke hazard is supported by the earlier preliminary Estimating Nuclear Forest Fires publication of 1984, and by the 1950s–1960s in-field examination of surface-scorched, mangled, but never burnt-down tropical forests on the islands surrounding the shot points in the Operation Castle[118] and Operation Redwing[119] [120] test series.
A United States Department of Homeland Security paper, finalized in 2010, states that after a nuclear detonation targeting a city, "If fires are able to grow and coalesce, a firestorm could develop that would be beyond the abilities of firefighters to control. However, experts suggest that the nature of modern US city design and construction may make a raging firestorm unlikely".[121] The nuclear bombing of Nagasaki, for example, did not produce a firestorm.[122] This was similarly noted as early as 1986–1988, when the assumed quantity of fuel "mass loading" (the amount of fuel per square meter) in cities underpinning the winter models was found to be too high, deliberately producing heat fluxes capable of lofting smoke into the lower stratosphere; assessments "more characteristic of conditions" found in real-world modern cities indicated that the fuel loading, and hence the heat flux that would result from efficient burning, would rarely loft smoke much higher than 4 km.
Russell Seitz, Associate of the Harvard University Center for International Affairs, argues that the winter models' assumptions give the results the researchers want to achieve, a case of "worst-case analysis run amok".[110] In September 1986, Seitz published "Siberian fire as 'nuclear winter' guide" in the journal Nature, in which he investigated the 1915 Siberian fire, which started in the early summer months and was caused by the worst drought in the region's recorded history. The fire ultimately devastated the region, burning an area of the world's largest boreal forest the size of Germany. While approximately 8 °C of daytime summer cooling occurred under the smoke clouds during the weeks of burning, no increase in potentially devastating agricultural night frosts occurred.[123] Following his investigation into the Siberian fire of 1915, Seitz criticized the "nuclear winter" model results for being based on successive worst-case events.
Seitz cited Carl Sagan, adding an emphasis: "In almost any realistic case involving nuclear exchanges between the superpowers, global environmental changes sufficient to cause an extinction event equal to or more severe than that of the close of the Cretaceous when the dinosaurs and many other species died out are likely." Seitz comments: "The ominous rhetoric italicized in this passage puts even the 100 megaton [the original 100 city firestorm] scenario ... on a par with the 100 million megaton blast of an asteroid striking the Earth. This [is] astronomical mega-hype ..."[109]
Seitz's opposition prompted the proponents of nuclear winter to issue responses in the media. The proponents believed it was necessary to show only the possibility of climatic catastrophe, often a worst-case scenario, while opponents insisted that, to be taken seriously, nuclear winter should be shown to be likely under "reasonable" scenarios. One of these areas of contention, as elucidated by Lynn R. Anspaugh, is the question of which season should be used as the backdrop for the US-USSR war models. Most models choose summer in the Northern Hemisphere as the start point to produce the maximum soot lofting and therefore the eventual winter effect. However, it has been pointed out that if the same number of firestorms occurred in the autumn or winter months, when there is much less intense sunlight to loft soot into a stable region of the stratosphere, the magnitude of the cooling effect would be negligible, according to a January model run by Covey et al.[124] Schneider conceded the issue in 1990, saying "a war in late fall or winter would have no appreciable [cooling] effect".[108]
Anspaugh also expressed frustration over a managed forest fire in Canada on 3 August 1985 that is said to have been lit by proponents of nuclear winter: although the fire could have served as an opportunity to take basic measurements of the optical properties of the smoke and the smoke-to-fuel ratio, which would have helped refine these critical model inputs, the proponents did not indicate that any such measurements were made. Peter V. Hobbs, who would later successfully obtain funding to fly into and sample the smoke clouds from the Kuwait oil fires in 1991, also expressed frustration that he was denied funding to sample the Canadian and other forest fires in this way. Turco wrote a 10-page memorandum with information derived from his notes and some satellite images, claiming that the smoke plume reached 6 km in altitude.
In 1986, atmospheric scientist Joyce Penner of the Lawrence Livermore National Laboratory published an article in Nature focusing on two specific variables: the smoke's optical properties and the quantity of smoke remaining airborne after the city fires. She found that the published estimates of these variables varied so widely that, depending on which estimates were chosen, the climate effect could be negligible, minor, or massive.[125] The optical properties assumed for black carbon in more recent nuclear winter papers in 2006 are still "based on those assumed in earlier nuclear winter simulations".[14]
John Maddox, editor of the journal Nature, issued a series of skeptical comments about nuclear winter studies during his tenure.[126] [127] Similarly, S. Fred Singer was a long-term vocal critic of the hypothesis in the journal and in televised debates with Carl Sagan.[128] [129] [8]
In a 2011 response to the more modern papers on the hypothesis, Russell Seitz published a comment in Nature challenging Alan Robock's claim that there has been no real scientific debate about the "nuclear winter" concept.[130] In 1986, Seitz had also contended that many others were reluctant to speak out for fear of being stigmatized as "closet Dr. Strangeloves"; physicist Freeman Dyson of Princeton, for example, stated, "It's an absolutely atrocious piece of science, but I quite despair of setting the public record straight."[109] According to the Rocky Mountain News, Stephen Schneider was called a fascist by some disarmament supporters for having written his 1986 article "Nuclear Winter Reappraised". MIT meteorologist Kerry Emanuel similarly wrote in a review in Nature that the winter concept is "notorious for its lack of scientific integrity", owing to the unrealistic estimates selected for the quantity of fuel likely to burn and the imprecise global circulation models used, and he ends by stating that the evidence of other models points to substantial scavenging of the smoke by rain.[131] Emanuel also questioned proponents' objectivity, given the strong emotional or political views they held.
William R. Cotton, Professor of Atmospheric Science at Colorado State University, specialist in cloud physics modeling and co-creator of the highly influential[132] [133] and previously mentioned RAMS atmosphere model, had in the 1980s worked on soot rain-out models and supported the predictions made by his own and other nuclear winter models. However, according to a book he co-authored in 2007, he has since reversed this position, stating that, amongst other systematically examined assumptions, far more rain-out/wet deposition of soot would occur than is assumed in modern papers on the subject: "We must wait for a new generation of GCMs to be implemented to examine potential consequences quantitatively". He also states that, in his view, "nuclear winter was largely politically motivated from the beginning".[1] [26]
During the Cuban Missile Crisis, Fidel Castro and Che Guevara called on the USSR to launch a nuclear first strike against the US in the event of a US invasion of Cuba. In the 1980s, Castro was pressuring the Kremlin to adopt a harder line against the US under President Ronald Reagan, even arguing for the potential use of nuclear weapons. As a direct result, a Soviet official was dispatched to Cuba in 1985 with an entourage of "experts", who detailed the ecological effects on Cuba of nuclear strikes on the United States. Soon after, the Soviet official recounts, Castro lost his prior "nuclear fever".[134] [135] In 2010, Alan Robock was summoned to Cuba to help Castro promote his new view that nuclear war would bring about Armageddon. Robock's 90-minute lecture was later aired on the country's nationwide state-controlled television station.[136] [137]
However, according to Robock, insofar as getting US government attention and affecting nuclear policy is concerned, he has failed. In 2009, together with Owen Toon, he gave a talk to the United States Congress, but nothing came of it, and the then-presidential science adviser, John Holdren, did not respond to their requests in 2009 or at the time of writing in 2011.[137]
In a 2012 "Bulletin of the Atomic Scientists" feature, Robock and Toon, who had routinely mixed their disarmament advocacy into the conclusions of their "nuclear winter" papers,[14] argue in the political realm that the hypothetical effects of nuclear winter necessitates that the doctrine they assume is active in Russia and US, "mutually assured destruction" (MAD), should instead be replaced with their own "self-assured destruction" (SAD) concept, because, regardless of whose cities burned, the effects of the resultant nuclear winter that they advocate would be, in their view, catastrophic. In a similar vein, in 1989 Carl Sagan and Richard Turco wrote a policy implications paper that appeared in Ambio that suggested that as nuclear winter is a "well-established prospect", both superpowers should jointly reduce their nuclear arsenals to "Canonical Deterrent Force" levels of 100–300 individual warheads each, such that in "the event of nuclear war [this] would minimize the likelihood of [extreme] nuclear winter."[138]
An originally classified 1984 US interagency intelligence assessment states that in both the 1970s and 1980s, the Soviet and US militaries were already following the "existing trends" in warhead miniaturization, towards higher-accuracy, lower-yield nuclear warheads. This is seen in the most numerous physics packages in the US arsenal: in the 1960s these were the B28 and W31, but both quickly became less prominent with the 1970s mass-production runs of the 50 kt W68 and the 100 kt W76, and in the 1980s with the B61.[139] This trend towards miniaturization, enabled by advances in inertial guidance and accurate GPS navigation, was motivated by a multitude of factors: the desire to leverage the physics of equivalent megatonnage that miniaturization offered, to free up space to fit more MIRV warheads and decoys on each missile, and to still destroy hardened targets while reducing the severity of the fallout collateral damage deposited on neighboring, and potentially friendly, countries.

As it relates to the likelihood of nuclear winter, miniaturization had already reduced the range over which thermal radiation could ignite fires. For example, the most popular nuclear winter paper, the 1983 TTAPS paper, described a 3,000 Mt counterforce attack on ICBM sites, with each individual warhead carrying approximately one Mt of energy; not long after publication, however, Michael Altfeld of Michigan State University and political scientist Stephen Cimbala of Pennsylvania State University argued that the smaller, more accurate warheads then already developed and deployed (e.g., the W76), together with lower detonation heights, could produce the same counterforce strike with a total of only 3 Mt of energy expended. They continued that, if the nuclear winter models prove representative of reality, far less climatic cooling would occur even if firestorm-prone areas existed on the target list, as lower fuzing heights such as surface bursts would also limit the range of the burning thermal rays through terrain masking and the shadows cast by buildings, while temporarily lofting far more localized fallout than airburst fuzing, the standard mode of employment against un-hardened targets. This logic is similarly reflected in the 1984 interagency intelligence assessment, which suggests that targeting planners would simply have to consider target combustibility along with yield, height of burst, timing and other factors to reduce the amount of smoke and so safeguard against the potentiality of a nuclear winter. The trade-off of limiting the target fire hazard by fuzing for surface and sub-surface bursts is that the comparatively dilute global fallout created by air bursts would be replaced by far more concentrated, and therefore deadlier, local fallout.[140]
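To make the equivalent-megatonnage logic concrete, the following is a minimal illustrative sketch using the counter-military-potential ("lethality") rule of thumb from Cold War-era targeting analysis, K = Y^(2/3)/CEP^2, with yield Y in megatons and circular error probable (CEP) in nautical miles; the specific yield and CEP pairings are hypothetical round numbers chosen only to show the scaling behind Altfeld and Cimbala's argument, not figures taken from their papers.

```python
# Illustrative only: counter-military-potential ("lethality") rule of thumb,
#   K = Y**(2/3) / CEP**2
# with yield Y in megatons and circular error probable (CEP) in nautical
# miles. The yield/CEP pairings below are hypothetical round numbers.

def lethality(yield_mt: float, cep_nmi: float) -> float:
    """Counter-military potential K = Y^(2/3) / CEP^2."""
    return yield_mt ** (2 / 3) / cep_nmi ** 2

k_big = lethality(1.0, 0.3)    # ~11: one 1 Mt warhead, 0.3 nmi CEP
k_small = lethality(0.1, 0.1)  # ~22: one 100 kt warhead, 0.1 nmi CEP

print(f"1 Mt at 0.3 nmi CEP:   K = {k_big:.1f}")
print(f"0.1 Mt at 0.1 nmi CEP: K = {k_small:.1f}")
# Tripling the accuracy roughly doubles the lethality against a hardened
# point target while expending one tenth of the yield: because K grows as
# the inverse square of CEP but only as the 2/3 power of yield, accuracy,
# not yield, does the work in a counterforce strike.
```

Since K scales with the inverse square of CEP, each improvement in accuracy allows a disproportionately large reduction in yield, which is why steadily improving guidance pushed both arsenals towards smaller warheads with shorter thermal-ignition ranges.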
Altfeld and Cimbala also argued that belief in the possibility of nuclear winter would actually make nuclear war more likely, contrary to the views of Sagan and others, because it would serve as yet further motivation to follow the existing trends towards more accurate, and even lower-yield, nuclear weapons. The winter hypothesis suggests that replacing the multi-megaton strategic nuclear weapons of the Cold War with weapons of explosive yields closer to those of tactical nuclear weapons, such as the Robust Nuclear Earth Penetrator (RNEP), would safeguard against the nuclear winter potential; the capabilities of the then largely still conceptual RNEP were specifically cited by the influential nuclear warfare analyst Albert Wohlstetter. Tactical nuclear weapons at the low end of the scale have yields that overlap with large conventional weapons and are therefore often viewed "as blurring the distinction between conventional and nuclear weapons", making the prospect of using them "easier" in a conflict.[141] [142]
In an interview in 2000 with Mikhail Gorbachev (the leader of the Soviet Union from 1985 to 1991), the following statement was put to him: "In the 1980s, you warned about the unprecedented dangers of nuclear weapons and took very daring steps to reverse the arms race". Gorbachev replied, "Models made by Russian and American scientists showed that a nuclear war would result in a nuclear winter that would be extremely destructive to all life on Earth; the knowledge of that was a great stimulus to us, to people of honor and morality, to act in that situation."[143]
However, a 1984 US Interagency Intelligence Assessment takes a far more skeptical and cautious approach, stating that the hypothesis is not scientifically convincing. The report predicted that Soviet nuclear policy would be to maintain their strategic nuclear posture, such as the fielding of the high-throw-weight SS-18 missile, and that the Soviets would merely attempt to exploit the hypothesis for propaganda purposes, such as directing scrutiny onto the US portion of the nuclear arms race. It goes on to express the belief that if Soviet officials did begin to take nuclear winter seriously, they would probably demand exceptionally high standards of scientific proof for the hypothesis, as its implications would undermine their military doctrine – a level of scientific proof which perhaps could not be met without field experimentation. The un-redacted portion of the document ends with the suggestion that substantial increases in Soviet civil defense food stockpiles might be an early indicator that nuclear winter was beginning to influence Soviet upper-echelon thinking.
In 1985, Time magazine noted "the suspicions of some Western scientists that the nuclear winter hypothesis was promoted by Moscow to give anti-nuclear groups in the U.S. and Europe some fresh ammunition against America's arms buildup."[144] That year, the United States Senate met to discuss the science and politics of nuclear winter. During the congressional hearing, the influential analyst Leon Gouré presented evidence that the Soviets may have simply echoed Western reports rather than producing unique findings of their own. Gouré hypothesized that Soviet research on and discussion of nuclear war might serve only Soviet political agendas, rather than reflecting the actual opinions of the Soviet leadership.[145]
In 1986, the Defense Nuclear Agency document An update of Soviet research on and exploitation of Nuclear winter 1984–1986 charted the minimal public-domain Soviet research contribution to, and Soviet propaganda usage of, the nuclear winter phenomenon.
There is some doubt as to when the Soviet Union began modelling fires and the atmospheric effects of nuclear war. Former Soviet intelligence officer Sergei Tretyakov claimed that, under the direction of Yuri Andropov, the KGB invented the concept of "nuclear winter" in order to stop the deployment of NATO Pershing II missiles. The KGB is said to have distributed to peace groups, the environmental movement and the journal Ambio disinformation based on a faked "doomsday report" by the Soviet Academy of Sciences, written by Georgii Golitsyn, Nikita Moiseyev and Vladimir Alexandrov, concerning the climatic effects of nuclear war.[146] Although it is accepted that the Soviet Union exploited the nuclear winter hypothesis for propaganda purposes, Tretyakov's specific claim that the KGB funnelled disinformation to Ambio, the journal in which Paul Crutzen and John Birks published the 1982 paper "Twilight at Noon", has not been corroborated. In an interview in 2009 conducted by the National Security Archive, Vitalii Nikolaevich Tsygichko (a senior analyst at the Soviet Academy of Sciences and a military mathematical modeler) stated that Soviet military analysts were discussing the idea of "nuclear winter" years before U.S. scientists, although they did not use that exact term.[147]
See also: Conflict Resolution. A number of solutions have been proposed to mitigate the potential harm of a nuclear winter if one appears inevitable. The problem has been attacked at both ends: some solutions focus on preventing the growth of fires, and therefore limiting the amount of smoke that reaches the stratosphere in the first place, while others focus on food production with reduced sunlight, under the assumption that the very worst-case results of the nuclear winter models prove accurate and no other mitigation strategies are fielded.
A 1967 report considered techniques including various methods of applying liquid nitrogen, dry ice, and water to nuclear-caused fires.[148] The report also considered attempting to stop the spread of fires by creating firebreaks, blasting combustible material out of an area (possibly even using nuclear weapons), and the use of preventative hazard-reduction burns. According to the report, one of the most promising techniques investigated was the initiation of rain by seeding the mass-fire thunderheads and other clouds passing over the developing, and then stable, firestorm.
In the book Feeding Everyone No Matter What, under the worst-case scenario predictions of nuclear winter, the authors present various unconventional food possibilities. These include natural-gas-digesting bacteria, the best known being Methylococcus capsulatus, which is presently used as a feed in fish farming;[149] bark bread, a long-standing famine food made from the edible inner bark of trees and part of Scandinavian history during the Little Ice Age; increased fungiculture of mushrooms such as the honey fungi that grow directly on moist wood without sunlight;[150] and variations of wood or cellulosic biofuel production, which already creates edible sugars/xylitol from inedible cellulose as an intermediate product before the final step of alcohol generation.[151] [152] One of the book's authors, mechanical engineer David Denkenberger, states that mushrooms could theoretically feed everyone for three years. Seaweed, like mushrooms, can also grow in low-light conditions. Dandelions and tree needles could provide vitamin C, and bacteria could provide vitamin E. More conventional cold-weather crops such as potatoes might get sufficient sunlight at the equator to remain feasible.[153]
To feed portions of civilization through a nuclear winter, large stockpiles of food would have to be accumulated prior to the event. Such stockpiles should be placed underground, at higher elevations and near the equator, to mitigate high-altitude UV exposure and radioactive isotopes, and near the populations most likely to survive the initial catastrophe. One consideration is who would sponsor the stockpiling: "There may be a mismatch between those most able to sponsor the stockpiles (i.e., the pre-catastrophe wealthy) and those most able to use the stockpiles (the pre-catastrophe rural poor)."[154] The minimum annual global wheat storage is approximately 2 months.[155]
See main article: Climate engineering. Despite the name "nuclear winter", nuclear events are not necessary to produce the modeled climatic effect. In the search for a quick and cheap means of offsetting, through solar radiation management (a form of climate engineering), the projected surface warming of at least 2 °C from a doubling of atmospheric CO2 levels, the underlying nuclear winter effect has been examined as perhaps holding potential. Besides the more common suggestion of injecting sulfur compounds into the stratosphere to approximate the effects of a volcanic winter, the injection of other chemical species, such as a particular type of soot particle to create minor "nuclear winter" conditions, has been proposed by Paul Crutzen and others.[156] [157] According to the threshold "nuclear winter" computer models,[2] [10] if one to five teragrams of firestorm-generated soot[24] were injected into the low stratosphere, the anti-greenhouse effect would heat the stratosphere but cool the lower troposphere, producing 1.25 °C of cooling for two to three years; after 10 years, average global temperatures would still be 0.5 °C lower than before the soot injection.[10]
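As a rough consistency check on the two cooling figures quoted above, the decay timescale they imply can be backed out by assuming a simple exponential relaxation from the peak; this is a toy calculation on the quoted numbers only, not a reproduction of the published models, which do not assume so simple a decay.

```python
import math

# Toy exponential fit to the quoted model outputs: ~1.25 degC of cooling
# near the peak and ~0.5 degC remaining after 10 years. Assumes
#   dT(t) = dT0 * exp(-t / tau)
# purely for illustration; the published models are far more complex.
dT0 = 1.25       # peak cooling, degC
dT_10yr = 0.5    # residual cooling after 10 years, degC
tau = 10.0 / math.log(dT0 / dT_10yr)  # implied e-folding time, years
print(f"implied e-folding time: {tau:.1f} years")  # ~10.9 years
```

This roughly decade-long persistence is what distinguishes soot lofted into the stratosphere from tropospheric smoke, which is scavenged by rain within days to weeks.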
See also: Tunguska event. Climatic effects similar to those of "nuclear winter" followed historical supervolcano eruptions, which plumed sulfate aerosols high into the stratosphere, a phenomenon known as a volcanic winter.[158] The effects of smoke in the atmosphere (short-wave absorption) are sometimes termed an "anti-greenhouse" effect, and a strong analog is the hazy atmosphere of Titan. Pollack, Toon and others were involved in developing models of Titan's climate in the late 1980s, at the same time as their early nuclear winter studies.[159]
Similarly, extinction-level comet and asteroid impacts are also believed to have generated impact winters by the pulverization of massive amounts of fine rock dust. This pulverized rock can also produce "volcanic winter" effects, if sulfate-bearing rock is hit in the impact and lofted high into the air,[160] and "nuclear winter" effects, with the heat of the heavier rock ejecta igniting regional and possibly even global forest firestorms.[161] [162]
This global "impact firestorms" hypothesis, initially supported by Wendy Wolbach, H. Jay Melosh and Owen Toon, suggests that as a result of massive impact events, the small sand-grain-sized ejecta fragments created can meteorically re-enter the atmosphere forming a hot blanket of global debris high in the air, potentially turning the entire sky red-hot for minutes to hours, and with that, burning the complete global inventory of above-ground carbonaceous material, including rain forests.[163] [164] This hypothesis is suggested as a means to explain the severity of the Cretaceous–Paleogene extinction event, as the earth impact of an asteroid about 10 km wide which precipitated the extinction is not regarded as sufficiently energetic to have caused the level of extinction from the initial impact's energy release alone.
The global firestorm winter has, however, been questioned in more recent years (2003–2013) by Claire Belcher,[165] [166] Tamara Goldin[167] [168] [169] and Melosh, who had initially supported the hypothesis,[170] [171] a re-evaluation Belcher has dubbed the "Cretaceous-Palaeogene firestorm debate". The issues raised in the debate are: the perceived low quantity of soot in the sediment beside the fine-grained, iridium-rich asteroid dust layer; whether the quantity of re-entering ejecta truly blanketed the atmosphere globally; the duration and profile of the re-entry heating, whether a brief high thermal pulse or the more prolonged, and therefore more incendiary, "oven" heating;[170] and finally, how much the "self-shielding effect" of the first wave of now-cooled meteors in dark flight diminished the total heat experienced on the ground from later waves of meteors.[163]
In part because the Cretaceous was a high-atmospheric-oxygen era, with concentrations above those of the present day, Owen Toon et al. in 2013 were critical of these re-evaluations of the hypothesis.
It is difficult to ascertain what percentage of the soot in this period's geological sediment record came from the living plants versus the fossil fuels present at the time,[172] in much the same way that the fraction of material ignited directly by the meteor impact is difficult to determine.