The history of the scientific discovery of climate change began in the early 19th century when ice ages and other natural changes in paleoclimate were first suspected and the natural greenhouse effect was first identified. In the late 19th century, scientists first argued that human emissions of greenhouse gases could change Earth's energy balance and climate. The existence of the greenhouse effect, while not named as such, was proposed as early as 1824 by Joseph Fourier.[1] The argument and the evidence were further strengthened by Claude Pouillet in 1827 and 1838. In 1856 Eunice Newton Foote demonstrated that the warming effect of the sun is greater for air with water vapor than for dry air, and the effect is even greater with carbon dioxide.[2] [3]
John Tyndall was the first to measure the infrared absorption and emission of various gases and vapors. From 1859 onwards, he showed that the effect was due to a very small proportion of the atmosphere, with the main gases having no effect, and was largely due to water vapor, though small percentages of hydrocarbons and carbon dioxide had a significant effect.[4] The effect was more fully quantified by Svante Arrhenius in 1896, who made the first quantitative prediction of global warming due to a hypothetical doubling of atmospheric carbon dioxide.
In the 1960s, the evidence for the warming effect of carbon dioxide gas became increasingly convincing. Scientists also discovered that human activities that generated atmospheric aerosols (e.g., "air pollution") could have cooling effects as well (later referred to as global dimming). Other theories for the causes of global warming were also proposed, involving forces from volcanism to solar variation. During the 1970s, scientific understanding of global warming greatly increased.
By the 1990s, as a result of the improving accuracy of computer models and of observational work confirming the Milankovitch theory of the ice ages, a consensus position formed: greenhouse gases were deeply involved in most climate changes, and human-caused emissions were bringing discernible global warming.
Since the 1990s, scientific research on climate change has expanded across multiple disciplines, deepening understanding of causal relations, links with historical data, and the ability to measure and model climate change. Research during this period has been summarized in the Assessment Reports of the Intergovernmental Panel on Climate Change, the first of which appeared in 1990.
See also: Holocene. From ancient times, people suspected that the climate of a region could change over the course of centuries. For example, Theophrastus, a pupil of the Ancient Greek philosopher Aristotle in the 4th century BC, told how the draining of marshes had made a particular locality more susceptible to freezing, and speculated that lands became warmer when the clearing of forests exposed them to sunlight. In the 1st century BC, the Roman writer and architect Vitruvius wrote about climate in relation to housing architecture and how to choose locations for cities.[5] [6] Renaissance European and later scholars saw that deforestation, irrigation, and grazing had altered the lands around the Mediterranean since ancient times; they thought it plausible that these human interventions had affected the local weather.[7] [8] In a book published in 1088, the Northern Song dynasty Chinese scholar and statesman Shen Kuo argued for gradual climate change over centuries, after ancient petrified bamboos were found preserved underground in the dry, arid northern region of Yanzhou (modern-day Yan'an, Shaanxi province), far from the warmer, wetter areas of China where bamboos typically grow.[9] [10]
The 18th and 19th-century conversion of Eastern North America from forest to croplands brought obvious change within a human lifetime. From the early 19th century, many believed the transformation was altering the region's climate—probably for the better. When farmers in America, dubbed "sodbusters", took over the Great Plains, they held that "rain follows the plow".[11] [12] Other experts disagreed, and some argued that deforestation caused rapid rainwater run-off and flooding, and could even result in reduced rainfall. European academics, suggesting that the temperate zones inhabited by the "Caucasian race" were naturally superior for the spread of civilization, proffered that the Orientals of the Ancient Near East had heedlessly converted their once lush lands into impoverished deserts.[13]
Meanwhile, national weather agencies had begun to compile masses of reliable observations of temperature, rainfall, and the like. When these figures were analyzed, they showed many rises and dips, but no steady long-term change. By the end of the 19th century, scientific opinion had turned decisively against any belief in a human influence on climate. And whatever the regional effects, few imagined that humans could affect the climate of the planet as a whole.[13]
See main article: paleoclimatology, ice age and greenhouse effect.
From the mid-17th century, naturalists attempted to reconcile mechanical philosophy with theology, initially within a biblical timescale. By the late 18th century, there was increasing acceptance of prehistoric epochs. Geologists found evidence of a succession of geological ages with climate changes. There were various competing theories about these changes; Buffon proposed that the Earth had begun as an incandescent globe and was very gradually cooling. James Hutton, whose ideas of cyclic change over huge periods were later dubbed uniformitarianism, was among those who found signs of past glacial activity in places too warm for glaciers in modern times.[14]
In 1815 Jean-Pierre Perraudin described for the first time how glaciers might be responsible for the giant boulders seen in alpine valleys. As he hiked in the Val de Bagnes, he noticed giant granite rocks that were scattered around the narrow valley. He knew that it would take an exceptional force to move such large rocks. He also noticed how glaciers left stripes on the land and concluded that it was the ice that had carried the boulders down into the valleys.[15]
His idea was initially met with disbelief. Jean de Charpentier wrote, "I found his hypothesis so extraordinary and even so extravagant that I considered it as not worth examining or even considering."[16] Despite Charpentier's initial rejection, Perraudin eventually convinced Ignaz Venetz that it might be worth studying. Venetz convinced Charpentier, who in turn convinced the influential scientist Louis Agassiz that the glacial theory had merit.
Agassiz developed a theory of what he termed "Ice Age"—when glaciers covered Europe and much of North America. In 1837 Agassiz was the first to scientifically propose that the Earth had been subject to a past ice age.[17] William Buckland had been a leading proponent in Britain of flood geology, later dubbed catastrophism, which accounted for erratic boulders and other "diluvium" as relics of the Biblical flood. This was strongly opposed by Charles Lyell's version of Hutton's uniformitarianism and was gradually abandoned by Buckland and other catastrophist geologists. A field trip to the Alps with Agassiz in October 1838 convinced Buckland that features in Britain had been caused by glaciation, and both he and Lyell strongly supported the ice age theory which became widely accepted by the 1870s.
Before the concept of ice ages was proposed, Joseph Fourier in 1824 reasoned based on physics that Earth's atmosphere kept the planet warmer than would be the case in a vacuum. Fourier recognized that the atmosphere transmitted visible light waves efficiently to the earth's surface. The earth then absorbed visible light and emitted infrared radiation in response, but the atmosphere did not transmit infrared efficiently, which therefore increased surface temperatures. He also suspected that human activities could influence the radiation balance and Earth's climate, although he focused primarily on land-use changes. In an 1827 paper, Fourier stated,[18]
The establishment and progress of human societies, the action of natural forces, can notably change, and in vast regions, the state of the surface, the distribution of water and the great movements of the air. Such effects are able to make to vary, in the course of many centuries, the average degree of heat; because the analytic expressions contain coefficients relating to the state of the surface and which greatly influence the temperature.

Fourier's work built on previous discoveries: in 1681 Edme Mariotte noted that glass, though transparent to sunlight, obstructs radiant heat.[19] [20] Around 1774 Horace Bénédict de Saussure showed that non-luminous warm objects emit infrared heat, and used a glass-topped insulated box to trap and measure heat from sunlight.[21] [22]
The physicist Claude Pouillet proposed in 1838 that water vapor and carbon dioxide might trap infrared and warm the atmosphere, but there was still no experimental evidence of these gases absorbing heat from thermal radiation.[23]
The warming effect of sunlight on different gases was examined in 1856 by Eunice Newton Foote, who described her experiments using glass tubes exposed to sunlight. The warming effect of the sun was greater for compressed air than for an evacuated tube, and greater for moist air than for dry air. "Thirdly, the highest effect of the sun's rays I have found to be in carbonic acid gas" (carbon dioxide). She continued: "An atmosphere of that gas would give to our earth a high temperature; and if, as some suppose, at one period of its history, the air had mixed with it a larger proportion than at present, an increased temperature from its action, as well as from an increased weight, must have necessarily resulted." Her work was presented by Prof. Joseph Henry at the American Association for the Advancement of Science meeting in August 1856 and described in a brief note written by David Ames Wells, then a journalist; her paper was published later that year in the American Journal of Science and Arts. Few noticed it, and the paper was only rediscovered in the 21st century.[24] [25] [26] [27]
John Tyndall took Fourier's work one step further in 1859 when he built an apparatus to investigate the absorption of infrared radiation in different gases. He found that water vapor, hydrocarbons like methane (CH4), and carbon dioxide strongly block the radiation. He understood that without these gases the planet would rapidly freeze.[28] [29]
Some scientists suggested that ice ages and other great climate changes were due to changes in the amount of gases emitted in volcanism. But that was only one of many possible causes. Another obvious possibility was solar variation. Shifts in ocean currents also might explain many climate changes. For changes over millions of years, the raising and lowering of mountain ranges would change patterns of both winds and ocean currents. Or perhaps the climate of a continent had not changed at all, but it had grown warmer or cooler because of polar wander (the North Pole shifting to where the Equator had been or the like). There were dozens of theories.
For example, in the mid-19th century, James Croll published calculations of how the gravitational pulls of the Sun, Moon, and planets subtly affect the Earth's motion and orientation. The inclination of the Earth's axis and the shape of its orbit around the Sun oscillate gently in cycles lasting tens of thousands of years. During some periods the Northern Hemisphere would get slightly less sunlight during the winter than it would get during other centuries. Snow would accumulate, reflecting sunlight and leading to a self-sustaining ice age.[30] Most scientists, however, found Croll's ideas—and every other theory of climate change—unconvincing.
In 1896 Svante Arrhenius used Langley's observations of increased infrared absorption where rays from the Moon pass through the atmosphere at a low angle, encountering more carbon dioxide, to estimate an atmospheric cooling effect from a future decrease of CO2. He realized that the cooler atmosphere would hold less water vapor (another greenhouse gas) and calculated the additional cooling effect. He also realized the cooling would increase snow and ice cover at high latitudes, making the planet reflect more sunlight and thus further cool down, as James Croll had hypothesized. Overall Arrhenius calculated that cutting CO2 in half would suffice to produce an ice age. He further calculated that a doubling of atmospheric CO2 would give a total warming of 5–6 degrees Celsius.[33]
Further, Arrhenius's colleague Arvid Högbom, who was quoted at length in Arrhenius's 1896 study On the Influence of Carbonic Acid in the Air upon the Temperature of the Earth,[34] had been attempting to quantify natural sources of CO2 emissions in order to understand the global carbon cycle. Högbom found that estimated carbon production from industrial sources in the 1890s (mainly coal burning) was comparable with the natural sources.[35] Arrhenius saw that this human emission of carbon would eventually lead to a warming energy imbalance. However, because of the relatively low rate of CO2 production in 1896, Arrhenius thought the warming would take thousands of years, and he expected it would be beneficial to humanity.[36] In 1908, citing the ever-increasing rate of fuel use, he revised this prediction to hundreds of years, and he continued to regard the warming as beneficial to humanity.[37]
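In modern terms, Arrhenius's result reflects the roughly logarithmic dependence of radiative forcing on the CO2 concentration, which can be combined with a climate-sensitivity parameter to give a temperature change. The following is an illustrative present-day formulation with commonly used coefficient values, not Arrhenius's own numbers:

\[
\Delta F = \alpha \ln\frac{C}{C_0}, \qquad \Delta T \approx \lambda\,\Delta F,
\]

with \(\alpha \approx 5.35\ \mathrm{W\,m^{-2}}\) and \(\lambda \approx 0.8\ \mathrm{K}/(\mathrm{W\,m^{-2}})\). For a doubling of CO2 (\(C/C_0 = 2\)) this gives \(\Delta F \approx 3.7\ \mathrm{W\,m^{-2}}\) and \(\Delta T \approx 3\ \mathrm{K}\), somewhat below Arrhenius's estimate of 5–6 degrees Celsius.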
In 1899 Thomas Chrowder Chamberlin developed at length the idea that climate changes could result from changes in the concentration of atmospheric carbon dioxide, in his work An Attempt to Frame a Working Hypothesis of the Cause of Glacial Periods on an Atmospheric Basis.[38]
The term "greenhouse effect" for this warming was introduced by Nils Gustaf Ekholm in 1901.[39] [40]
Arrhenius's calculations were disputed and subsumed into a larger debate over whether atmospheric changes had caused the ice ages. Experimental attempts to measure infrared absorption in the laboratory seemed to show that little difference resulted from increasing CO2 levels, and also found significant overlap between absorption by CO2 and absorption by water vapor, all of which suggested that increasing carbon dioxide emissions would have little climatic effect. These early experiments were later found to be insufficiently accurate, given the instrumentation of the time. Many scientists also thought that the oceans would quickly absorb any excess carbon dioxide.
Other theories of the causes of climate change fared no better. The principal advances were in observational paleoclimatology, as scientists in various fields of geology worked out methods to reveal ancient climates. In 1929, Wilmot H. Bradley found that annual varves of clay laid down in lake beds showed climate cycles. Andrew Ellicott Douglass saw strong indications of climate change in tree rings. Noting that the rings were thinner in dry years, he reported climate effects from solar variations, particularly in connection with the 17th-century dearth of sunspots (the Maunder Minimum) noticed previously by William Herschel and others. Other scientists, however, found good reason to doubt that tree rings could reveal anything beyond random regional variations. The value of tree rings for climate study was not solidly established until the 1960s.[41] [42]
Through the 1930s the most persistent advocate of a solar-climate connection was astrophysicist Charles Greeley Abbot. By the early 1920s, he had concluded that the solar "constant" was misnamed: his observations showed large variations, which he connected with sunspots passing across the face of the Sun. He and a few others pursued the topic into the 1960s, convinced that sunspot variations were a main cause of climate change. Other scientists were skeptical.[41] [42] Nevertheless, attempts to connect the solar cycle with climate cycles were popular in the 1920s and 1930s. Respected scientists announced correlations that they insisted were reliable enough to make predictions. Sooner or later, every prediction failed, and the subject fell into disrepute.[43]
Meanwhile, Milutin Milankovitch, building on James Croll's theory, improved the tedious calculations of the varying distances and angles of the Sun's radiation as the Sun and Moon gradually perturbed the Earth's orbit. Some observations of varves (layers seen in the mud covering the bottom of lakes) matched the prediction of a Milankovitch cycle lasting about 21,000 years. However, most geologists dismissed the astronomical theory. For they could not fit Milankovitch's timing to the accepted sequence, which had only four ice ages, all of them much longer than 21,000 years.[44]
In 1938 Guy Stewart Callendar attempted to revive Arrhenius's greenhouse-effect theory. Callendar presented evidence that both temperature and the CO2 level in the atmosphere had been rising over the past half-century, and he argued that newer spectroscopic measurements showed that the gas was effective in absorbing infrared in the atmosphere. Nevertheless, most scientific opinion continued to dispute or ignore the theory.[45]
Better spectrography in the 1950s showed that CO2 and water vapor absorption lines did not overlap completely. Climatologists also realized that little water vapor was present in the upper atmosphere. Both developments showed that the CO2 greenhouse effect would not be overwhelmed by water vapor.[46]
In 1955 Hans Suess's carbon-14 isotope analysis showed that CO2 released from fossil fuels was not immediately absorbed by the ocean. In 1957, a better understanding of ocean chemistry led Roger Revelle to the realization that the ocean surface layer had only a limited ability to absorb carbon dioxide; he also predicted the rise in atmospheric CO2 levels that Charles David Keeling later demonstrated.[47] By the late 1950s, more scientists were arguing that carbon dioxide emissions could be a problem, with some projecting in 1959 that CO2 would rise 25% by the year 2000, with potentially "radical" effects on climate. At the centennial celebration of the American oil industry in 1959, organized by the American Petroleum Institute and the Columbia Graduate School of Business, Edward Teller said "It has been calculated that a temperature rise corresponding to a 10 per cent increase in carbon dioxide will be sufficient to melt the icecap and submerge New York. ... At present the carbon dioxide in the atmosphere has risen by 2 per cent over normal. By 1970, it will be perhaps 4 per cent, by 1980, 8 per cent, by 1990, 16 per cent if we keep on with our exponential rise in the use of purely conventional fuels."[48] In 1960 Charles David Keeling demonstrated that the level of CO2 in the atmosphere was in fact rising. Concern mounted year by year along with the rise of the "Keeling Curve" of atmospheric CO2.
Another clue to the nature of climate change came in the mid-1960s from analysis of deep-sea cores by Cesare Emiliani and analysis of ancient corals by Wallace Broecker and collaborators. Rather than four long ice ages, they found a large number of shorter ones in a regular sequence. It appeared that the timing of ice ages was set by the small orbital shifts of the Milankovitch cycles. While the matter remained controversial, some began to suggest that the climate system is sensitive to small changes and can readily be flipped from a stable state into a different one.[44]
Scientists meanwhile began using computers to develop more sophisticated versions of Arrhenius's calculations. In 1967, taking advantage of the ability of digital computers to integrate absorption curves numerically, Syukuro Manabe and Richard Wetherald made the first detailed calculation of the greenhouse effect incorporating convection (the "Manabe-Wetherald one-dimensional radiative-convective model").[49] [50] They found that, in the absence of unknown feedbacks such as changes in clouds, a doubling of carbon dioxide from the current level would result in an increase of approximately 2 °C in global temperature. For this and related work, Manabe was awarded a share of the 2021 Nobel Prize in Physics.[51]
By the 1960s, aerosol pollution ("smog") had become a serious local problem in many cities, and some scientists began to consider whether the cooling effect of particulate pollution could affect global temperatures. Scientists were unsure whether the cooling effect of particulate pollution or warming effect of greenhouse gas emissions would predominate, but regardless, began to suspect that human emissions could be disruptive to climate in the 21st century if not sooner. In his 1968 book The Population Bomb, Paul R. Ehrlich wrote, "the greenhouse effect is being enhanced now by the greatly increased level of carbon dioxide ... [this] is being countered by low-level clouds generated by contrails, dust, and other contaminants ... At the moment we cannot predict what the overall climatic results will be of our using the atmosphere as a garbage dump."[52]
Efforts to establish a global temperature record that began in 1938 culminated in 1963, when J. Murray Mitchell presented one of the first up-to-date temperature reconstructions. His study drew on data from over 200 weather stations, compiled in the World Weather Records, to calculate latitudinal average temperatures. In his presentation, Mitchell showed that, beginning in 1880, global temperatures increased steadily until 1940, after which a multi-decade cooling trend emerged. Mitchell's work contributed to the overall acceptance of a possible global cooling trend.[53] [54]
In 1965, the landmark report "Restoring the Quality of Our Environment" by U.S. President Lyndon B. Johnson's Science Advisory Committee warned of the harmful effects of fossil fuel emissions:
The committee used the recently available global temperature reconstructions and carbon dioxide data from Charles David Keeling and colleagues to reach their conclusions. They declared the rise of atmospheric carbon dioxide levels to be the direct result of fossil fuel burning. The committee concluded that human activities were large enough to have a significant global impact, extending beyond the areas where those activities take place. "Man is unwittingly conducting a vast geophysical experiment", the committee wrote.
Nobel Prize winner Glenn T. Seaborg, Chairperson of the United States Atomic Energy Commission, warned of the climate crisis in 1966: "At the rate we are currently adding carbon dioxide to our atmosphere (six billion tons a year), within the next few decades the heat balance of the atmosphere could be altered enough to produce marked changes in the climate--changes which we might have no means of controlling even if by that time we have made great advances in our programs of weather modification."[55]
A 1968 study by the Stanford Research Institute for the American Petroleum Institute noted:[56]
In 1969, NATO was the first international body proposed as a venue for dealing with climate change. There were plans to establish a hub of research and initiatives for the organization in the civil area, dealing with environmental topics such as acid rain and the greenhouse effect. The suggestion by US President Richard Nixon met with little success in the administration of German Chancellor Kurt Georg Kiesinger, but the topics and the preparatory work done on the NATO proposal by the German authorities gained international momentum (see, for example, the 1972 Stockholm United Nations Conference on the Human Environment), as the government of Willy Brandt began to apply them in the civil sphere instead.[57]
Also in 1969, Mikhail Budyko published a theory of the ice–albedo feedback, a foundational element of what is today known as Arctic amplification.[58] The same year a similar model was published by William D. Sellers.[59] Both studies attracted significant attention, since they hinted at the possibility of a runaway positive feedback within the global climate system.[60]
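The essence of the feedback that attracted this attention can be illustrated with a toy global-mean energy balance in which the planetary albedo depends on temperature. This is a minimal sketch using common textbook parameter values, not a reproduction of Budyko's or Sellers's latitudinally resolved models:

```python
# Toy zero-dimensional energy-balance model with a temperature-dependent albedo,
# in the spirit of Budyko (1969) and Sellers (1969). Parameter values are
# illustrative textbook choices, not those of the original papers.

Q = 342.0             # globally averaged insolation, W/m^2 (solar constant / 4)
A, B = 203.3, 2.09    # linearized outgoing longwave radiation: OLR = A + B*T, T in deg C

def albedo(T):
    """Planetary albedo: high when ice-covered (cold), low when ice-free (warm)."""
    if T <= -10.0:
        return 0.62                 # fully ice-covered
    if T >= 0.0:
        return 0.30                 # essentially ice-free
    return 0.30 - 0.032 * T         # linear ramp between the two states

def net_flux(T):
    """Absorbed solar radiation minus outgoing longwave; zero at equilibrium."""
    return Q * (1.0 - albedo(T)) - (A + B * T)

def bisect(f, lo, hi, tol=1e-6):
    """Bisection root finder; assumes f(lo) and f(hi) have opposite signs."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Scan for sign changes of the net flux to locate all equilibrium temperatures.
temps = [0.5 * t for t in range(-160, 81)]      # -80 C to +40 C in 0.5 C steps
equilibria = [bisect(net_flux, t0, t1)
              for t0, t1 in zip(temps, temps[1:])
              if net_flux(t0) * net_flux(t1) < 0.0]

print("Equilibrium temperatures (deg C):", [round(T, 1) for T in equilibria])
# Typically yields three equilibria: a warm, largely ice-free state (~ +17 C),
# an unstable intermediate state (~ -4 C), and a deeply frozen state (~ -35 C).
```

The coexistence of a stable warm state and a stable frozen state, separated by an unstable one, is what made the ice–albedo feedback alarming: a sufficiently large perturbation could in principle tip the system from one state to the other.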
A 1969 memo from White House Urban Affairs Director Daniel Patrick Moynihan tried to impress the office of U.S. President Nixon with the projected severity of the greenhouse effect. However, action was not taken, even after a December 20, 1971 initiative from the Office of Science and Technology, "Determine the Climate Change Caused by Man and Nature".[61] In the initiative, Nixon's science advisors recommended an international network for monitoring climate trends and human impact on it.[62]
In the early 1970s, evidence that aerosols were increasing worldwide and that the global temperature series showed cooling encouraged Reid Bryson and some others to warn of the possibility of severe cooling. The questions and concerns raised by Bryson and others launched a new wave of research into the causes of such global cooling. Meanwhile, the new evidence that the timing of ice ages was set by predictable orbital cycles suggested that the climate would gradually cool over thousands of years. Several scientific panels from this period concluded that more research was needed to determine whether warming or cooling was likely, indicating that the scientific literature had not yet reached a consensus.[63] [64] [65] For the century ahead, however, a survey of the scientific literature from 1965 to 1979 found 7 articles predicting cooling and 44 predicting warming (many other articles on climate made no prediction); the warming articles were cited much more often in subsequent scientific literature. With nearly six times as many studies predicting warming as cooling, research into warming and greenhouse gases held the greater emphasis, suggesting that scientists' concern centered largely on warming as they turned their attention toward the greenhouse effect.
John Sawyer published the study Man-made Carbon Dioxide and the "Greenhouse" Effect in 1972.[66] He summarized the knowledge of the science at the time: the attribution of the carbon dioxide greenhouse gas to human activity, its distribution, and its exponential rise, findings which still hold today. Additionally, he accurately predicted the rate of global warming for the period between 1972 and 2000.[67] [68]
The first satellite records compiled in the early 1970s showed snow and ice cover over the Northern Hemisphere to be increasing, prompting further scrutiny of the possibility of global cooling. J. Murray Mitchell updated his global temperature reconstruction in 1972, which continued to show cooling.[69] However, scientists determined that the cooling observed by Mitchell was not a global phenomenon. Global averages were changing, in large part because of unusually severe winters experienced in Asia and parts of North America in 1972 and 1973, but these changes were mostly confined to the Northern Hemisphere; in the Southern Hemisphere, the opposite trend was observed. The severe winters, however, pushed the issue of global cooling into the public eye.
The mainstream news media at the time exaggerated the warnings of the minority who expected imminent cooling. For example, in 1975, Newsweek magazine published a story titled "The Cooling World" that warned of "ominous signs that the Earth's weather patterns have begun to change".[70] The article drew on studies documenting the increasing snow and ice in regions of the Northern Hemisphere and concerns and claims by Reid Bryson that global cooling by aerosols would dominate carbon dioxide warming. The article continued by stating that evidence of global cooling was so strong that meteorologists were having "a hard time keeping up with it". On 23 October 2006, Newsweek issued an update stating that it had been "spectacularly wrong about the near-term future".[71] Nevertheless, this article and others like it had long-lasting effects on public perception of climate science.
Such media coverage heralding the coming of a new ice age fostered the belief that this was the consensus among scientists, despite this not being reflected in the scientific literature. As it became apparent that scientific opinion favored global warming, the public began to express doubt over how trustworthy the science was. The argument that scientists were wrong about global cooling and therefore may be wrong about global warming has been called the "Ice Age Fallacy" by Time author Bryan Walsh.[72]
In the first two "Reports for the Club of Rome" in 1972[73] and 1974,[74] the anthropogenic climate changes by increase as well as by waste heat were mentioned. About the latter John Holdren wrote in a study[75] cited in the 1st report, "that global thermal pollution is hardly our most immediate environmental threat. It could prove to be the most inexorable, however, if we are fortunate enough to evade all the rest". Simple global-scale estimates[76] that recently have been actualized[77] and confirmed by more refined model calculations[78] [79] show noticeable contributions from waste heat to global warming after the year 2100, if its growth rates are not strongly reduced (below the averaged 2% p.a. which occurred since 1973).
Evidence for warming accumulated. By 1975, Manabe and Wetherald had developed a three-dimensional global climate model that gave a roughly accurate representation of the current climate. Doubling CO2 in the model's atmosphere gave a roughly 2 °C rise in global temperature.[80] Several other kinds of computer models gave similar results: it was impossible to make a model that gave something resembling the actual climate and not have the temperature rise when the CO2 concentration was increased.
In a separate development, an analysis of deep-sea cores published in 1976 by Nicholas Shackleton and colleagues showed that the dominating influence on ice age timing came from a 100,000-year Milankovitch orbital change. This was unexpected, since the change in sunlight in that cycle was slight. The result emphasized that the climate system is driven by feedbacks, and thus is strongly susceptible to small changes in conditions.
A 1977 memo (see quote box) from President Carter's chief science adviser Frank Press warned of the possibility of catastrophic climate change. However, other issues—such as known harms to health from pollutants, and avoiding energy dependence on other nations—seemed more pressing and immediate. Energy Secretary James Schlesinger advised that "the policy implications of this issue are still too uncertain to warrant Presidential involvement and policy initiatives", and the fossil fuel industry began sowing doubt about climate science.
The 1979 World Climate Conference (12 to 23 February) of the World Meteorological Organization concluded "it appears plausible that an increased amount of carbon dioxide in the atmosphere can contribute to a gradual warming of the lower atmosphere, especially at higher latitudes. ... It is possible that some effects on a regional and global scale may be detectable before the end of this century and become significant before the middle of the next century."[81]
In July 1979 the United States National Research Council published a report,[82] concluding (in part):
One week before President Carter left office, the White House Council on Environmental Quality (CEQ) issued reports including a suggestion to limit the rise in global average temperature to 2 °C above preindustrial levels, a goal later agreed to in the 2015 Paris climate accord.[83]
See also: Climate change denial. By the early 1980s, the slight cooling trend from 1945 to 1975 had stopped. Aerosol pollution had decreased in many areas due to environmental legislation and changes in fuel use, and it became clear that the cooling effect from aerosols was not going to increase substantially while carbon dioxide levels were progressively increasing.
Hansen and others published the 1981 study Climate impact of increasing atmospheric carbon dioxide, and noted:
In 1982, Greenland ice cores drilled by Hans Oeschger, Willi Dansgaard, and collaborators revealed dramatic temperature oscillations in the space of a century in the distant past.[84] The most prominent of the changes in their record corresponded to the violent Younger Dryas climate oscillation seen in shifts in types of pollen in lake beds all over Europe. Evidently drastic climate changes were possible within a human lifetime.
In 1973 James Lovelock speculated that chlorofluorocarbons (CFCs) could have a global warming effect. In 1975 V. Ramanathan found that a CFC molecule could be 10,000 times more effective in absorbing infrared radiation than a carbon dioxide molecule, making CFCs potentially important despite their very low concentrations in the atmosphere. While most early work on CFCs focused on their role in ozone depletion, by 1985 Ramanathan and others showed that CFCs together with methane and other trace gases could have nearly as important a climate effect as increases in CO2. In other words, global warming would arrive twice as fast as had been expected.[85]
In 1985 a joint UNEP/WMO/ICSU Conference on the "Assessment of the Role of Carbon Dioxide and Other Greenhouse Gases in Climate Variations and Associated Impacts" concluded that greenhouse gases "are expected" to cause significant warming in the next century and that some warming is inevitable.[86]
Meanwhile, ice cores drilled by a Franco-Soviet team at the Vostok Station in Antarctica showed that CO2 and temperature had gone up and down together in wide swings through past ice ages. This confirmed the CO2–temperature relationship in a manner entirely independent of computer climate models, strongly reinforcing the emerging scientific consensus. The findings also pointed to powerful biological and geochemical feedbacks.[87]
In June 1988, James E. Hansen made one of the first assessments that human-caused warming had already measurably affected global climate.[88] Shortly after, a "World Conference on the Changing Atmosphere: Implications for Global Security" gathered hundreds of scientists and others in Toronto. They concluded that the changes in the atmosphere due to human pollution "represent a major threat to international security and are already having harmful consequences over many parts of the globe", and declared that by 2005 the world would be well-advised to push its emissions some 20% below the 1988 level.[89]
The 1980s saw important breakthroughs with regard to global environmental challenges. Ozone depletion was mitigated by the Vienna Convention (1985) and the Montreal Protocol (1987). Acid rain was mainly regulated on national and regional levels.
See main article: Intergovernmental Panel on Climate Change and Scientific consensus on climate change. In 1988 the WMO established the Intergovernmental Panel on Climate Change (IPCC) with the support of the UNEP. The IPCC continues its work through the present day, issuing a series of Assessment Reports and supplemental reports that describe the state of scientific understanding at the time each report is prepared. Scientific developments during this period are summarized about once every five to six years in the IPCC Assessment Reports, which were published in 1990 (First Assessment Report), 1995 (Second Assessment Report), 2001 (Third Assessment Report), 2007 (Fourth Assessment Report), 2013/2014 (Fifth Assessment Report), and 2021 (Sixth Assessment Report).[90]

The 2001 report was the first to state positively that the observed global temperature increase was "likely" to be due to human activities. The conclusion was influenced especially by the so-called hockey stick graph, showing an abrupt historical temperature rise coinciding with the rise of greenhouse gas emissions, and by observations of changes in ocean heat content that had a "signature" matching the pattern that computer models calculated for the effect of greenhouse warming. By the time of the 2021 report, scientists had much additional evidence. Above all, measurements of paleotemperatures from several eras in the distant past, and the record of temperature change since the mid-19th century, could be matched against measurements of CO2 levels to provide independent confirmation of supercomputer model calculations.
These developments depended crucially on huge globe-spanning observation programs. Since the 1990s research into historical and modern climate change expanded rapidly. International coordination was provided by the World Climate Research Programme (established in 1980) and was increasingly oriented around providing input to the IPCC reports. Measurement networks such as the Global Ocean Observing System, Integrated Carbon Observation System, and NASA's Earth Observing System enabled monitoring of the causes and effects of ongoing change. Research also broadened, linking many fields such as Earth sciences, behavioral sciences, economics, and security.
A historically important question in climate change research has concerned the relative importance of human activity and natural causes during the period of the instrumental record. In the 1995 Second Assessment Report (SAR), the IPCC made the widely quoted statement that "The balance of evidence suggests a discernible human influence on global climate". The phrase "balance of evidence" suggested the (English) common-law standard of proof required in civil as opposed to criminal courts: not as high as "beyond reasonable doubt". In 2001 the Third Assessment Report (TAR) refined this, saying "There is new and stronger evidence that most of the warming observed over the last 50 years is attributable to human activities".[91] The 2007 Fourth Assessment Report (AR4) strengthened this finding:
Other findings of the IPCC Fourth Assessment Report include:
Some results from scientific studies on this issue are listed below:
See also: Climate change denial.
The early work of Joseph Fourier found that a greenhouse heats up mainly due to radiation trapping. This is analogous to radiation trapping in the atmosphere, leading to the term "greenhouse effect".[104]
An experiment performed by Prof. R. W. Wood in 1909 led him to reject radiation trapping, claiming that a greenhouse is heated merely due to convection blocking.[105] This has become a widespread view in the scientific community.[106] [107] [108] [109] Moreover, Wood's result has been used to reject the analogy, and to doubt the existence of a greenhouse effect in the atmosphere.[110] [111] [112] [113] Recent experiments have discredited Wood's claim, finding that radiation trapping is the dominant cause of heating in a greenhouse.[114] [115] [116]
There have been attempts to raise public controversy over the accuracy of the instrumental temperature record on the basis of the urban heat island effect, the quality of the surface station network, and assertions that there have been unwarranted adjustments to the temperature record.[117] [118]
Weather stations that are used to compute global temperature records are not evenly distributed over the planet, and their distribution has changed over time. There were only a small number of weather stations in the 1850s, and the number did not reach the current 3000+ until the 1951 to 1990 period.[119]
The 2001 IPCC Third Assessment Report (TAR) acknowledged that the urban heat island is an important local effect, but cited analyses of historical data indicating that the effect of the urban heat island on the global temperature trend is no more than 0.05 °C (0.09 °F) degrees through 1990.[120] Peterson (2003) found no difference between the warming observed in urban and rural areas.[121]
Parker (2006) found that there was no difference in warming between calm and windy nights. Since the urban heat island effect is strongest for calm nights and is weak or absent on windy nights, this was taken as evidence that global temperature trends are not significantly contaminated by urban effects.[122] Pielke and Matsui published a paper disagreeing with Parker's conclusions.[123]
In 2005, Roger A. Pielke and Stephen McIntyre criticized the US instrumental temperature record and adjustments to it, and Pielke and others criticized the poor-quality siting of a number of weather stations in the United States.[124] [125] A 2010 study examined the siting of temperature stations and found that the poorly sited measurement stations showed a slight cool bias rather than the warm bias which deniers had postulated.[126] [127]
The Berkeley Earth Surface Temperature group carried out an independent assessment of land temperature records, which examined issues raised by deniers, such as the urban heat island effect, poor station quality, and the risk of data selection bias. The preliminary results, made public in October 2011, found that these factors had not biased the results obtained by NOAA, the Hadley Centre together with the Climatic Research Unit (HadCRUT) and NASA's GISS in earlier studies. The group also confirmed that over the past 50 years the land surface warmed by 0.911 °C, and their results closely matched those obtained from these earlier studies.[128] [129] [130] [131]
General circulation models and basic physical considerations predict that in the tropics the temperature of the troposphere should increase more rapidly than the temperature of the surface. A 2006 report to the U.S. Climate Change Science Program noted that models and observations agreed on this amplification for monthly and interannual time scales but not for decadal time scales in most observed data sets. Improved measurement and analysis techniques have reconciled this discrepancy: corrected buoy and satellite surface temperatures are slightly cooler and corrected satellite and radiosonde measurements of the tropical troposphere are slightly warmer.[132] Satellite temperature measurements show that tropospheric temperatures are increasing with "rates similar to those of the surface temperature", leading the IPCC to conclude in 2007 that this discrepancy is reconciled.[133]
See main article: solar variation.
Some climate change deniers have argued that solar variation is a significant contributor to the observed global warming, which would reduce the relative importance of human-made causes. However, this is not supported by scientific consensus on climate change. Scientists reject the notion that the warming observed in the global mean surface temperature record since about 1850 is the result of solar variations: "The observed rapid rise in global mean temperatures seen after 1985 cannot be ascribed to solar variability, whichever of the mechanisms is invoked and no matter how much the solar variation is amplified."[134]
The consensus position is that solar radiation may have increased by 0.12 W/m2 since 1750, compared to 1.6 W/m2 for the net anthropogenic forcing.[135] Already in 2001, the IPCC Third Assessment Report had found that, "The combined change in radiative forcing of the two major natural factors (solar variation and volcanic aerosols) is estimated to be negative for the past two, and possibly the past four, decades."[136]
A few studies say that the present level of solar activity is historically high, as determined by sunspot activity and other factors. Solar activity could affect climate either by variation in the Sun's output or, more speculatively, by an indirect effect on the amount of cloud formation. Solanki and co-workers suggest that solar activity for the last 60 to 70 years may be at its highest level in 8,000 years; however, they said "that solar variability is unlikely to have been the dominant cause of the strong warming during the past three decades", and concluded that "at the most 30% of the strong warming since [1970] can be of solar origin".[137] Others have disagreed with the study, suggesting that other comparably high levels of activity have occurred several times in the last few thousand years.[138] They concluded that "solar activity reconstructions tell us that only a minor fraction of the recent global warming can be explained by the variable Sun."[139]
The role of solar activity in climate change has also been calculated over longer time periods using "proxy" datasets, such as tree rings.[140] Models indicate that solar and volcanic forcings can explain periods of relative warmth and cold between AD 1000 and 1900, but human-induced forcings are needed to reproduce the late-20th century warming.[141]
Another line of evidence against the sun having caused recent climate change comes from looking at how temperatures at different levels in the Earth's atmosphere have changed.[142]
The US Environmental Protection Agency (US EPA, 2009) responded to public comments on climate change attribution.[143] A number of commenters had argued that recent climate change could be attributed to changes in solar irradiance. According to the US EPA (2009), this attribution was not supported by the bulk of the scientific literature. Citing the work of the IPCC (2007), the US EPA pointed to the low contribution of solar irradiance to radiative forcing since the start of the Industrial Revolution in 1750. Over this time period (1750 to 2005),[144] the estimated contribution of solar irradiance to radiative forcing was about 5% of the value of the combined radiative forcing due to increases in the atmospheric concentrations of carbon dioxide, methane and nitrous oxide (see graph opposite).
The role of the Sun in recent climate change has been looked at by climate scientists. Since 1978, output from the Sun has been measured by satellites[145] significantly more accurately than was previously possible from the surface. These measurements indicate that the Sun's total solar irradiance has not increased since 1978, so the warming during the past 30 years cannot be directly attributed to an increase in total solar energy reaching the Earth (see graph above, left). In the three decades since 1978, the combination of solar and volcanic activity probably had a slight cooling influence on the climate.[146]
Climate models have been used to examine the role of the Sun in recent climate change.[147] Models are unable to reproduce the rapid warming observed in recent decades when they only take into account variations in total solar irradiance and volcanic activity. Models are, however, able to simulate the observed 20th century changes in temperature when they include all of the most important external forcings, including human influences and natural forcings. As has already been stated, Hegerl et al. (2007) concluded that greenhouse gas forcing had "very likely" caused most of the observed global warming since the mid-20th century. In making this conclusion, Hegerl et al. (2007) allowed for the possibility that climate models had underestimated the effect of solar forcing.[148]
Models and observations (see figure above, middle) show that greenhouse gases result in warming of the lower atmosphere (the troposphere) but cooling of the upper atmosphere (the stratosphere).[149] Depletion of the ozone layer by chemical refrigerants has also resulted in a cooling effect in the stratosphere. If the Sun were responsible for the observed warming, warming of both the troposphere and the top of the stratosphere would be expected, since increased solar activity would replenish ozone and oxides of nitrogen.[150] The stratosphere has a temperature gradient opposite to that of the troposphere: whereas temperature in the troposphere falls with altitude, temperature in the stratosphere rises with altitude. Hadley cells are the mechanism by which ozone generated in the tropics (the region of highest UV irradiance in the stratosphere) is transported poleward. Global climate models suggest that climate change may widen the Hadley cells and push the jet stream northward, thereby expanding the tropics and resulting in warmer, drier conditions in those areas overall.[151]
Some have argued that the Sun is responsible for recently observed climate change.[152] Warming on Mars was cited as evidence that global warming on Earth was being caused by changes in the Sun.[153] [154] [155] This has been discredited by scientists: "Wobbles in the orbit of Mars are the main cause of its climate change in the current era" (see also orbital forcing).[156] There are also alternative explanations for the warming observed on Triton, Pluto, Jupiter, and Mars.
The view that cosmic rays could provide the mechanism by which changes in solar activity affect climate is not supported by the literature.[157] Solomon et al. (2007)[158] state:
[...] the cosmic ray time series does not appear to correspond to global total cloud cover after 1991 or to global low-level cloud cover after 1994. Together with the lack of a proven physical mechanism and the plausibility of other causal factors affecting changes in cloud cover, this makes the association between galactic cosmic ray-induced changes in aerosol and cloud formation controversial.

Studies in 2007 and 2008 found no relation between warming in recent decades and cosmic rays.[159] [160] Pierce and Adams (2009)[161] used a model to simulate the effect of cosmic rays on cloud properties. They concluded that the hypothesized effect of cosmic rays was too small to explain recent climate change. The authors of that study noted that their findings did not rule out a possible connection between cosmic rays and climate change, and recommended further research.[162]
Erlykin et al. (2009) found that the evidence showed that connections between solar variation and climate were more likely to be mediated by direct variation of insolation rather than cosmic rays, and concluded: "Hence within our assumptions, the effect of varying solar activity, either by direct solar irradiance or by varying cosmic ray rates, must be less than 0.07 °C since 1956, i.e. less than 14% of the observed global warming." Carslaw (2009) and Pittock (2009) reviewed the recent and historical literature in this field and continue to find that the link between cosmic rays and climate is tenuous, though they encourage continued research.
Henrik Svensmark has suggested that the magnetic activity of the sun deflects cosmic rays, and that this may influence the generation of cloud condensation nuclei, and thereby have an effect on the climate.[163]
In 2011, the United Nations Environment Programme looked at how world emissions might develop out to the year 2020 depending on different policy decisions.[164] They convened 55 scientists and experts from 28 scientific groups across 15 countries. Projections assuming no new efforts to reduce emissions, the "business-as-usual" hypothetical trend,[165] suggested global emissions in 2020 of 56 gigatonnes of CO2 equivalent (GtCO2-eq), with a range of 55–59 GtCO2-eq. Under a different baseline in which the pledges made under the Copenhagen Accord were met in their most ambitious form, projected global emissions in 2020 would still reach about 50 GtCO2-eq.[166] Continuing on the current trend, particularly in the low-ambition case, was expected to lead to a temperature increase of about 3 °C by the end of the century, estimated to bring severe environmental, economic, and social consequences.[167]
The report also considered the effect on emissions of policies put forward by UNFCCC Parties to address climate change. Assuming more stringent efforts to limit emissions led to projected global emissions in 2020 of between 49 and 52 GtCO2-eq, with a median estimate of 51 GtCO2-eq; assuming less stringent efforts led to projected global emissions in 2020 of between 53 and 57 GtCO2-eq, with a median estimate of 55 GtCO2-eq.
Public-domain sources