The history of sustainability traces human-dominated ecological systems from the earliest civilizations to the present day.[1] This history is characterized by the increased regional success of a particular society, followed by crises that were either resolved, producing sustainability, or not, leading to decline.[2][3] In early human history, the use of fire and the desire for specific foods may have altered the natural composition of plant and animal communities.[4] Between 8,000 and 10,000 years ago, agrarian communities emerged which depended largely on their environment and the creation of a "structure of permanence".[5]
The Western Industrial Revolution of the 18th to 19th centuries tapped into the vast growth potential of the energy in fossil fuels. Coal was used to power ever more efficient engines and later to generate electricity. Modern sanitation systems and advances in medicine protected large populations from disease.[6] In the mid-20th century, a gathering environmental movement pointed out that there were environmental costs associated with the many material benefits that were now being enjoyed. In the late 20th century, environmental problems became global in scale.[7][8][9][10] The 1973 and 1979 energy crises demonstrated the extent to which the global community had become dependent on non-renewable energy resources. By the 1970s, humanity's ecological footprint had exceeded the carrying capacity of the Earth, and its mode of life had therefore become unsustainable.[11] In the 21st century, there is increasing global awareness of the threat posed by global climate change, produced largely by the burning of fossil fuels.[12][13] Another major threat is biodiversity loss, caused primarily by land use change.
See also: Neolithic Revolution. In early human history, although the energy and other resource demands of nomadic hunter-gatherers were small, the use of fire and desire for specific foods may have altered the natural composition of plant and animal communities. Between 8,000 and 10,000 years ago, agriculture emerged in various regions of the world.[14] Agrarian communities depended largely on their environment and the creation of a "structure of permanence". Societies outgrowing their local food supply or depleting critical resources either moved on or faced collapse.[15]
Archeological evidence suggests that the first civilizations arose in Sumer, in southern Mesopotamia (now Iraq), and in Egypt, both dating from around 3000 BCE. By 1000 BCE, civilizations were also established in the Indus Valley, China, Mexico, Peru, and parts of Europe.[16][17] Sumer illustrates issues central to the sustainability of human civilization.[18] Sumerian cities practiced intensive, year-round agriculture. The surplus of storable food created by this economy allowed the population to settle in one place instead of migrating in search of wild foods and grazing land, and it permitted a much greater population density. The development of agriculture in Mesopotamia required many labourers to build and maintain its irrigation system. This, in turn, led to political hierarchy, bureaucracy and religious sanction, along with standing armies to protect the emergent civilization. Intensified agriculture allowed for population increase, but it also led to deforestation in upstream areas, with resultant flooding, and to over-irrigation, which raised soil salinity. While there was a shift from the cultivation of wheat to the more salt-tolerant barley, yields still diminished. Eventually, decreasing agricultural production and other factors led to the decline of the civilization; from 2100 BCE to 1700 BCE the population is estimated to have fallen by nearly sixty percent.[18][19] Civilizations similarly thought to have eventually fallen because of poor management of resources include the Mayans, Anasazi and Easter Islanders, among many others.[20][21]

In contrast, stable communities of shifting cultivators and horticulturists existed in New Guinea and South America, and large agrarian communities in China, India and elsewhere have farmed the same localities for centuries. Some Polynesian cultures have maintained stable communities for between 1,000 and 3,000 years on small islands with minimal resources, using rahui[22] and kaitiakitanga[23] to control human pressure on the environment. In Sri Lanka, nature reserves established during the reign of King Devanampiyatissa and dating back to 307 BCE were devoted to sustainability and harmonious living with nature.[24]
See also: Fossil fuel. Technological advances over several millennia gave humans increasing control over the environment. But it was the Western Industrial Revolution of the 18th to 19th centuries that tapped into the vast growth potential of the energy in fossil fuels. Coal was used to power ever more efficient engines and later to generate electricity. Modern sanitation systems and advances in medicine protected large populations from disease.[25] Such conditions led to a human population explosion and unprecedented industrial, technological and scientific growth that has continued to this day, marking the commencement of a period of global human influence known as the Anthropocene. From 1650 to 1850, the global population doubled from around 500 million to 1 billion people.[26]
Concerns about the environmental and social impacts of industry were expressed by some Enlightenment political economists and through the Romantic movement of the 1800s. The Reverend Thomas Malthus devised catastrophic and much-criticised theories of "overpopulation", while John Stuart Mill foresaw the desirability of a "stationary state" economy, thus anticipating concerns of the modern discipline of ecological economics.[27][28][29] In the late 19th century, Eugenius Warming was the first botanist to study the physiological relations between plants and their environment, heralding the scientific discipline of ecology.[30]
See also: Hans Carl von Carlowitz.
See also: Environmental movement. By the 20th century, the Industrial Revolution had led to an exponential increase in the human consumption of resources. The increase in health, wealth and population was perceived as a simple path of progress.[31] However, in the 1930s economists began developing models of non-renewable resource management (see Hotelling's rule),[32] later extended to address the sustainability of welfare in an economy that uses non-renewable resources (Hartwick's rule).[33]
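In stylized form, Hotelling's rule states that along an efficient extraction path the net price (scarcity rent) of a non-renewable resource rises at the rate of interest, while Hartwick's rule prescribes reinvesting the resulting resource rents in reproducible capital so that consumption can be held constant. A minimal statement of the former, with p(t) denoting the net resource price and r the interest rate (notation chosen here for illustration rather than taken from the cited sources), is

\frac{\dot{p}(t)}{p(t)} = r \quad\Longrightarrow\quad p(t) = p(0)\,e^{rt}.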
Ecology had now gained general acceptance as a scientific discipline, and many concepts vital to sustainability were being explored. These included: the interconnectedness of all living systems in a single living planetary system, the biosphere; the importance of natural cycles (of water, nutrients and other chemicals, materials, waste); and the passage of energy through trophic levels of living systems.[34]
See also: Environmentalism and Environmental science. Following the deprivations of the Great Depression and World War II, the developed world entered a new period of escalating growth, a post-1950s "great acceleration ... a surge in the human enterprise that has emphatically stamped humanity as a global geophysical force."[35] A gathering environmental movement pointed out that there were environmental costs associated with the many material benefits that were now being enjoyed. Innovations in technology (including plastics, synthetic chemicals and nuclear energy) and the increasing use of fossil fuels were transforming society. Modern industrial agriculture—the "Green Revolution"—was based on the development of synthetic fertilizers, herbicides and pesticides, which had devastating consequences for rural wildlife, as documented by the American marine biologist, naturalist and environmentalist Rachel Carson in Silent Spring (1962).
In 1956, American geoscientist M. King Hubbert's peak oil theory predicted an inevitable peak of oil production, first in the United States (between 1965 and 1970), then in successive regions of the world—with a global peak expected thereafter.[36] In the 1970s, environmentalism's concern with pollution, the population explosion, consumerism and the depletion of finite resources found expression in Small Is Beautiful, by British economist E. F. Schumacher, in 1973, and in The Limits to Growth, published by the global think tank, the Club of Rome, in 1972.
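A common mathematical idealization of Hubbert's analysis, assumed here for illustration (Hubbert fitted curves of this general shape to historical production data), treats cumulative production Q(t) as logistic, approaching an ultimately recoverable total Q_\infty, so that annual production P(t) traces a bell-shaped curve peaking when half of the total has been extracted:

Q(t) = \frac{Q_\infty}{1 + a\,e^{-bt}}, \qquad P(t) = \frac{dQ}{dt} = \frac{a\,b\,Q_\infty\,e^{-bt}}{\left(1 + a\,e^{-bt}\right)^{2}}.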
See also: Sustainability and Sustainable development. Environmental problems were now becoming global in scale. The 1973 and 1979 energy crises demonstrated the extent to which the global community had become dependent on a non-renewable resource; President Carter in his State of the Union Address called on Americans to "Conserve energy. Eliminate waste. Make 1980 indeed a year of energy conservation."[37] While the developed world was considering the problems of unchecked development, the developing countries, faced with continued poverty and deprivation, regarded development as essential to raise the living standards of their peoples.[38] In 1980 the International Union for Conservation of Nature published its influential World Conservation Strategy,[39] followed in 1982 by its World Charter for Nature,[40] which drew attention to the decline of the world's ecosystems.
In 1987 the United Nations' World Commission on Environment and Development (the Brundtland Commission), in its report Our Common Future, suggested that development was acceptable, but that it must be sustainable development meeting the needs of the poor while not increasing environmental problems. Humanity's demand on the planet more than doubled in the 45 years up to 2005, as a result of population growth and increasing individual consumption. In 1961 almost all countries in the world had more than enough capacity to meet their own demand; by 2005 the situation had changed radically, with many countries able to meet their needs only by importing resources from other nations.[41] A move toward sustainable living emerged through increasing public awareness and the adoption of recycling and renewable energy. The development of renewable sources of energy in the 1970s and '80s, primarily wind turbines and photovoltaics, together with the increased use of hydroelectricity, presented some of the first sustainable alternatives to fossil fuel and nuclear energy generation, with the first large-scale solar and wind power plants appearing during the 1980s and '90s.[42][43] Also at this time many local and state governments in developed countries began to implement small-scale sustainability policies.[44]
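The country-level comparison above rests on ecological footprint accounting: a country whose per-capita footprint exceeds its per-capita biocapacity can meet its demand only by importing biocapacity or by drawing down natural stocks. A minimal sketch of that bookkeeping, using entirely hypothetical figures in global hectares rather than real national accounts, might look like this:

```python
# Illustrative ecological footprint vs. biocapacity comparison.
# All figures are hypothetical (units: global hectares per capita);
# real accounts are compiled from national consumption and land-use data.

countries = {
    # name: (footprint_per_capita, biocapacity_per_capita)
    "Country A": (1.8, 3.2),   # demand below domestic capacity
    "Country B": (4.5, 1.6),   # demand exceeds domestic capacity
}

for name, (footprint, biocapacity) in countries.items():
    if footprint > biocapacity:
        status = "ecological deficit: relies on imported biocapacity"
    else:
        status = "ecological reserve: demand met domestically"
    print(f"{name}: footprint {footprint} gha, biocapacity {biocapacity} gha -> {status}")
```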
Through the work of climate scientists in the IPCC there is increasing global awareness of the threat posed by global climate change, produced largely by the burning of fossil fuels. In March 2009 the Copenhagen Climate Council, an international team of leading climate scientists, issued a strongly worded statement: "The climate system is already moving beyond the patterns of natural variability within which our society and economy have developed and thrived. These parameters include global mean surface temperature, sea-level rise, ocean and ice sheet dynamics, ocean acidification, and extreme climatic events. There is a significant risk that many of the trends will accelerate, leading to an increasing risk of abrupt or irreversible climatic shifts."[45]
Ecological economics now seeks to bridge the gap between ecology and traditional neoclassical economics:[46][47] it aims to provide an inclusive and ethical economic model for society. A wide range of concepts intended to help implement and measure sustainability is becoming more widely accepted, including the car-free movement, smart growth (more sustainable urban environments), life cycle assessment (the cradle-to-cradle analysis of resource use and environmental impact over the life cycle of a product or process), ecological footprint analysis, green building, dematerialization (increased recycling of materials) and decarbonisation (removing dependence on fossil fuels), among others.[48]
The work of Bina Agarwal and Vandana Shiva, among many others, has brought some of the cultural wisdom of traditional, sustainable agrarian societies into the academic discourse on sustainability, and has blended it with modern scientific principles.[49] In 2009 the Environmental Protection Agency of the United States determined that greenhouse gases "endanger public health and welfare" of the American people by contributing to climate change and causing more heat waves, droughts and flooding, and by threatening food and water supplies.[50] Between 2016 and 2018 the annual average level of fine particulate matter, a key measure of ambient air quality, increased by 5.7% in the United States.[51] Rapidly advancing technologies now provide the means to shift economies, energy generation, water and waste management, and food production towards sustainable practices using methods of systems ecology and industrial ecology.[52][53]