Reductionism is any of several related philosophical ideas regarding the associations between phenomena, which can be described in terms of other, simpler or more fundamental phenomena. It is also described as an intellectual and philosophical position that interprets a complex system as the sum of its parts.[1]
The Oxford Companion to Philosophy suggests that reductionism is "one of the most used and abused terms in the philosophical lexicon" and proposes a three-part division: ontological reductionism, the belief that the whole of reality consists of a minimal number of kinds of entities or substances; methodological reductionism, the scientific attempt to provide explanations in terms of ever-smaller entities; and theory reductionism, the suggestion that a newer theory does not replace or absorb an older one but reduces it to more basic terms.[2]
Reductionism can be applied to any phenomenon, including objects, problems, explanations, theories, and meanings.[3][4]
In the sciences, methodological reductionism attempts to explain entire systems in terms of their individual, constituent parts and their interactions. For example, the temperature of a gas is reduced to nothing beyond the average kinetic energy of its molecules in motion. Thomas Nagel and others speak of 'psychophysical reductionism' (the attempted reduction of psychological phenomena to physics and chemistry) and 'physico-chemical reductionism' (the attempted reduction of biology to physics and chemistry). In a very simplified and sometimes contested form, reductionism is said to imply that a system is nothing but the sum of its parts.
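The temperature example can be made concrete. The sketch below recovers a macroscopic temperature from nothing but a microscopic description (molecular masses and speeds), using the kinetic-theory relation ⟨KE⟩ = (3/2)·k_B·T for a monatomic ideal gas; the helium mass and the speed distribution are illustrative numbers, not data from any source.

```python
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K

def temperature_from_speeds(masses, speeds):
    """Reduce 'temperature' to the mean translational kinetic energy
    of the molecules, via <KE> = (3/2) * k_B * T."""
    n = len(speeds)
    mean_ke = sum(0.5 * m * v ** 2 for m, v in zip(masses, speeds)) / n
    return (2.0 / 3.0) * mean_ke / K_B

# Illustrative microstate: helium atoms (~6.64e-27 kg) with speeds
# scattered around ~1350 m/s, roughly the RMS speed at room temperature.
random.seed(0)
masses = [6.64e-27] * 10_000
speeds = [random.gauss(1350.0, 100.0) for _ in masses]
print(f"{temperature_from_speeds(masses, speeds):.0f} K")
```

Nothing in the function mentions temperature as a primitive: the macroscopic property is computed entirely from the parts, which is exactly the reductionist claim in this example.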
However, a more nuanced opinion is that a system is composed entirely of its parts, but the system will have features that none of the parts have (which, in essence, is the basis of emergentism). "The point of mechanistic explanations is usually showing how the higher level features arise from the parts."
Other definitions are used by other authors. For example, what John Polkinghorne terms 'conceptual' or 'epistemological' reductionism is the definition provided by Simon Blackburn[5] and by Jaegwon Kim:[6] that form of reductionism which concerns a program of replacing the facts or entities involved in one type of discourse with other facts or entities from another type, thereby providing a relationship between them. Richard Jones distinguishes ontological and epistemological reductionism, arguing that many ontological and epistemological reductionists affirm the need for different concepts for different degrees of complexity while affirming a reduction of theories.
The idea of reductionism can be expressed by "levels" of explanation, with higher levels reducible if need be to lower levels. This use of levels of understanding in part expresses our human limitations in remembering detail. However, "most philosophers would insist that our role in conceptualizing reality [our need for a hierarchy of "levels" of understanding] does not change the fact that different levels of organization in reality do have different 'properties'."
Reductionism does not preclude the existence of what might be termed emergent phenomena, but it does imply the ability to understand those phenomena completely in terms of the processes from which they are composed. This reductionist understanding is very different from ontological or strong emergentism, which holds that what emerges in "emergence" is more than the sum of the processes from which it emerges, whether in the ontological sense or in the epistemological sense.[7]
Richard Jones divides ontological reductionism into two: the reductionism of substances (e.g., the reduction of mind to matter) and the reduction of the number of structures operating in nature (e.g., the reduction of one physical force to another). This permits scientists and philosophers to affirm the former while being anti-reductionists regarding the latter.[8]
Nancey Murphy has claimed that there are two species of ontological reductionism: one that claims that wholes are nothing more than their parts; and atomist reductionism, claiming that wholes are not "really real". She admits that the phrase "really real" is apparently senseless but she has tried to explicate the supposed difference between the two.[9]
Ontological reductionism denies the idea of ontological emergence, and claims that emergence is an epistemological phenomenon that only exists through analysis or description of a system, and does not exist fundamentally.[10]
In some scientific disciplines, ontological reductionism takes two forms: token-identity theory and type-identity theory.[11] In this case, "token" refers to a biological process.[12]
Token ontological reductionism is the idea that every item that exists is a sum item. For perceivable items, it affirms that every perceivable item is a sum of items with a lesser degree of complexity. Token ontological reduction of biological things to chemical things is generally accepted.
Type ontological reductionism is the idea that every type of item is a sum type of item, and that every perceivable type of item is a sum of types of items with a lesser degree of complexity. Type ontological reduction of biological things to chemical things is often rejected.
Michael Ruse has criticized ontological reductionism as an improper argument against vitalism.[13]
In a biological context, methodological reductionism means attempting to explain all biological phenomena in terms of their underlying biochemical and molecular processes.[14]
Anthropologists Edward Burnett Tylor and James George Frazer employed some religious reductionist arguments.[15]
Theory reduction is the process by which a more general theory absorbs a special theory. It can be further divided into translation, derivation, and explanation.[16] For example, both Kepler's laws of the motion of the planets and Galileo's theories of motion formulated for terrestrial objects are reducible to Newtonian theories of mechanics because all the explanatory power of the former is contained within the latter. Furthermore, the reduction is considered beneficial because Newtonian mechanics is a more general theory—that is, it explains more events than Galileo's or Kepler's. Besides scientific theories, theory reduction more generally can be the process by which one explanation subsumes another.
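The Kepler case can be made concrete for the simplified situation of a circular orbit: equating Newton's gravitational force with the centripetal force recovers Kepler's third law as a theorem of Newtonian mechanics.

```latex
\frac{G M m}{r^{2}} = \frac{m v^{2}}{r},
\qquad v = \frac{2\pi r}{T}
\quad\Longrightarrow\quad
T^{2} = \frac{4\pi^{2}}{G M}\, r^{3}
```

The constant of proportionality \(4\pi^{2}/GM\) depends only on the Sun's mass \(M\), which is why the ratio \(T^{2}/r^{3}\) is the same for every planet—precisely the content of Kepler's third law, now derived from, and hence reduced to, the more general theory.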
In mathematics, reductionism can be interpreted as the philosophy that all mathematics can (or ought to) be based on a common foundation, which for modern mathematics is usually axiomatic set theory. Ernst Zermelo was one of the major advocates of such an opinion; he also developed much of axiomatic set theory. It has been argued that the generally accepted method of justifying mathematical axioms by their usefulness in common practice can potentially weaken Zermelo's reductionist claim.[17]
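A standard illustration of this set-theoretic reduction is the von Neumann construction, in which each natural number is identified with the set of its predecessors, and ordered pairs (and hence relations and functions) are reduced to sets via the Kuratowski definition:

```latex
0 := \varnothing, \qquad n + 1 := n \cup \{ n \},
\quad\text{so}\quad 1 = \{\varnothing\},\quad 2 = \{\varnothing, \{\varnothing\}\};
\qquad (a, b) := \{\{a\}, \{a, b\}\}
```

On this foundation, every object of ordinary mathematics—numbers, functions, spaces—is, officially, a set.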
Jouko Väänänen has argued for second-order logic as a foundation for mathematics instead of set theory,[18] whereas others have argued for category theory as a foundation for certain aspects of mathematics.[19] [20]
The incompleteness theorems of Kurt Gödel, published in 1931, cast doubt on the attainability of an axiomatic foundation for all of mathematics. Any such foundation would have to include axioms powerful enough to describe the arithmetic of the natural numbers (a subset of all mathematics). Yet Gödel proved that, for any consistent recursively enumerable axiomatic system powerful enough to describe the arithmetic of the natural numbers, there are (model-theoretically) true propositions about the natural numbers that cannot be proved from the axioms. Such propositions are known as formally undecidable propositions. For example, the continuum hypothesis is undecidable in Zermelo–Fraenkel set theory, as shown by Cohen.
Reductionist thinking and methods form the basis for many of the well-developed topics of modern science, including much of physics, chemistry and molecular biology. Classical mechanics in particular is seen as a reductionist framework. For instance, we understand the solar system in terms of its components (the sun and the planets) and their interactions.[21] Statistical mechanics can be considered as a reconciliation of macroscopic thermodynamic laws with the reductionist method of explaining macroscopic properties in terms of microscopic components, although it has been argued that reduction in physics 'never goes all the way in practice'.[22]
The role of reduction in computer science can be thought of as a precise and unambiguous mathematical formalization of the philosophical idea of "theory reductionism". In a general sense, a problem (or set) is said to be reducible to another problem (or set) if there is a computable/feasible method to translate the questions of the former into the latter, so that, if one knows how to computably/feasibly solve the latter problem, then one can computably/feasibly solve the former. Thus, the latter problem is at least as "hard" to solve as the former.
Reduction is pervasive in theoretical computer science, both in the abstract mathematical foundations of computation and in the real-world performance or capability analysis of algorithms. It is a foundational and central concept not only in mathematical logic and abstract computation—in computability (or recursion) theory, where it takes the form of, e.g., Turing reduction—but also in real-world computation, in the time (or space) complexity analysis of algorithms, where it takes the form of, e.g., polynomial-time reduction.
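A minimal sketch of the idea, using the textbook reduction from INDEPENDENT-SET to VERTEX-COVER: a graph has an independent set of size at least k exactly when it has a vertex cover of size at most |V| − k. The translation (complementing k) is trivially polynomial-time, which is all the reduction requires; the brute-force solver below is exponential and serves only to make the example runnable.

```python
from itertools import combinations

def has_vertex_cover(vertices, edges, k):
    """Brute-force decision procedure: is there a set of at most k
    vertices touching every edge? (Exponential; illustration only.)"""
    for size in range(k + 1):
        for cover in combinations(vertices, size):
            chosen = set(cover)
            if all(u in chosen or v in chosen for u, v in edges):
                return True
    return False

def has_independent_set(vertices, edges, k):
    """Solve INDEPENDENT-SET by reduction to VERTEX-COVER: a set S is
    independent iff its complement V \\ S covers every edge, so an
    independent set of size >= k exists iff a cover of size <= |V| - k
    does. Only this translation step needs to be polynomial-time."""
    return has_vertex_cover(vertices, edges, len(vertices) - k)

# Path graph 1-2-3-4: the largest independent set is {1, 3} (or {2, 4}).
V = [1, 2, 3, 4]
E = [(1, 2), (2, 3), (3, 4)]
print(has_independent_set(V, E, 2))  # True
print(has_independent_set(V, E, 3))  # False
```

Anyone who can decide VERTEX-COVER can therefore decide INDEPENDENT-SET at essentially no extra cost, which is the precise sense in which the former is "at least as hard" as the latter.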
See main article: Free will. Philosophers of the Enlightenment worked to insulate human free will from reductionism. Descartes separated the material world of mechanical necessity from the world of mental free will. German philosophers introduced the concept of the "noumenal" realm that is not governed by the deterministic laws of "phenomenal" nature, where every event is completely determined by chains of causality. The most influential formulation was by Immanuel Kant, who distinguished between the causal deterministic framework the mind imposes on the world—the phenomenal realm—and the world as it exists for itself, the noumenal realm, which, as he believed, included free will. To insulate theology from reductionism, 19th century post-Enlightenment German theologians, especially Friedrich Schleiermacher and Albrecht Ritschl, used the Romantic method of basing religion on the human spirit, so that it is a person's feeling or sensibility about spiritual matters that comprises religion.[23]
Most common philosophical understandings of causation involve reducing it to some collection of non-causal facts. Opponents of these reductionist views have given arguments that the non-causal facts in question are insufficient to determine the causal facts.[24]
Alfred North Whitehead's metaphysics opposed reductionism. He referred to this as the "fallacy of misplaced concreteness". His scheme was to frame a rational, general understanding of phenomena, derived from our reality.
An alternative term for ontological reductionism is fragmentalism,[25] often used in a pejorative sense.[26] In cognitive psychology, George Kelly developed "constructive alternativism" as a form of personal construct psychology and an alternative to what he considered "accumulative fragmentalism". For this theory, knowledge is seen as the construction of successful mental models of the exterior world, rather than the accumulation of independent "nuggets of truth".[27] Others argue that inappropriate use of reductionism limits our understanding of complex systems. In particular, ecologist Robert Ulanowicz says that science must develop techniques to study ways in which larger scales of organization influence smaller ones, and also ways in which feedback loops create structure at a given level, independently of details at a lower level of organization. He advocates and uses information theory as a framework to study propensities in natural systems.[28] The limits of the application of reductionism are claimed to be especially evident at levels of organization with greater complexity, including living cells,[29] biological neural networks, ecosystems, society, and other systems formed from assemblies of large numbers of diverse components linked by multiple feedback loops.[30]