Quantum foundations is a discipline of science that seeks to understand the most counter-intuitive aspects of quantum theory, to reformulate it, and even to propose generalizations thereof. Unlike other physical theories, such as general relativity, the defining axioms of quantum theory are quite ad hoc, with no obvious physical intuition behind them. While they lead to the right experimental predictions, they do not come with a mental picture of the world in which they fit.
There exist different approaches to resolve this conceptual gap:

- First, one can contrast quantum physics with classical physics: by identifying scenarios, such as Bell experiments, where quantum theory radically deviates from classical predictions, one hopes to gain physical insight into its structure.
- Second, one can attempt a re-derivation of the quantum formalism in terms of operational axioms.
- Third, one can search for a full correspondence between the mathematical elements of the quantum framework and physical phenomena; any such correspondence is called an interpretation.
- Fourth, one can renounce quantum theory altogether and propose a different model of the world.

Research in quantum foundations is structured along these roads.
See main article: Quantum nonlocality.
Two or more separate parties conducting measurements over a quantum state can observe correlations which cannot be explained with any local hidden variable theory.[1] [2] Whether this should be regarded as proving that the physical world itself is "nonlocal" is a topic of debate,[3] [4] but the terminology of "quantum nonlocality" is commonplace. Nonlocality research efforts in quantum foundations focus on determining the exact limits that classical or quantum physics enforces on the correlations observed in a Bell experiment or more complex causal scenarios.[5] This research program has so far provided a generalization of Bell's theorem that allows falsifying all classical theories with a superluminal, yet finite, hidden influence.[6]
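To make these limits concrete, the sketch below (a hypothetical illustration using numpy, not drawn from the cited works) contrasts the CHSH Bell expression under local hidden variables, which is bounded by 2, with the value 2√2 attained by measuring a singlet state at suitable angles:

```python
import numpy as np

def correlator(theta_a, theta_b):
    """Singlet-state correlator E(a, b) = -cos(theta_a - theta_b)."""
    return -np.cos(theta_a - theta_b)

# Tsirelson-optimal measurement angles for Alice (a, a') and Bob (b, b').
a, a_p = 0.0, np.pi / 2
b, b_p = np.pi / 4, -np.pi / 4

chsh = (correlator(a, b) + correlator(a, b_p)
        + correlator(a_p, b) - correlator(a_p, b_p))
print(abs(chsh))   # ~2.828, i.e. 2*sqrt(2): the quantum (Tsirelson) value

# Classical bound: every local deterministic strategy assigns fixed +/-1
# outcomes to each setting, and no assignment exceeds |CHSH| = 2.
best = max(abs(A0 * B0 + A0 * B1 + A1 * B0 - A1 * B1)
           for A0 in (-1, 1) for A1 in (-1, 1)
           for B0 in (-1, 1) for B1 in (-1, 1))
print(best)        # 2
```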
See main article: Quantum contextuality.
Nonlocality can be understood as an instance of quantum contextuality. A situation is contextual when the value of an observable depends on the context in which it is measured (namely, on which other observables are being measured as well). The original definition of measurement contextuality can be extended to state preparations and even general physical transformations.[7]
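A standard illustration of contextuality, quoted here as a worked example rather than taken from the text above, is the Peres–Mermin square of two-qubit observables. The numpy sketch below checks the algebraic constraints that defeat any context-independent assignment of values:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])
kron = np.kron

# The Peres-Mermin square: observables within any row or column commute,
# so each such triple can be measured jointly ("a context").
square = [[kron(Z, I2), kron(I2, Z), kron(Z, Z)],
          [kron(I2, X), kron(X, I2), kron(X, X)],
          [kron(Z, X), kron(X, Z), kron(Y, Y)]]

I4 = np.eye(4)
for r in range(3):   # every row multiplies to +I ...
    print(np.allclose(square[r][0] @ square[r][1] @ square[r][2], I4))
for c in range(3):   # ... while the columns multiply to +I, +I, -I: no
    prod = square[0][c] @ square[1][c] @ square[2][c]   # fixed +/-1 values
    print(np.allclose(prod, -I4) if c == 2 else np.allclose(prod, I4))
```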
A physical property is epistemic when it represents our knowledge or beliefs about the value of a second, more fundamental feature. The probability of an event occurring is an example of an epistemic property. In contrast, a non-epistemic or ontic variable captures the notion of a “real” property of the system under consideration.
There is an ongoing debate on whether the wave function represents the epistemic state of a yet-to-be-discovered ontic variable or is, on the contrary, a fundamental entity.[8] Under some physical assumptions, the Pusey–Barrett–Rudolph (PBR) theorem demonstrates the inconsistency of quantum states as epistemic states, in the sense above.[9] Note that, in QBism[10] and Copenhagen-type[11] views, quantum states are still regarded as epistemic, not with respect to some ontic variable, but with respect to one's expectations about future experimental outcomes. The PBR theorem does not exclude such epistemic views on quantum states.
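The core of the PBR argument can be checked numerically. The sketch below (numpy assumed; the basis is the entangled measurement used in the original two-preparation argument) verifies that each measurement outcome never occurs on one of the four preparations built from |0⟩ and |+⟩:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)
kron = np.kron

preparations = [kron(ket0, ket0), kron(ket0, plus),
                kron(plus, ket0), kron(plus, plus)]

# The entangled PBR measurement basis for the two systems.
basis = [(kron(ket0, ket1) + kron(ket1, ket0)) / np.sqrt(2),
         (kron(ket0, minus) + kron(ket1, plus)) / np.sqrt(2),
         (kron(plus, ket1) + kron(minus, ket0)) / np.sqrt(2),
         (kron(plus, minus) + kron(minus, plus)) / np.sqrt(2)]

gram = np.array([[np.vdot(u, v) for v in basis] for u in basis])
print(np.allclose(gram, np.eye(4)))          # it is an orthonormal basis

for i, psi in enumerate(preparations):
    # Outcome i has probability ~0 on preparation i: the measurement
    # "antidistinguishes" the four preparations.
    print(abs(np.vdot(basis[i], psi)) ** 2)
```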
Some of the counter-intuitive aspects of quantum theory, as well as the difficulty of extending it, follow from the fact that its defining axioms lack a physical motivation. An active area of research in quantum foundations is therefore to find alternative formulations of quantum theory which rely on physically compelling principles. These efforts come in two flavors, depending on the desired level of description of the theory: the so-called Generalized Probabilistic Theories approach and the Black boxes approach.
See main article: Generalized probabilistic theories. Generalized Probabilistic Theories (GPTs) are a general framework to describe the operational features of arbitrary physical theories. Essentially, they provide a statistical description of any experiment combining state preparations, transformations and measurements. The framework of GPTs can accommodate classical and quantum physics, as well as hypothetical non-quantum physical theories which nonetheless possess quantum theory's most remarkable features, such as entanglement or teleportation.[12] Notably, a small set of physically motivated axioms is enough to single out the GPT representation of quantum theory.[13]
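As a minimal illustration of the GPT recipe, the sketch below presents the qubit in GPT form: states and effects are real vectors, and outcome probabilities arise from a linear pairing between them. The function names are illustrative, not part of any standard library:

```python
import numpy as np

def state(r):
    # GPT state of a qubit: the vector (1, r) for a Bloch vector r.
    return np.concatenate(([1.0], np.asarray(r, dtype=float)))

def effect(a, outcome):
    # Two-outcome measurement along unit direction a; outcome is +1 or -1.
    return np.concatenate(([0.5], outcome * np.asarray(a, dtype=float) / 2))

def prob(e, s):
    # The only rule a GPT needs: probability = linear pairing of effect
    # and state, here p = (1 + outcome * a.r) / 2.
    return float(e @ s)

s = state([0, 0, 1])                       # the qubit state |0>
print(prob(effect([0, 0, 1], +1), s))      # 1.0: measuring Z on |0> gives +1
print(prob(effect([1, 0, 0], +1), s))      # 0.5: measuring X on |0> is random
```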
L. Hardy introduced the concept of a GPT in 2001, in an attempt to re-derive quantum theory from basic physical principles.[13] Although Hardy's work was very influential (see the follow-ups below), one of his axioms was regarded as unsatisfactory: it stipulated that, of all the physical theories compatible with the rest of the axioms, one should choose the simplest one.[14] The work of Dakic and Brukner eliminated this “axiom of simplicity” and provided a reconstruction of quantum theory based on three physical principles.[14] This was followed by the more rigorous reconstruction of Masanes and Müller.[15]
Axioms common to these three reconstructions are:

- The subspace axiom: systems which can store the same amount of information are physically equivalent.
- Local tomography: to characterize the state of a composite system it is enough to conduct measurements at each part.
- Reversibility: for any two extremal states (i.e., states which are not statistical mixtures of other states), there exists a reversible physical transformation that maps one into the other.
An alternative GPT reconstruction proposed by Chiribella et al.[16] [17] around the same time is also based on the purification postulate: any state S_A of system A is the marginal of a pure state T_AB of a larger composite system AB. In addition, any two purifications T_AB, T′_AB of S_A are connected by a reversible physical transformation on system B.
The use of purification to characterize quantum theory has been criticized on the grounds that it also applies in the Spekkens toy model.[18]
Against the success of the GPT approach, it can be objected that all such works merely recover finite-dimensional quantum theory. In addition, none of the previous axioms can be experimentally falsified unless the measurement apparatuses are assumed to be tomographically complete.
See main article: Categorical quantum mechanics.
Categorical Quantum Mechanics (CQM) or Process Theories are a general framework to describe physical theories, with an emphasis on processes and their compositions.[19] It was pioneered by Samson Abramsky and Bob Coecke. Besides its influence in quantum foundations, most notably the use of a diagrammatic formalism, CQM also plays an important role in quantum technologies, chiefly in the form of the ZX-calculus. It has also been used to model theories outside of physics, for example the DisCoCat compositional model of natural language meaning.
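The compositional viewpoint can be illustrated without any category-theoretic machinery. In the sketch below (a hypothetical numpy rendering, with diagrams replaced by matrices), sequential composition is matrix multiplication and parallel composition is the Kronecker product:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # a process: Hadamard
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]])     # a two-wire process

def then(f, g):
    # Sequential composition: plug the output of f into the input of g.
    return g @ f

def beside(f, g):
    # Parallel composition: f on one wire, g on another, side by side.
    return np.kron(f, g)

# The Bell-pair circuit (H on the first wire, then CNOT) as one process:
bell_circuit = then(beside(H, np.eye(2)), CNOT)
print(bell_circuit @ np.array([1, 0, 0, 0]))      # (|00> + |11>)/sqrt(2)
```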
See main article: Quantum nonlocality.
In the black box or device-independent framework, an experiment is regarded as a black box where the experimentalist introduces an input (the type of experiment) and obtains an output (the outcome of the experiment). Experiments conducted by two or more parties in separate labs are hence described by their statistical correlations alone.
From Bell's theorem, we know that classical and quantum physics predict different sets of allowed correlations. It is expected, therefore, that far-from-quantum physical theories should predict correlations beyond the quantum set. In fact, there exist instances of theoretical non-quantum correlations which, a priori, do not seem physically implausible.[20] [21] [22] The aim of device-independent reconstructions is to show that all such supra-quantum examples are precluded by a reasonable physical principle.
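One well-known instance is the Popescu–Rohrlich box, defined by P(a,b|x,y) = 1/2 whenever a XOR b = x AND y and zero otherwise. The plain-Python sketch below (illustrative, not from the cited works) checks that this box respects no-signalling while attaining CHSH = 4, above the quantum maximum 2√2:

```python
import itertools

def P(a, b, x, y):
    # Popescu-Rohrlich box: perfectly correlated with the product of inputs.
    return 0.5 if (a ^ b) == (x & y) else 0.0

# No-signalling: Alice's marginal does not depend on Bob's input y.
for a, x in itertools.product((0, 1), repeat=2):
    marginals = {y: sum(P(a, b, x, y) for b in (0, 1)) for y in (0, 1)}
    assert marginals[0] == marginals[1]

def E(x, y):
    # Correlator E(x, y) = sum over outcomes of (-1)^(a+b) P(a,b|x,y).
    return sum((-1) ** (a + b) * P(a, b, x, y)
               for a in (0, 1) for b in (0, 1))

print(E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1))   # 4.0: the algebraic maximum
```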
The physical principles proposed so far include no-signalling,[22] Non-Trivial Communication Complexity,[23] No-Advantage for Nonlocal computation,[24] Information Causality,[25] Macroscopic Locality,[26] and Local Orthogonality.[27] All these principles limit the set of possible correlations in non-trivial ways. Moreover, they are all device-independent: they can be falsified under the sole assumption that we can decide whether two or more events are space-like separated. The drawback of the device-independent approach is that, even taken together, all the aforementioned physical principles do not suffice to single out the set of quantum correlations.[28] In other words, all such reconstructions are partial.
See main article: Interpretations of quantum mechanics.
An interpretation of quantum theory is a correspondence between the elements of its mathematical formalism and physical phenomena. For instance, in the pilot wave theory, the quantum wave function is interpreted as a field that guides the particle trajectory and evolves with it via a system of coupled differential equations. Most interpretations of quantum theory stem from the desire to solve the quantum measurement problem.
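For concreteness, the correspondence invoked in the pilot-wave example is the standard guidance equation, quoted here in its textbook form (for particles of mass m_k at configuration Q_1, ..., Q_N) rather than taken from the text above:

```latex
% Pilot-wave (de Broglie-Bohm) theory: the wave function obeys the
% Schrodinger equation, and the particle configuration is guided by it.
i\hbar\,\frac{\partial \psi}{\partial t} = H\psi, \qquad
\frac{dQ_k}{dt} = \frac{\hbar}{m_k}\,
  \operatorname{Im}\!\left(\frac{\nabla_k \psi}{\psi}\right)(Q_1,\dots,Q_N)
```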
In an attempt to reconcile quantum and classical physics, or to identify non-classical models with a dynamical causal structure, some modifications of quantum theory have been proposed.
Collapse models posit the existence of natural processes which periodically localize the wave function.[29] Such theories provide an explanation for the nonexistence of superpositions of macroscopic objects, at the cost of abandoning unitarity and exact energy conservation.
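A toy rendition of such a localization process (a GRW-style jump; the grid and the width r_C below are illustrative choices, not the physical values proposed in [29]) can be simulated in a few lines, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]
r_C = 1.0                                  # localization width (toy value)

# A macroscopic-style superposition: two well-separated Gaussian lumps.
psi = np.exp(-(x - 4) ** 2) + np.exp(-(x + 4) ** 2)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

def localize(psi, rng):
    # A jump multiplies psi by a Gaussian of width r_C; the collapse centre
    # x_c is drawn with probability equal to the norm of the collapsed state.
    weights = np.array([np.sum(np.abs(np.exp(-(x - xc) ** 2 / (2 * r_C ** 2))
                                      * psi) ** 2) for xc in x])
    xc = rng.choice(x, p=weights / weights.sum())
    out = np.exp(-(x - xc) ** 2 / (2 * r_C ** 2)) * psi
    return out / np.sqrt(np.sum(np.abs(out) ** 2) * dx), xc

psi_after, xc = localize(psi, rng)
print(xc)   # close to +4 or -4: only one branch of the superposition survives
```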
In Sorkin's quantum measure theory (QMT), physical systems are not modeled via unitary rays and Hermitian operators, but through a single matrix-like object, the decoherence functional.[30] The entries of the decoherence functional determine the feasibility of experimentally discriminating between two or more different sets of classical histories, as well as the probabilities of each experimental outcome. In some models of QMT the decoherence functional is further constrained to be positive semidefinite (strong positivity). Even under the assumption of strong positivity, there exist models of QMT which generate stronger-than-quantum Bell correlations.[31]
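For orientation, the sketch below (numpy assumed; the state and dynamics are illustrative) builds the decoherence functional of a qubit over two-time histories, D(h, h′) = Tr[C_h ρ C_h′†] with class operators C_h. Being derived from quantum theory, this example is automatically strongly positive, and its diagonal carries the candidate probabilities:

```python
import numpy as np

rho = np.array([[1, 0], [0, 0]], dtype=complex)     # initial state |0><0|
theta = np.pi / 4
U = np.array([[np.cos(theta), -np.sin(theta)],      # dynamics between the
              [np.sin(theta),  np.cos(theta)]],     # two measurement times
             dtype=complex)
P = [np.diag([1, 0]).astype(complex), np.diag([0, 1]).astype(complex)]

# A history h = (a1, a2) fixes the outcome at each time; its class operator
# is the time-ordered product C_h = P[a2] U P[a1].
histories = [(a1, a2) for a1 in (0, 1) for a2 in (0, 1)]
C = {h: P[h[1]] @ U @ P[h[0]] for h in histories}

D = np.array([[np.trace(C[h] @ rho @ C[g].conj().T) for g in histories]
              for h in histories])

print(np.allclose(D, D.conj().T))                  # Hermitian
print(np.all(np.linalg.eigvalsh(D) > -1e-12))      # strongly positive (PSD)
print(np.real(np.diag(D)), np.real(D.trace()))     # probabilities; trace = 1
```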
The formalism of process matrices starts from the observation that, given the structure of quantum states, the set of feasible quantum operations follows from positivity considerations. Namely, for any linear map from states to probabilities one can find a physical system where this map corresponds to a physical measurement. Likewise, any linear transformation that maps composite states to states corresponds to a valid operation in some physical system. In view of this trend, it is reasonable to postulate that any higher-order map from quantum instruments (namely, measurement processes) to probabilities should also be physically realizable.[32] Any such map is termed a process matrix. As shown by Oreshkov et al.,[32] some process matrices describe situations where the notion of global causality breaks down.
The starting point of this claim is the following thought experiment: two parties, Alice and Bob, enter a building and end up in separate rooms. The rooms have ingoing and outgoing channels through which a quantum system periodically enters and leaves each room. While those systems are in the rooms, Alice and Bob can interact with them in any way; in particular, they can measure some of their properties.
Since Alice and Bob's interactions can be modeled by quantum instruments, the statistics they observe when they apply one instrument or another are given by a process matrix. As it turns out, there exist process matrices which would guarantee that the measurement statistics collected by Alice and Bob are incompatible with Alice interacting with her system before, after, or at the same time as Bob, or with any convex combination of these three situations.[32] Such processes are called acausal.
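The acausal example of Oreshkov et al. can be written down explicitly. The sketch below (numpy assumed) constructs the two-party process matrix reported in,[32] on the ordered systems A_in, A_out, B_in, B_out, and verifies only its validity conditions (positivity and normalization); the violation of a causal inequality is established in the original paper:

```python
import numpy as np

I = np.eye(2)
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def tens(*ops):
    # Tensor product over the four systems A_in, A_out, B_in, B_out.
    out = np.eye(1)
    for op in ops:
        out = np.kron(out, op)
    return out

# The process matrix of Oreshkov et al. (up to labeling conventions).
W = 0.25 * (tens(I, I, I, I)
            + (tens(I, Z, Z, I) + tens(Z, I, X, Z)) / np.sqrt(2))

print(np.all(np.linalg.eigvalsh(W) > -1e-12))   # positive semidefinite
print(np.trace(W))                               # 4 = d(A_out) * d(B_out)
```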