The Everett Interpretation of Quantum Mechanics
Philosophy Faculty, Oxford University, July 19–21, 2007
Edited by Andy Ross
Quantum mechanics has been with us for over 80 years and still there is no consensus on what it means. The Everett interpretation has been with us for 50 of those years and is now arguably the simplest, most credible explanation we have of the world. It requires no additional assumptions, no conceptual divisions between observers and observed, applies to the universe as
a whole, and naturally explains probabilities arising from quantum mechanics.

Abstracts

The Everett Interpretation: 50 years on
Simon Saunders
The Everett interpretation of quantum mechanics as presently construed breaks down into an account of structure and ontology, a theory of evidence, and a theory of probability. I present some background in the context of Everett's original paper.

Decoherence and ontology
David Wallace
Decoherence is central to the Everett interpretation. Within the unitarily evolving quantum state, decoherence creates autonomous, stable, quasiclassical systems which are approximately isomorphic to classical universes. These are the many worlds of the Everett interpretation – worlds in the sense of approximately isolated, approximately classical chunks of a larger reality. To see this requires only the mathematics of decoherence theory and the right understanding of higher-order ontology. I claim that, given decoherence, unitary quantum mechanics is and must be a many-worlds theory.

Can the world be only wavefunction?
Tim Maudlin
A common understanding of the many-worlds theory holds that it is ontologically monistic, postulating only the existence of the wavefunction and nothing else. But it is hard to see how such an austere ontology can make comprehensible contact with the experimental facts.

Two dogmas about quantum mechanics
Jeffrey Bub and Itamar Pitowsky
We argue for an information-theoretic formulation of quantum mechanics. We consider a ‘no cloning’ principle as the crucial principle demarcating classical from non-classical information theories. We show that ‘no cloning’ entails that any measurement must sometimes irreversibly change the state of the measured system. So no complete dynamical account of measurement is possible if the ‘no cloning’ principle is true.
We reject the dogmas that the process of measurement should always be open to a complete dynamical analysis and that the quantum state is analogous to the classical state as a representation of physical reality. We show that cloning is possible in both Bohmian and Everettian versions of quantum mechanics.

Everett and evidence
Wayne Myrvold and Hilary Greaves
The Everett interpretation must do justice to statistical evidence in which observed relative frequencies closely match calculated probabilities in a theory. Since, on the Everett interpretation, all outcomes with nonzero amplitude are actualized on different branches, it is not obvious that sense can be made of ascribing probabilities to outcomes of experiments. It is incumbent on the Everettian either to make sense of such probabilities or to explain how the usual statistical analysis of experimental results continues to count as evidence for quantum mechanics. We give an account of theory confirmation that applies to branching-universe theories but does not presuppose their correctness.

Probability in the Everett picture
David Z. Albert
I try to sharpen a number of worries about the possibility of making sense of quantum-mechanical probabilities in the Everett picture.

Time-symmetric quantum mechanics and the many-worlds interpretation
Lev Vaidman
I introduce a formalism that describes a quantum system at a given time not only by the standard, forward-evolving wave function, but also by a backward-evolving quantum state. This changes the picture of a branching tree of worlds. Instead of a tree that starts from a single root state and splits at every quantum measurement, future measurements split the worlds retroactively. Ideal quantum measurements yield identical forward- and backward-evolving quantum states. For macroscopic objects, splitting happens at quantum measurement. But quantum objects are described by backward-evolving quantum states defined by future measurements that split the world retroactively.
Generalizing Everett's quantum mechanics for quantum cosmology
James B. Hartle
Everett took seriously the idea that quantum mechanics could apply to the universe. His ideas have since been extended to create the modern synthesis called decoherent histories or consistent histories. This is a quantum framework adequate for cosmology when gross quantum fluctuations in the geometry of spacetime can be neglected. A further generalization is needed to incorporate quantum gravity. I review a generalized quantum mechanics of cosmological spacetime geometry.

Some remarkable implications of probabilities without time
Andreas Albrecht
I consider the ambiguity in quantum gravity that arises from the choice of clock. This ambiguity leads to an absolute lack of predictability, a complete absence of physical laws. I consider an approach that could lead to a certain amount of predictability in physics.

Explaining probability
Simon Saunders
In the Everett interpretation of quantum mechanics, physical probability is identified with categorical physical properties and relations. I focus on the place of uncertainty in EQM, the semantics of uncertainty, and the explanation of the epistemology of probability.

Pilot-wave theory: Everett in denial?
Antony Valentini
I reply to claims that the pilot-wave theory of de Broglie and Bohm is really a
many-worlds theory with a superfluous configuration appended to one of the worlds. I show that from the perspective of pilot-wave theory, many worlds are
an illusion.

Press reactions
Edited by Andy Ross

Parallel universes make quantum sense
New Scientist, September 21, 2007
The days when physicists could ignore the concept of parallel
universes may have come to an end. David Deutsch at the University of Oxford and
colleagues have shown that key equations of quantum mechanics arise from the
mathematics of parallel universes. "This work will go down as one of the most
important developments in the history of science," says Andy Albrecht, a
physicist at the University of California at Davis.

Parallel universes really do exist, say theorists
By Roger Highfield
Quantum mechanics describes the strange things that happen in the
subatomic world. By one interpretation, nothing at the subatomic scale can
really be said to exist until it is observed. Until then, particles occupy
nebulous "superposition" states, in which they can have simultaneous "up" and
down" spins, or appear to be in different places at the same time.

Parallel universes exist — study
Parallel universes really do exist, according to a mathematical
discovery by Oxford scientists. The parallel universe theory, first proposed in
1957 by the US physicist Hugh Everett, helps explain mysteries of quantum
mechanics that have baffled scientists for decades, it is claimed. In Everett's
"many worlds" universe, every time a new physical possibility is explored, the
universe splits. Given a number of possible alternative outcomes, each one is
played out in its own universe.

Technical Details

The Everett Interpretation: 50 years on
Simon Saunders
Edited by Andy Ross (with apologies to Simon)

The problem of measurement

The problem of measurement is the problem of reconciling two kinds of dynamical evolution postulated in quantum mechanics. The first kind is deterministic and incorporates spacetime symmetries. It is the unitary dynamics. The second is indeterministic, apparently unrelated to any spacetime symmetry, and without any dynamical structure. It is the quantum jump, or the collapse of the wavefunction onto one of a large number of wavefunctions that were previously superposed, when a measurement is performed. The projection postulate is that on collapse, at least in the case of an experiment where the quantity measured can be measured again on the same system, a new quantum-mechanical state is introduced. The dynamical variable that the experiment is designed to measure is assigned a value in this way.

Physicists have historically tried to see these measurement postulates as a reflection of some philosophical limitation on physical theorizing or on the expression of laws. If so, the measurement postulates need not signify anything wrong with quantum mechanics. We suppose that a satisfactory solution of the problem of measurement must meet the following conditions:

(1) The problem of measurement should be solved by clear and simple reasoning that can at least schematically be stated in nonrelativistic quantum mechanics and can at least schematically be applied to the universe as a whole.

(2) The solution should be applicable to relativistic quantum theory as well, and specifically to the standard model.

(3) There should be no special status in the interpretation for the observer, experiment, subsystem, or environment, unless questions of evidence or beliefs are explicitly invoked. Otherwise, such entities should be modeled as physical systems or subsystems or physical processes, just like any other.
(4) It is in principle legitimate to view the wavefunction as physically real and as applicable to the universe as a whole.

Everettian quantum mechanics (EQM) as it was originally formulated met (2), (3) and (4), and went some way to meeting (1). But the quantum theory that emerges, purged of the measurement postulates, is fantastical. It avoids the measurement problem only insofar as it describes all physically possible outcomes of such a process as physically real.

Everett's relative states

Everett showed that at the macroscopic level the development of a single component of the wavefunction into a superposition will in a certain sense be invisible. For suppose we have a unitary dynamical evolution taking the total system from an initial state Ψ_0 to a final state Ψ_t. Suppose that the spin system in the initial state | ↑ > couples to the apparatus so as to yield the outcome spin-up with certainty, and likewise when the initial state is | ↓ > the outcome is spin-down with certainty. Then the dynamics is:

Ψ_0 = | ready > × | ↑ >  →  Ψ_t = | spin-up > × | ↑ >
Ψ_0 = | ready > × | ↓ >  →  Ψ_t = | spin-down > × | ↓ >

In either case, no measurement postulate is needed. The outcome can be predicted with certainty merely from the unitary dynamics. If | ready >, | spin-up > and | spin-down > denote the wave function not just of the apparatus but of the environment as well, then these states describe ordinary macroscopic states of affairs.

Now consider the result if the spin system is initially prepared in a superposition of those two states, say c | ↑ > + d | ↓ >. This is supposed to yield trouble. But if we consider the final state as dictated by the same unitary evolution —

Ψ_0 = | ready > × (c | ↑ > + d | ↓ >)  →  Ψ_t = c | spin-up > × | ↑ > + d | spin-down > × | ↓ >    (S)

— then either of the states | spin-up > and | spin-down > likewise describes an ordinary state of affairs, in each of which a definite outcome is recorded, just as before.
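The linearity at work in (S) can be checked in a toy model. The following sketch is my own illustration, not from the text: the three-level apparatus, the completion of the unitary outside the ready subspace, and the amplitude values are all assumptions. It builds the measurement interaction as a permutation matrix and verifies that the superposed input evolves into the entangled state (S):

```python
import numpy as np

# Apparatus levels: 0 = ready, 1 = records spin-up, 2 = records spin-down.
# Spin levels: 0 = up, 1 = down.  Joint basis index = 2*a + s.
# U maps |ready>|up> -> |record-up>|up> and |ready>|down> -> |record-down>|down>;
# the action on the remaining basis states is an arbitrary completion.
U = np.zeros((6, 6))
perm = {0: 2, 1: 5, 2: 0, 5: 1, 3: 3, 4: 4}
for src, dst in perm.items():
    U[dst, src] = 1.0  # permutation matrix, hence unitary

def joint(apparatus_vec, spin_vec):
    """Tensor product state of apparatus and spin."""
    return np.kron(apparatus_vec, spin_vec)

ready = np.array([1.0, 0.0, 0.0])
record_up = np.array([0.0, 1.0, 0.0])
record_down = np.array([0.0, 0.0, 1.0])
up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])

c, d = 0.6, 0.8  # example amplitudes with |c|^2 + |d|^2 = 1

# Apply the same unitary to the superposed input ...
psi0 = joint(ready, c * up + d * down)
psi_t = U @ psi0

# ... and by linearity the output is exactly the entangled state (S).
expected = c * joint(record_up, up) + d * joint(record_down, down)
assert np.allclose(psi_t, expected)
```

Because U is linear, no separate postulate is needed for the superposed input: the state (S) follows from the two certain-outcome cases alone.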
In a series of repetitions of the experiment, with the recording instrument storing the outcomes one by one, the superposition is again a superposition of states each of which describes an ordinary state of affairs: a sequence of outcomes, a definite record of statistics. The superposition itself cannot be encoded in a record in any branch in this way. It is in this sense invisible.

Everett said that relative to the state | spin-up > there is a relative state | ↑ > of the spin system. This provided a way of presenting the basic ideas without talking explicitly of many worlds. Everett called it the relative-state formulation of quantum mechanics.

Everett had little more to say than this. His contribution was in a way rather minimal. Everett pointed out that branching would be invisible so long as everything was branching together. But there is a certain difficulty. Which of Everett's states describe the macroscopic, and why are those states the right ones to choose as defining relative states? What is the natural or preferred basis with respect to which the universe is in a superposition? This is the preferred basis problem.

The interpretation faced another problem. It was intended to make sense of the unitary, covariant, and deterministic dynamics. How is this to be reconciled with the probabilistic interpretation of the theory? As conventionally formulated, probabilities only come into quantum mechanics with the measurement postulates. Thus, it is only the measurement postulates that tell you a superposition like (S) means that one of the states | spin-up > × | ↑ > or | spin-down > × | ↓ > results, with probabilities |c|^2 and |d|^2 respectively. If the superposition actually remains, in what sense does either state occur with some probability?

Decoherence theory

The basis to be used in defining the branching structure is only effective; it should not matter to the macroscopic description if it is tweaked this way or that.
Branching is a real dynamical structure in the universal state: it is decoherence. A basis adapted to this dynamical structure is the one that makes those patterns clear. But it is defined only for all practical purposes (FAPP). The philosophy is an obvious one if classical worlds are higher-order ontology, structures in the universal state. They arise through a coarse-graining of an underlying physics that does not have to be known exactly. Classicality only for all practical purposes was said to be symptomatic of a failure of realism, but from an Everettian point of view that charge is simply a mistake. The fundamental theory itself must be defined precisely, but the classical is an empirical consequence of the theory. And in extracting the empirical consequences of a physical theory, everyone is agreed that approximations can and should play a fundamental role. The underlying philosophy was that superpositions of states describing different macroscopic properties were somehow forbidden.

Another method for defining decoherence was the consistent histories theory. The mathematical tool is the histories formalism itself (a history space), together with the criterion of consistency (or decoherence). The quasiclassical history space yields the structure of the universal state that we have so far been concerned with: the system of branching and approximately classical worlds.

Why not suppose only one of these histories is real? If there is only one world, the universal state has only the meaning of a probability measure on the history space, when what exists is the single history. Why not try to describe it precisely? For a start, the consistency condition had better be precisely satisfied. One is then a long way from the perspective of classicality as an effective theory. It is different if the ultimate reality is the universal state.
In that case a history space concerns only an effective level of description of the structure of the state, better or worse suited to extracting useful phenomenological equations. The structure itself is emergent, imprecise at its boundaries and in its minutiae, like galaxies and planetary systems. Worlds in the Everett interpretation are really like worlds: planetary systems that are tightly bound together, but only weakly coupled to other worlds, and systems without precise borders or edges.

Probability

As branching is only effective, so too is quantum probability. Probabilistic events, according to the Everett interpretation, occur when branching occurs, when an element of the decoherence basis unitarily evolves into a superposition of such elements.

An objection to the Everett interpretation was that if branching really occurs then there is a natural alternative measure over branches to the Born rule: that for which all branches are equiprobable. But if branching only occurs on decoherence, then there is no such measure that applies to branches at the level at which they themselves are defined. There are fat branches and thin ones, as given by the Born rule; there is no number of branches which are fat, no number which are thin.

This is the first of three crucial questions concerning probability. They are:

(i) What of branches with records of anomalous statistics?
(ii) Is there any place for epistemic uncertainty in the face of branching?
(iii) How, if at all, is the Born rule to be justified?

Deutsch derived the Born rule from certain symmetry arguments and an appeal to certain axioms of decision theory. The general idea is this: let rational agents express likelihood relations among quantum experiments M, N, ..., whose outcomes are sets of events E, F, G, ..., which yield dividends whose utility is selected by the agent at will. Let E|M ≥ F|N mean that, in the agent's expectation, it is at least as likely that E will happen given M as that F will happen given N.
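The contrast above between counting branches equally and weighting them by the Born rule can be illustrated numerically. This sketch is my own illustration, not from the text; the weight p, the number of branchings n, and the anomaly threshold are arbitrary choices. After n binary branchings with Born weight p for one outcome, it tallies the branches whose recorded relative frequency deviates from p, both by equal counting and by total Born weight:

```python
from math import comb

p, n = 0.36, 100   # illustrative Born weight and number of branchings
tol = 0.10         # a record is 'anomalous' if its relative frequency
                   # of the p-outcome differs from p by more than tol

anomalous_count = 0       # number of anomalous branches
anomalous_weight = 0.0    # their total Born weight
for k in range(n + 1):
    # comb(n, k) branches record k occurrences; each has weight p^k (1-p)^(n-k)
    if abs(k / n - p) > tol:
        anomalous_count += comb(n, k)
        anomalous_weight += comb(n, k) * p**k * (1 - p)**(n - k)

count_fraction = anomalous_count / 2**n

# By equal counting, most branches carry anomalous statistics;
# by Born weight, the anomalous branches are collectively thin.
assert count_fraction > 0.5
assert anomalous_weight < 0.1
```

This is the point about fat and thin branches in numbers: the equiprobable measure makes anomalous records typical, while the Born measure concentrates weight on records whose statistics track the weights themselves.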
For an experiment M, let E_M be the set of all its possible outcomes, and let Ø be the empty set. Then an ordering of likelihoods is represented by a credence function Pr if:

Pr(Ø|M) = 0 and Pr(E_M|M) = 1
If E and F are disjoint, then Pr(E ∪ F|M) = Pr(E|M) + Pr(F|M)
Pr(E|M) ≥ Pr(F|N) iff E|M ≥ F|N

We suppose agents are rational insofar as they subscribe to the principles:

Transitivity. If E|M ≥ F|N and F|N ≥ G|O, then E|M ≥ G|O. Transitivity requires that likelihoods are comparable.

Separation. There exists some E and M such that E|M is not null. Separation requires that some event is possible.

Dominance. If E is a subset of F, then F|M ≥ E|M for any M, with F|M ~ E|M iff F – E is null, where an event E is null given M if E|M ~ Ø|M.

A further principle introduces the usual Born probability or weight W_M(F) for outcome F on performance of experiment M:

Equivalence. F|M ~ E|N if and only if W_M(F) = W_N(E).

Equivalence is the principle that outcomes of equal weight have equal credence. We are interested in situations where there are enough experiments available for decision theory to bite. We define a set M of quantum experiments to be rich provided that, for any positive real numbers w_1, ..., w_n such that w_1 + ... + w_n = 1, M includes a quantum experiment with n outcomes having weights w_1, ..., w_n.

We can now state the representation theorem, due to Deutsch and Wallace. If the likelihood orderings of a rational agent over a rich set of experiments satisfy Equivalence, then they are uniquely representable by a credence function Pr with Pr(F|M) = W_M(F). If Equivalence can be viewed as a principle of rationality, the Everett interpretation is in good shape.

The notion of objective probability has long troubled empiricists. Credence, or subjective probability, is in contrast perfectly clear. But just why credence should track chance can hardly be explained until we know what chance is. Their relation may have the irreducible status of a brute posit. In EQT it is enough if equal chances have equal credences.

There is another aspect to the assessment of EQT as a probabilistic theory. One who believes EQT is true may match her credences to quantum mechanical weights, but how is one to update a prior probability measure (credence) over two or more competing theories in the face of the observed relative frequencies? Must one already believe that EQT is true in order to deduce from the observed statistics that quantum mechanics is better confirmed than some rival? The Everett interpretation undermines so many common beliefs as to threaten the very basis on which evidential claims for quantum mechanics are evaluated.
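The credence function picked out by the representation theorem can be modelled concretely. The sketch below is my own toy model, not from the text: it represents an experiment as a mapping from outcome labels to Born weights (the labels and weight values are arbitrary), takes Pr(E|M) to be the summed weight of the outcomes in E, and checks the credence-function conditions and Equivalence:

```python
def Pr(E, M):
    """Credence in the set of outcomes E on performance of experiment M,
    defined as the total Born weight of those outcomes."""
    return sum(M[o] for o in E)

# Two toy experiments; each maps outcome labels to Born weights summing to 1.
M = {"a": 0.36, "b": 0.24, "c": 0.40}
N = {"x": 0.36, "y": 0.64}

E_M = set(M)                             # the set of all outcomes of M
assert Pr(set(), M) == 0                 # Pr(Ø|M) = 0
assert abs(Pr(E_M, M) - 1.0) < 1e-12     # Pr(E_M|M) = 1

E, F = {"a"}, {"b", "c"}                 # disjoint events
assert abs(Pr(E | F, M) - (Pr(E, M) + Pr(F, M))) < 1e-12  # additivity

# Equivalence: outcomes of equal weight receive equal credence,
# even across different experiments.
assert Pr({"a"}, M) == Pr({"x"}, N)
```

Richness is what makes the model non-trivial: since any finite list of weights summing to 1 corresponds to some experiment, the ordering represented by Pr is fixed across the whole set of experiments, not just within one.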
This question returns us to (ii) in our list above: whether there is any place for uncertainty in EQM. If the answer to (ii) is no, it might be argued that rational agents can have no notion of a likelihood relation either: if nothing is uncertain, how can any event be more likely than another? First, uncertainty is not needed for the representation theorem, which can just as well use relative weight. Nor is it needed for a (Bayesian) confirmation theory. Second, what is at stake is what our ordinary words actually mean, in a way that is dictated by use. If we go over to EQT, the new theory may or may not be consistent with what we were previously inclined to say. That plunges us back into philosophy.

Granted that the Everett interpretation is a literalist construal of the unitary dynamical evolution, it would be astonishing if a different realist interpretation of the theory were possible. If these claims are true, our best physical theory is telling us that we live in a branching universe. And that the measurement problem is solved.
AR (2012): All this is still deeply interesting work.
