In the previous post, I explained the early attempts to derive the Born Rule for the MWI. These attempts required assumptions for which no justification was given; as a result, critics of the MWI pointed to the lack of justification for the Born Rule as a major weakness of the interpretation.
MWI supporters often had to resort to simply postulating the Born Rule as an additional law of physics. That is not as good as a derivation, which would be a great advantage for the MWI, but it at least puts the MWI on the same footing as most other interpretations. However, it is by no means clear that it is legitimate to do that, either. Many people think that branch-counting (or some form of observer-counting) must be the basis for probabilities in an MWI, as Graham had suggested. Since branch-counting gives the wrong probabilities (as Graham failed to realize), a critic might argue that experiments (which confirm the Born rule) show the MWI must be false.
Thus, MWI supporters were forced to argue that branch-counting did not, in fact, matter. The MWI still had supporters due to its mathematical simplicity and elegance, but when it came to the Born Rule, it was in a weak position.
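The mismatch between branch-counting and the Born Rule can be made concrete with a toy example (the amplitudes below are purely illustrative):

```python
# Toy qubit state sqrt(1/3)|0> + sqrt(2/3)|1>  (amplitudes chosen for illustration)
amps = [(1/3) ** 0.5, (2/3) ** 0.5]

# Born rule: probability of each outcome = |amplitude|^2
born = [a ** 2 for a in amps]

# Naive branch counting: one branch per outcome, so equal weight for each
counted = [1 / len(amps)] * len(amps)

print(born)     # ~[0.333, 0.667] -- matches experiment
print(counted)  # [0.5, 0.5]      -- does not
```

A measurement of this state splits the observer into two branches, so counting branches predicts 50/50; experiments instead confirm the 1/3 vs 2/3 of the Born Rule.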
In the famous Everett FAQ of 1995, Price cited the old 'infinite measurements frequency operator' argument. That was my own first encounter with the problem of deriving the Born Rule for the MWI, and despite being an MWI supporter, I immediately saw the hole in the infinite-measurements argument: real experiments involve only a finite number of measurements.
5) The decision-theoretic approach to deriving the Born Rule
In 1999, David Deutsch created a new approach to deriving the Born Rule for the MWI, based on decision theory. He wrote "Previous attempts ... applied only to infinite sets of measurements (which do not occur in nature), and not to the outcomes of individual measurements (which do). My method is to analyse the behaviour of a rational decision maker who is faced with decisions involving the outcomes of future quantum-mechanical measurements. I shall prove that if he does not assume [the Born Rule], or any other probabilistic postulate, but does believe the rest of quantum theory, he necessarily makes decisions as if [the Born Rule] were true."
Deutsch's approach quickly attracted both supporters and critics. David Wallace came out with a series of papers that defended, simplified and built on the decision theory approach, which is now known as the Deutsch-Wallace approach.
Deutsch's derivation contained an implicit assumption, which Wallace made explicit and called 'measurement neutrality'. Basically, it means that the details of how a measurement is made don't matter. For example, if a second measurement is made along with the first, it is assumed that the probabilities for the outcomes of the first won't be affected. This implies that unitary transformations which preserve the amplitudes of the outcomes don't affect the probabilities. That in turn implies 'equivalence', which states that two branches of equal amplitude have equal probabilities, and which is essentially equivalent to the Born Rule. The Born Rule is then derived from 'equivalence' using simple assumptions cast in the language of decision theory.
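The step from measurement neutrality to 'equivalence' can be sketched with a two-branch state (this is my illustrative example, not Deutsch's or Wallace's notation):

```python
import numpy as np

# Two-outcome state with equal amplitudes (illustrative)
psi = np.array([1.0, 1.0]) / np.sqrt(2)

# A unitary that simply exchanges (relabels) the two branches
swap = np.array([[0.0, 1.0], [1.0, 0.0]])
psi_swapped = swap @ psi

# The swap preserves the amplitude profile of the state, so if nothing
# but amplitudes can matter (measurement neutrality), the two outcomes
# must be assigned equal probabilities -- Wallace's 'equivalence'.
assert np.allclose(psi_swapped, psi)
```

The point of the sketch: since a relabeling unitary maps the state to itself, any probability rule that depends only on the state must treat the two branches identically.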
Wallace acknowledged that 'measurement neutrality' was controversial, admitting "The reasons why we treat the state/observable description as complete are not independent of the quantum probability rule." Indeed, if probabilities depend on something other than amplitudes, then clearly they can change under unitary transformations.
So he offered a direct defense of the 'equivalence' assumption, which formed the basis of the paper that was for a long time considered the best statement of the DW approach, certainly as of the 2007 conferences. New Scientist magazine proclaimed that his derivation of the Born Rule in the MWI was "rigorous" and was forcing people to take the MWI seriously.
His basic argument was that anything the person making the decision doesn't care about cannot affect the decision. This included the number of sub-branches, but he also took care to argue that the number of sub-branches can't matter because it is not well-defined.
Consider Albert's hypothetical fatness rule, in which probabilities are proportional both to the squared amplitudes and to the observer's mass. This obviously violates 'equivalence'. According to Wallace's argument, the decider should ignore his mass unless it comes into play for the decision, so the fatness rule is ruled out. But the argument is circular: the decider should care about his mass if his mass does in fact affect the probabilities.
My critique of Wallace's approach is presented in more detail here, where I also cover his more recent paper.
In his 2009 paper, Wallace takes a different approach. Perhaps recognizing that assuming 'equivalence' is practically the same as just assuming the Born Rule, he instead makes other assumptions, couched in the language of decision theory, from which he derives 'equivalence'. The crucial new assumption is what he calls 'diachronic consistency'. In addition to consistency of desires over time, it contains the assumption that the measure is conserved as a function of time, for which no justification is given. The classical version of diachronic consistency is unproblematic, and only a very careful reading of the paper would reveal the important difference, were it not for the fact that Wallace helpfully notes that Albert's fatness rule violates it.
6) Zurek's envariance
W. Zurek attempted to derive the Born Rule using symmetries that he called 'envariance', or environment-assisted invariance. While interesting, his derivation rests on unjustified assumptions. The most important is that all parts of a branch, and all observers in a branch, have the same "probability". Albert's fatness rule provides an obvious counterexample. I also note that a substate with no observers in it cannot meaningfully be assigned any effective probability.
He uses this, together with another unjustified assumption that is similar to locality of probabilities, to obtain what Wallace called 'equivalence' and then the Born Rule from that. Because the latter part of Zurek's derivation is similar to the DW approach, the two approaches are sometimes considered similar, although Zurek does not invoke decision theory.
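The core envariance symmetry can be sketched for a two-branch Schmidt state (an illustrative example in my own notation, not Zurek's):

```python
import numpy as np

# Schmidt state (|0>|e0> + |1>|e1>)/sqrt(2); basis ordering is system (x) environment
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

X = np.array([[0.0, 1.0], [1.0, 0.0]])  # swap (Pauli-X)
I = np.eye(2)

# Swap the system's two branches...
swapped = np.kron(X, I) @ psi
# ...and undo the swap by acting on the environment alone
restored = np.kron(I, X) @ swapped

# Because an environment-only operation undoes the system swap, Zurek argues
# that the system's probabilities cannot distinguish the two branches,
# so for equal amplitudes they must be equal.
assert np.allclose(restored, psi)
```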
7) Hanson's Mangled Worlds
Robin Hanson came up with a radical new attempt to derive the Born Rule in 2003. It was similar to Graham's old world-counting proposal in that Hanson proposed to count sub-branches of the wavefunction as the basis for the probabilities.
The new element Hanson proposed was that the dynamics of sub-branches of small amplitude would be ruined, or 'mangled', by interference from larger sub-branches of the wavefunction. Thus, rather than simply count sub-branches, he would count only the ones with large enough amplitude to escape the 'mangling'.
Due to repeated microscopic scattering events, each sub-branch's squared amplitude is multiplied by many random factors, so its logarithm performs a random walk and the sub-branches' squared amplitudes come to follow a log-normal distribution. Interference ('mangling') from large-amplitude branches imposes a minimum-amplitude cutoff. If the cutoff falls in the right numerical range and is uniform across all branches, then due to the mathematical form of the log-normal function, the number of surviving sub-branches is proportional to the square of the original amplitude, yielding the Born Rule.
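Under Hanson's assumptions, the near-proportionality can be checked with a Gaussian-tail calculation. In this sketch the parameter values are purely illustrative: `sigma` is the spread of the log-normal distribution (assumed large), and the cutoff is placed about `sigma**2` above the reference log-squared-amplitude, which is the regime where the tail count scales roughly linearly with the squared amplitude:

```python
import math

def tail(z):
    # Upper-tail probability of a standard normal, Q(z) = P(Z > z)
    return 0.5 * math.erfc(z / math.sqrt(2))

def surviving_fraction(p, ln_cutoff, sigma):
    # Fraction of a main branch's sub-branches whose log squared amplitude
    # (normally distributed around ln(p) with spread sigma after the
    # multiplicative random walk) lies above the mangling cutoff.
    z = (ln_cutoff - math.log(p)) / sigma
    return tail(z)

sigma = 10.0            # assumed width of the log-normal spread
ln_cutoff = sigma ** 2  # cutoff placed so that z is approximately sigma

for p1, p2 in [(1.0, 2.0), (0.5, 2.0), (1.0, 3.0)]:
    ratio = surviving_fraction(p2, ln_cutoff, sigma) / surviving_fraction(p1, ln_cutoff, sigma)
    print(f"squared-amplitude ratio {p2 / p1:.1f} -> surviving-branch ratio {ratio:.3f}")
```

With these parameters the surviving-branch ratios come out close to the squared-amplitude ratios (roughly 2.01, 4.06 and 3.02), which is the Born-Rule scaling Hanson's argument needs; away from this cutoff regime the proportionality fails.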
Unfortunately, this Mangled Worlds picture relies on many highly dubious assumptions, most importantly the uniformity of the 'mangling' cutoff. Branches will not interfere much with other branches unless they are very similar, so there will be no uniformity: small-amplitude main branches will have smaller-amplitude sub-branches, but also smaller interference from large main branches and thus a smaller cutoff.
Even aside from that, while the idea of branch-counting has some appeal, it is clear that observer-counting (with computationalism, implementation-counting) is what is fundamentally of interest. Nonetheless, 'Mangled Worlds' is an interesting proposal, and is the inspiration for a possible approach to attempt to count implementations of computations for the MWI, which will be discussed in more detail in later posts. That does require some new physics though, in the form of random noise in the initial conditions which acts to provide the uniform cutoff scale that is otherwise not present.
In the next post, proposals for MWIs that include modifications of physics will be discussed.