Friday, July 31, 2009
Top 12 things to know about physics
But if you really want to understand QM - not fully, but well enough to follow the current state of understanding, rather than just to get some entertainment value out of it - you should know a lot about physics. I can't teach all that here.
What I will do is give a few short sentences on each topic. I'll probably edit in some references.
Top 12 things to know about physics:
1. Classical mechanics
In classical (Newtonian) mechanics, there are particles (which can be points or be extended). Each has a position and velocity. There are equations that describe how these things change as a function of time, based on force laws such as Newton's famous law of gravitation.
Key concepts: Vector, position, velocity, acceleration, mass, force, momentum, angular momentum, potential energy, kinetic energy, conservation laws, force laws
2. The harmonic oscillator
A spring-mass system is the paradigmatic example of a harmonic oscillator. The restoring force increases linearly with the distance from the equilibrium point, which leads to sinusoidal motion if the system is left to oscillate freely. This oscillation is more than a little similar to that in the next subject, waves. If there is friction (damping), the oscillation decays exponentially, which can be represented mathematically as a complex (real + imaginary part) frequency - see the sketch below.
Key concepts: Linear, nonlinear, frequency, amplitude, sinusoidal, complex exponential
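To make the complex-frequency trick concrete, here is a minimal Python sketch (the mass, spring constant, and damping values are arbitrary illustrative numbers): the damped oscillation is just the real part of a single complex exponential whose frequency has an imaginary part equal to the decay rate.

```python
import math, cmath

m, k, b = 1.0, 4.0, 0.4                    # mass, spring constant, damping (arbitrary)
omega0 = math.sqrt(k / m)                  # undamped angular frequency
gamma = b / (2 * m)                        # exponential decay rate
omega_d = math.sqrt(omega0**2 - gamma**2)  # damped frequency (underdamped case)

Omega = omega_d + 1j * gamma               # the complex frequency

for t in (0.0, 1.0, 2.0, 5.0):
    x_real_form = math.exp(-gamma * t) * math.cos(omega_d * t)
    x_cplx_form = cmath.exp(1j * Omega * t).real   # same motion, one exponential
    print(f"t={t:3.1f}  x={x_real_form:+.6f}  vs  {x_cplx_form:+.6f}")
```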
3. Classical waves
Sound, water waves, electromagnetic waves - these are all disturbances of some kind of field that propagate at a specific speed. A field is a function of position, such as air pressure, that can vary with time. Disturbances propagate locally - the neighboring regions are affected first.
Key concepts: Field, amplitude, phase, frequency, wavelength, local, Doppler effect, superposition, interference
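Here is a toy numerical illustration of superposition and interference (the amplitude and wavelength are made-up values, not tied to any particular medium): two equal waves double where they are in phase and cancel where they are out of phase.

```python
import math

def wave(x, amplitude=1.0, wavelength=2.0, phase=0.0):
    """Snapshot of a sinusoidal disturbance as a function of position."""
    return amplitude * math.sin(2 * math.pi * x / wavelength + phase)

x = 0.3  # an arbitrary sample point
in_phase = wave(x) + wave(x)                      # constructive interference
out_of_phase = wave(x) + wave(x, phase=math.pi)   # destructive interference

print(f"single wave:  {wave(x):+.4f}")
print(f"constructive: {in_phase:+.4f}")       # twice the single-wave value
print(f"destructive:  {out_of_phase:+.4f}")   # ~0: the waves cancel
```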
4. Special relativity
You know all that stuff you just learned about classical mechanics? Ahem ... it's all wrong! Well, not all of it, but things are a bit different.
The speed of light is found to be the same relative to any observer, no matter how fast the observer is moving. How can that be? Well, there's time dilation ... and length contraction ... which work together to produce that result. The speed of light is a cosmic speed limit. One important consequence is that if events are simultaneous as measured by observers moving at one velocity, they will not be simultaneous as measured by observers at some other velocity.
Key concepts: E = m c^2, time dilation, length contraction, light cones, relative simultaneity, spacelike separation, Lorentz invariance
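For a feel of the numbers, here is a quick sketch of the Lorentz factor (the 0.8c speed and the 10 m rod are illustrative choices):

```python
import math

c = 299_792_458.0  # speed of light, m/s

def lorentz_gamma(v):
    """gamma = 1 / sqrt(1 - v^2/c^2); blows up as v approaches c."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

v = 0.8 * c
g = lorentz_gamma(v)
print(f"gamma at 0.8c: {g:.3f}")                        # ~1.667
print(f"1 s on the moving clock = {g:.3f} s for us")    # time dilation
print(f"a 10 m rod measures {10 / g:.2f} m in flight")  # length contraction
```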
5. Electromagnetism
Electromagnetic forces hold atoms together, and electromagnetic waves have many important applications. There are positive and negative charges. Like charges repel; opposite charges attract. The force decreases as the inverse square of the distance. Magnetism can be understood as a relativistic consequence of moving electric charges.
Key concepts: Charge, electron, proton, electric field, magnetic field, EM wave
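As a numerical illustration of the inverse-square law (the separation is taken to be the Bohr radius, roughly the size of a hydrogen atom):

```python
k_e = 8.98755e9   # Coulomb constant, N m^2 / C^2
e = 1.60218e-19   # elementary charge, C
a0 = 5.29177e-11  # Bohr radius, m

def coulomb_force(q1, q2, r):
    """Magnitude of the electrostatic force between two point charges."""
    return k_e * abs(q1 * q2) / r**2

print(f"electron-proton force at the Bohr radius: {coulomb_force(e, e, a0):.2e} N")
# inverse square: doubling the distance cuts the force to a quarter
print(f"at twice the distance:                    {coulomb_force(e, e, 2 * a0):.2e} N")
```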
6. General relativity
OK! So, you finally have some grasp of special relativity. You've understood the 'twin paradox', the 'barn paradox', and how it is that observers moving relative to each other each think that the other guy's clock is slower.
But wait ... there's more! Space and time are flexible. Gravity is a bending of spacetime caused by mass. In extreme cases, you might not be able to think of a flat spacetime that is bent out of shape - you might have to build it by gluing pieces together. If mass is concentrated enough, it could collapse to a black hole, from which even light can't escape - you can think of future time as pointing radially towards the center of it.
Closed timelike curves are solutions to the equations of GR in which, in some region, the future loops around to become the past - time travel? It has problems, and there is no known way to make one anyway.
Key concepts: Gravitational redshift, black hole, event horizon, gravitational wave, closed timelike curve, general coordinate invariance, wormhole, naked singularity
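To put a number on "concentrated enough": the Schwarzschild radius r_s = 2GM/c^2 is the size a given mass must be squeezed into to form a black hole. A quick sketch:

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8    # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """Event-horizon radius of a nonrotating black hole of the given mass."""
    return 2 * G * mass_kg / c**2

print(f"Sun   (1.99e30 kg): {schwarzschild_radius(1.99e30) / 1000:.1f} km")  # ~3 km
print(f"Earth (5.97e24 kg): {schwarzschild_radius(5.97e24) * 1000:.1f} mm")  # ~9 mm
```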
7. Cosmology
The universe as a whole is expanding. Looking back towards the past, it was smaller and denser, apparently having originated in a state of very high density and temperature - the Big Bang. According to general relativity taken at face value, it would have begun in a state of infinite density - a singularity. Looking to the future, the universe will get larger and colder; long after the last star has evaporated, it appears that it will just keep expanding. If there's a cosmological constant, it will keep growing exponentially.
Key concepts: Hubble constant, Big Bang, singularity, redshift, cosmological constant, inflation
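A minimal sketch of the expansion law v = H0 * d (taking a round value of H0 near 70 km/s/Mpc; the 100 Mpc distance is an arbitrary example):

```python
H0 = 70.0               # Hubble constant, km/s per megaparsec (approximate)
KM_PER_MPC = 3.0857e19  # kilometers in one megaparsec

d = 100.0  # distance to a galaxy, Mpc (illustrative)
print(f"recession velocity at {d:.0f} Mpc: {H0 * d:.0f} km/s")

# A crude age scale for the universe: the Hubble time 1/H0
hubble_time_yr = (KM_PER_MPC / H0) / 3.156e7  # seconds -> years
print(f"Hubble time 1/H0: {hubble_time_yr:.1e} years")  # ~1.4e10 yr
```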
8. Entropy and Statistical Mechanics
Disorder (entropy) tends to increase over time. As a result, to get anything accomplished, we need a continuous flow of more-ordered stuff coming in while we dump less-ordered stuff going out. For example, sunlight comes in (high energy per photon - the photon being the particle of light - an orderly concentration of energy) and thermal radiation goes out (the same energy distributed over many more, less energetic photons, which is more random since there are more ways to do it).
Statistical mechanics allows us to consider the mechanics of large numbers of particles using probability distributions rather than trying to follow each particle individually. It explains the above tendency, obtaining irreversible average behavior from time-symmetric basic equations of motion.
The thermodynamic 'arrow of time' implies that the conditions near the beginning of the universe were much lower in entropy than a typical possible microstate, which is why the entropy has a lot of room to grow.
Key concepts: Entropy, arrow of time, configuration space, probability, large number of particles
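Here is a back-of-the-envelope sketch of the sunlight-in, heat-out bookkeeping described above (the photon energies are rough order-of-magnitude figures): the same energy leaves as many more, lower-energy photons, which admits far more arrangements.

```python
eV = 1.602e-19        # joules per electron-volt
E_budget = 1.0        # energy to absorb and re-radiate, J (arbitrary)

E_solar = 2.0 * eV    # visible sunlight: roughly 2 eV per photon
E_thermal = 0.1 * eV  # Earth's infrared glow: roughly 0.1 eV per photon

n_in = E_budget / E_solar
n_out = E_budget / E_thermal
print(f"photons absorbed: {n_in:.1e}")
print(f"photons emitted:  {n_out:.1e}  ({n_out / n_in:.0f}x as many)")
# Distributing the same energy over ~20x more photons means vastly more
# possible arrangements -- that multiplicity is the entropy increase.
```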
9. Quantum mechanics
That's right; before delving into interpretation of QM, it helps to know a bit of QM first.
Experiments have shown that everything has wave-like characteristics. Particles such as electrons (and in principle anything made of them, such as people) can exhibit wave-like interference effects. Waves, such as electromagnetic waves, can appear to act as though made up of individual particles (photons in the EM case).
QM explains why matter doesn't collapse: Electrons are attracted to the protons in an atom's nucleus, but if they were confined to a region as small as the nucleus, they would have a very short wavelength. That would require high energy, and with that much energy they would zoom back out. In practice there is a balance between the electron wave resisting being squeezed to a small size and its attraction to the nucleus. Also, electron waves can't occupy the same state as each other, so atoms with more electrons end up with shells of electrons further out, even though they have more protons pulling the electrons tight.
As I explained in my previous post, in QM, it is not correct to say that each particle is a wave in space. Instead, there is a joint wavefunction, which lives on configuration space. I will have more to say about it in other posts.
When a measurement is made, the outcome appears to be random, with the probabilities given by the Born Rule - proportional to the square of the wavefunction. Many observables have a discrete (quantized) set of possible outcomes.
Key concepts: Matter waves, spin, quantized outcomes, wavefunction, Pauli exclusion principle, eigenstates, eigenvalues, Hamiltonian, observable, Schrödinger Picture, Heisenberg Picture, Born Rule, measurement problem, Schrödinger's Cat, entanglement, decoherence, Bell's theorem, Uncertainty Principle
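A toy sketch of the Born Rule (the complex amplitudes are made-up numbers): the probability of each outcome is the squared magnitude of its amplitude, after normalizing so the probabilities sum to 1.

```python
# A made-up superposition over three possible measurement outcomes:
amplitudes = [1 + 1j, 2 + 0j, 0 + 1j]

norm_sq = sum(abs(a) ** 2 for a in amplitudes)  # = 7 here
probabilities = [abs(a) ** 2 / norm_sq for a in amplitudes]

for outcome, p in enumerate(probabilities):
    print(f"outcome {outcome}: probability {p:.4f}")  # 2/7, 4/7, 1/7
print(f"total: {sum(probabilities):.4f}")  # always 1
```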
10. Quantum field theory (QFT)
QFT is the relativistic version of quantum mechanics. Instead of particles, there are fields. There's a small set of fields for every kind of particle - electrons, photons, etc. The wavefunctional lives on the space of configurations of these fields. QFT allows the creation or destruction of 'particles' because really there are no particles - quantized excitations of the fields play the role of particles.
There are problems with QFT. It is necessary to 'renormalize' the fields because interactions produce divergent (infinite) quantities, which must be subtracted out. This can sometimes be done by assuming a minimum length scale, then taking the limit as the scale goes to zero. Even so, some problems remain, but they can usually be ignored by using approximations.
Key concepts: Wavefunctional, field configuration, spin, boson, fermion, antiparticle, locality, renormalization, false vacuum, gauge theory
11. Quantum gravity (QG)
Nobody has yet succeeded in bringing it all together, and until they do, we won't really know what's going on. But even so, there are many important tidbits to know regarding gravity and QM.
There is a for-many-practical-purposes minimum length scale in quantum gravity: Make a small enough black hole, and its quantum wavelength becomes the same size as its event horizon. Add more mass and the event horizon gets bigger; remove mass and the wavelength gets bigger. And you can't just squeeze a wave to a shorter wavelength, because that takes energy, which adds mass (E = m c^2). The crossover scale is called the Planck length. We don't know what's going on below that scale.
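That scale can be built from the fundamental constants alone: l_P = sqrt(hbar G / c^3). A quick sketch of the arithmetic:

```python
import math

hbar = 1.0546e-34  # reduced Planck constant, J s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s

l_planck = math.sqrt(hbar * G / c**3)  # where wavelength ~ horizon size
m_planck = math.sqrt(hbar * c / G)     # mass of such a minimal black hole

print(f"Planck length: {l_planck:.2e} m")   # ~1.6e-35 m
print(f"Planck mass:   {m_planck:.2e} kg")  # ~2.2e-8 kg
```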
However, it is unlikely that there is literally a discrete lattice of spatial points at that scale. For one thing, photons from very distant sources show no evidence of an energy-dependent difference in speed, which appears to rule out that simple model. For another, since the universe is expanding and has expanded by a huge factor in the past, the number of lattice points would have to keep increasing to maintain spacing at that fixed scale. So things would not be as simple as one might at first guess. There are approaches to quantum gravity which might give some more complicated discrete structure at that scale.
If there _were_ really a regular fixed discrete lattice (which, again, there is no reason to believe except that it may be philosophically attractive to avoid an actual infinity of points), my _personal guess_ is that it would have to be at a scale so many orders of magnitude smaller (not just a few powers of 10 smaller, but perhaps billions of powers of 10 smaller) that its effects would be unobservable; the spacing in terms of the Planck distance would then increase as the universe expands, but would still remain too small to notice.
Many people think that the Planck scale explains the use of renormalization in QFT. Infinite renormalization doesn't make much sense, but with a finite "effective" minimum scale, there would be just a finite scale factor. Of course, no one knows how the details would work.
Another important thing to know about is Hawking radiation. As Hawking discovered, an event horizon produces radiation. This makes black holes evaporate - on the quantum level, this is important because it means that some form of the usual information-preserving quantum mechanics might apply after all; details of what went in are encoded in the radiation.
Also, the radiation has a thermal spectrum, allowing a temperature and an entropy to be assigned to black holes. The stability of these objects is re-explained in terms of them having a huge entropy. This implies that there is some new variable that can be distributed in many ways, such as different excitations of string modes at the center (see below) or something varying near the event horizon.
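For a sense of scale, here is a sketch plugging a solar mass into the standard formulas - the Hawking temperature T = hbar c^3 / (8 pi G M k_B) and the Bekenstein-Hawking entropy S = k_B A c^3 / (4 G hbar), which is proportional to the horizon area:

```python
import math

hbar, G, c, k_B = 1.0546e-34, 6.674e-11, 2.998e8, 1.3807e-23
M_sun = 1.989e30  # kg

def hawking_temperature(M):
    """Temperature of the thermal radiation from a black hole's horizon."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def bh_entropy_over_kB(M):
    """Bekenstein-Hawking entropy S/k_B = (horizon area) * c^3 / (4 G hbar)."""
    r_s = 2 * G * M / c**2       # Schwarzschild radius
    area = 4 * math.pi * r_s**2  # horizon area
    return area * c**3 / (4 * G * hbar)

print(f"T_H for a solar-mass black hole: {hawking_temperature(M_sun):.1e} K")  # ~6e-8 K
print(f"S/k_B:                           {bh_entropy_over_kB(M_sun):.1e}")     # ~1e77
```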
It is generally thought that a fixed volume of space has a finite number of degrees of freedom in QG - due to the minimum length, or more often based on the size of the boundary, due to the fact that black holes seem to have the highest possible entropy and it is proportional to the area of the event horizon. This affects some interpretations of QM.
QG might also forbid other weird features of GR, such as closed timelike curves.
On the cosmological level, if there is a cosmological constant, there will be Hawking radiation from the cosmological horizon. In the past there appears to have been a similar large effect (inflation), but it was not constant. In the deep future, the Hawking radiation might produce Boltzmann brains - randomly assembled brains. It's not something that would happen often, but infinity is a long time and an exponentially growing universe is a big place. If over infinite time these greatly outnumber normal observers, as they would, the theory is inconsistent with our observations.
Also, if QM is applied to the equations of general relativity for a closed universe, the apparent result is that the wavefunction cannot change as a function of time (the total energy is zero, which leads to a static state in QM). This Wheeler-DeWitt equation is a rather controversial result.
It is not clear what the ontology of QG would be. Perhaps the wavefunctional lives on the space of possible geometries and field configurations on those.
QG is not renormalizable. String theory uses extended fundamental objects to avoid the infinities, but current work assumes a fixed background geometry, which is considered an approximation that is hard to get away from.
String theory allows many 'vacuum' solutions; our universe might decay from the current one to an uninhabitable one, perhaps helping explain the lack of Boltzmann brains (BBs) over the history of the universe. It has been suggested that the end stage of a small black hole would seed the decay of the vacuum. Other ways out of the BB problem are for baby universes to continually form (perhaps in black holes), or for there to be no cosmological constant.
Loop quantum gravity is another approach that avoids a fixed background, but it is less developed. Other approaches include the holographic principle, in which physics on the boundary of a region determines everything inside.
Key concepts: String theory, loop quantum gravity, Planck scale, Hawking radiation, Wheeler-DeWitt equation, problem of time, baby universes, Boltzmann brains, cosmological constant, decay of the vacuum, holographic principle, AdS-CFT correspondence
12. The anthropic principle
Our universe seems fine-tuned for life. If parameters such as the strength of the electrical force compared to the nuclear forces were just a little different, complex chemistry would not be possible. If there were no neutrinos, supernovas could not eject the complex elements out into space. If the number of space dimensions were different, planetary systems could not form. If photons did not exist, energy could not flow from stars to plants.
There is only one reasonable explanation for this fine-tuning: there must be many universes, most of which are unsuitable for life, but some of which are suitable. In that way, a simple overall theory can explain a complex particular universe. The anthropic principle states that the part of reality in which we find ourselves must be such that we can exist.
The multiple universes might be the many solutions to string theory, or could maybe even be all possible mathematical structures - the Everything Hypothesis.
The objection might be raised: Could not an intelligent creator explain the fine-tuning? While a creation scenario cannot be disproven, it cannot be a fundamental explanation for the origin of life, because it merely pushes back the question to where the intelligence of the creator came from. In a simple overall theory, an intelligent creator could exist - barring a wholly unbelievable coincidence - only as a product of Darwinian evolution in his own fine-tuned universe. The resulting model is much more complicated than the simple multiverse scenario, violating Occam's Razor (the principle, used in science, that simpler explanations are more likely). This is explained in detail in Richard Dawkins' excellent book "The God Delusion".
Key concepts: Fine tuning, multiple universes, anthropic principle, Sleeping Beauty (Self-Indication Assumption, Self-Selection Assumption), God Delusion
Wednesday, July 15, 2009
Why MWI?
MWI for the layperson:
In classical mechanics, each particle has a position and velocity. If there are N particles, the state of a physical system at a given time is given by a list of the positions (a configuration, or point in 3N-dimensional configuration space) and of the velocities for each particle:
X1 = (x1,y1,z1)
classical state: (X1(t), V1(t)), (X2(t),V2(t)), (X3(t),V3(t)), ...
In QM, there is instead the wavefunction, psi, which is a complex-number-valued function on what would classically be configuration space plus the space of spin configurations, and is a function of time:
quantum state: psi(X1,S1, X2,S2, X3,S3, ... , t)
(Spin, Si, takes on a small set of discrete values.)
This is a classic way of generalizing something: instead of a point in a space, there is a function on that space. It must be emphasized that the wavefunction is not a function on regular 3-dimensional space, but on the 3N-dimensional space of configurations. This high-dimensional arena is responsible for many of the counterintuitive properties of quantum mechanics.
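Here is a toy discrete sketch of that point (a made-up model of two particles that each occupy one of two sites, not any real physical system): the wavefunction assigns an amplitude to every joint configuration, and an entangled state cannot be factored into separate single-particle waves.

```python
from itertools import product

SITES = (0, 1)  # each particle sits at site 0 or site 1

# A product (unentangled) state: psi(x1, x2) = f(x1) * g(x2)
f = {0: 0.6, 1: 0.8}
g = {0: 2**-0.5, 1: 2**-0.5}
psi_product = {(x1, x2): f[x1] * g[x2] for x1, x2 in product(SITES, repeat=2)}

# An entangled state: amplitude only on the correlated joint configurations
psi_entangled = {(0, 0): 2**-0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 2**-0.5}

for name, psi in (("product  ", psi_product), ("entangled", psi_entangled)):
    probs = {cfg: round(abs(a)**2, 3) for cfg, a in psi.items()}
    print(name, probs)
# No choice of f and g reproduces the entangled table: that wavefunction
# lives irreducibly on configuration space, not on ordinary space.
```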
If the wavefunction is somewhat sharply peaked near a configuration, though with a wavelength small compared to the width of the peak, it will behave a lot like a classical system; the peak will follow a nearly classical trajectory as a function of time. It is natural to conclude that any interesting things done by such a classical system, such as performing computations, will be done by the wavefunction. It is just like a classical world, only a little 'fuzzy' due to the finite width of the peak. Indeed, roughly this picture is probably how most people think of QM, including chemists - a classical world except that electrons and similar particles are spread out instead of being concentrated at a point.
But that is obviously not a complete picture, because the wavefunction is not concentrated around a single peak. Roughly speaking, there are many peaks, representing quite different classical configurations (e.g. the alive or dead configurations of Schrödinger's cat), and many places even away from the peaks where the wavefunction is nonzero. Yet the world we see resembles a classical, single-configuration world. How can we explain that?

There are three main approaches. The first is some modification of QM in which only one peak remains, while the others vanish - this is called 'collapse of the wavefunction'. There are three main problems with this: 1) it introduces a lot of complexity to the model which might be avoided by another approach; 2) it violates things physicists like, such as conservation laws; and most importantly 3) it doesn't work, because generally speaking, in proposed models that give mathematical details of 'collapse', small residues remain in the other parts of the wavefunction. Small or not, these residues still go through trajectories that should give rise to computations and thus observers - unless we have reason to believe that probability is higher in high-amplitude regions; but if we do, we might as well just go with the simpler MWI, since deriving that is its main problem.
The second main approach is hidden variables. As we have seen, local hidden variable models are ruled out by Bell's theorem, but nonlocal models exist that don't have that problem - most famously, the Pilot Wave Interpretation (PWI). In the PWI, a classical-like configuration point 'surfs' along the wavefunction. It has been shown that under quite general conditions, the probability distribution for the point evolves in time to match the Born Rule of QM.
Two main problems have been raised for the PWI and similar models. 1) It is nonlocal, and has a preferred reference frame contrary to the spirit of relativity. This is really a matter of taste, and I don't consider it a fatal problem at all, though I do think the nonlocality is an undesirable feature if other models can avoid it. 2) More importantly, it doesn't get rid of the other peaks in the wavefunction at all; it just adds a new trajectory of the hidden configuration point. The wavefunction is still there and should still perform all of the interesting computations as it would in the MWI. Thus, the PWI has been called 'Everett in denial'. Valentini [http://arxiv.org/abs/0811.0810] has denied that charge, but his straw-man arguments are easily demolished, as Brown has done [http://arxiv.org/abs/0901.1278].
I must note an important exception to the many-worlds property of the PWI: In some versions of what is proposed for quantum gravity, the wavefunction of the universe does not evolve as a function of time; this is known as the Wheeler-DeWitt equation. That would seem at first glance to rule out observers in those models (though whether it does remains to be seen, even for just a wavefunction). However, the PWI hidden variables would evolve in time even though the wavefunction doesn't, making a single-world model out of it. While interesting, I find it implausible, because in such a model the wavefunction of the universe - as complex as it would have to be - would simply be an initial condition.
Finally, there is the MWI itself, as first proposed by Everett and in various forms by others. In its basic form, this has the simplest mechanics, as it just lets the wavefunction evolve over time, adding no hidden variables or collapse-inducing modifications to the dynamics. There are many peaks in the wavefunction which follow various trajectories and implement various computations. [Much more to be said on that rough sketch.] Each individual observer only notices a single classical-like world because that is the one associated with the motion of the peak giving rise to the computations of his own brain; the others don't have any effect on him.

This appealingly simple picture, however, raises a problem of its own: In order for our observations to be at all typical, the Born Rule (which relates probabilities to the square of the wavefunction) must hold, at least to some approximation. This means that small amplitude peaks are less probable than large amplitude peaks. Since the trajectory of a peak (which makes it perform computations and so on) does not depend on amplitude, why would that be the case in the MWI?
The possibility of derivation of the Born Rule in the MWI is the central problem in interpretation of QM. If the Born Rule does follow from the MWI, then the case for the MWI is made beyond a reasonable doubt. I will discuss in other posts various attempts to derive it.
If it does not follow, then the problem remains - what interpretations could work? Continuous collapse models and the PWI would still not work because they would still be the MWI in disguise due to having the wavefunction (with its wrong probabilities) as part of their ontology.
One possibility that looks like it should work in any case is making an honest Many-Worlds version of the PWI: having infinitely many sets of the hidden variables. The simplest version, that of having every point in configuration space sporting a wave-surfing hidden variable, is called Continuum Bohmian Mechanics (CBM). These hidden variable worlds could then outnumber the ones in the wavefunction, producing the Born Rule for typical observers. Of course, this model is more complex than the standard MWI. Also, it still would leave the question of what observers are and how to count them.
Quantum gravity remains an unsolved problem, and the solution may play a role in interpretation of QM, perhaps providing a new set of variables to work with.
Another problem is that in the long term, long after normal observers have died out, spontaneously assembled bits of random matter (which a cosmological constant would produce) would eventually include short-lived observers who would outnumber the normal ones over the history of the universe by perhaps an infinite factor. These Boltzmann Brains, and the necessity of getting rid of them in terms of their effect on typical observations, provide important constraints on what the real answers could possibly be. This deserves a post of its own, at least.
Another (and not unrelated) topic that will get its own post is the Everything Hypothesis, which postulates that every possible thing must exist as an explanation for why things are how they are.
Simple proof of Bell's Theorem
The focus here is on the implications of Quantum Mechanics itself. I will not discuss the practicalities of experiments to test those predictions, such as "loopholes" due to low detector efficiencies. Such loopholes are implausible, and more recent experiments have closed some of them or made them even less plausible.
Thought experiment:
Any entangled pair of systems, each with at least two distinguishable states besides position, could be used. I'll use an entangled pair of spin-1/2 particles for ease of notation.
A pair of spin-1/2 particles are generated which are in the entangled spin singlet state
|psi> = (|+,-> - |-,+>) / 2^(1/2)
One of these particles is sent to Alice; the other is sent to Bob. The two observers may be very far apart.
Alice <----------------- source ---------------> Bob
When an observer measures the component of a spin-1/2 particle's spin along any direction in space (for example, using a Stern-Gerlach device), the result of the measurement is always + or - 1/2 hbar. (hbar is Planck's constant / 2 pi)
In the state |psi> above, there is a 50% chance that the spin component will be positive (+) for any direction of measurement, for either particle.
In the state |psi>, QM predicts that if Alice's particle is measured with the result + in direction A (call this A+, etc.), then measurement of Bob's particle along the same direction A will give the - result, A-. This kind of correlation is called an EPR correlation.
Call this pair of results (A+; A-) where A is the direction and the order indicates which Observer gets each of the results.
Let P(A+; A-) be the probability that (A+; A-) is found, and so on. Clearly, P(A+; A-) = P(A-; A+) = 1/2. There is no need to consider models that don't predict this for the proof, since they already would disagree with the predictions of QM.
Each of the Observers can choose which direction to measure the spin along. In particular, each will choose one of three directions: A, B, or C.
Define Distant Measurement Independence (DMI) as the assumption that the (singular) result of each Observer's measurement can not depend on which direction the other Observer chose to measure along.
Statement of the Theorem: DMI is not consistent with the predictions of QM.
The theorem is often (incorrectly) said to prove that QM is non-local, because a reasonable local model would not allow the direction chosen for a distant measurement to influence the result of the other measurement. That is not the only local possibility!
If DMI is false there are 3 possibilities, of which the first two are taken seriously:
1) Nonlocality: An instant (faster-than-light) hidden signal which conveys the information about the measurement angle (which can be ‘chosen’ right before measurement) to the other particle, no matter where it is or how far away.
2) Multiple outcomes of each measurement actually do occur (as in the MWI).
If all outcomes occur, correlations might be established only after local interactions; see http://arxiv.org/abs/0902.3827
3) “Conspiracy theories” in which the other particle somehow can predict the angle.
Proof:
Assume DMI. It is possible that a model assigns certain additional properties to a particular particle that don't appear in the QM description; these are called hidden variables. These could tell the particle whether to give a "+" or a "-" result as a function of what direction its measurement is made in.
The other particle of the pair would then have to have a similar set of properties but with the opposite instructions. Such hidden variables would be required in order to produce the EPR correlations without violating DMI, because otherwise the other particle would have no way to be certain to give the opposite result when both Observers choose the same direction.
Even though only one measurement on the particle is actually made, it can be thought of as labeled by the hidden variables according to what the outcome of measurement along each of the three directions A, B, and C would have been. Consider Alice's particle to be so labeled.
Let (A+ & B-) mean that the result would be + if measured along direction A and would be - if measured along direction B, and so on. Let P(A+ & B-) be the probability that the hidden variables were such that those would be the results.
The following inequality must hold, since a more general case is at least as probable as any special case of it:
P(A+ & B-) = P(A+ & B- & C-) + P(A+ & B- & C+) ≤ P(A+ & C-) + P(C+ & B-)
It is not possible to measure Alice's particle along more than one direction, but Bob can help us do the next best thing; because of the EPR correlations, measuring his particle should reveal the opposite of what result Alice's particle would have given. Thus the above inequality is equivalent to
P(A+ ; B+) ≤ P(A+ ; C+) + P(C+ ; B+)
This kind of inequality is called a Bell inequality (of which there are actually several).
Quantum mechanically, P(A+ ; B+) = 1/2 sin^2 (theta(A,B)/2) where theta(A,B) is the angle between A and B; and so on.
For example, say A and B are at a right angle, with the C direction in between them.
theta(A,B) = 90 degrees, and theta(A,C) = theta(C,B) = 45 degrees.
Then P(A+ ; B+) = .25, and P(A+ ; C+) = P(C+ ; B+) = .073
Since .25 > .146, the inequality is violated. This establishes that DMI is not consistent with QM.
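Here is a short sketch that checks this arithmetic and sweeps other positions of C between A and B (pure evaluation of the QM formula above; no simulation needed):

```python
import math

def p_plus_plus(theta_deg):
    """QM prediction for the singlet: P(A+; B+) = 1/2 sin^2(theta/2)."""
    return 0.5 * math.sin(math.radians(theta_deg) / 2) ** 2

theta_AB = 90.0  # A and B at a right angle, C in the plane between them
for theta_AC in range(15, 90, 15):
    lhs = p_plus_plus(theta_AB)                                     # P(A+; B+)
    rhs = p_plus_plus(theta_AC) + p_plus_plus(theta_AB - theta_AC)  # P(A+; C+) + P(C+; B+)
    verdict = "VIOLATED" if lhs > rhs else "satisfied"
    print(f"C at {theta_AC:2d} deg: {lhs:.3f} <= {rhs:.3f}?  {verdict}")
# With C at 45 degrees: 0.250 vs 0.146 -- the Bell inequality fails,
# confirming that DMI is inconsistent with the predictions of QM.
```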
1-page Bell's theorem is a one page PDF version of the proof in this post, which can be printed and passed out in the street :)
You can see http://arxiv.org/abs/0902.3827 for another overview of Bell's theorem.