The lure of blogging is strong. Having guest-posted about problems with eternal inflation, Tom Banks couldn’t resist coming back for more punishment. Here he tackles a venerable problem: the interpretation of quantum mechanics. Tom argues that the measurement problem in QM becomes a lot easier to understand once we appreciate that even classical mechanics allows for non-commuting observables. In that sense, quantum mechanics is “inevitable”; it’s actually classical physics that is somewhat unusual. If we just take QM seriously as a theory that predicts the probability of different measurement outcomes, all is well.
Tom’s last post was “technical” in the sense that it dug deeply into speculative ideas at the cutting edge of research. This one is technical in a different sense: the concepts are presented at a level that second-year undergraduate physics majors should have no trouble following, but there are explicit equations that might make it rough going for anyone without at least that much background. The translation from LaTeX to WordPress is a bit kludgy; here is a more elegant-looking pdf version if you’d prefer to read that.
—————————————-
Rabbi Eliezer ben Yaakov of Nahariya said in the 6th century, “He who has not said three things to his students, has not conveyed the true essence of quantum mechanics. And these are Probability, Intrinsic Probability, and Peculiar Probability”.
Probability first entered the teachings of men through the work of that dissolute gambler Pascal, who was willing to make a bet on his salvation. It was a way of quantifying our risk of uncertainty. Implicit in Pascal’s thinking, and in that of all who came after him, was the idea that there was a certainty, even a predictability, but that we fallible humans may not always have enough data to make the correct predictions. This implicit assumption is completely unnecessary, and the mathematical theory of probability makes use of it only through one crucial assumption, which turns out to be wrong in principle but right in practice for many actual events in the real world.
For simplicity, assume that there are only a finite number of things that one can measure, in order to avoid too much math. List the possible measurements as a sequence
$$A = (a_1, a_2, \ldots, a_N).$$
The $a_n$ are the quantities being measured, and each can take a finite number of values. A probability distribution then assigns a number $P(A)$ between zero and one to each possible outcome, and the numbers have to add up to one. The so-called frequentist interpretation of these numbers is that if we did the same measurement a large number of times, the fraction of times (the frequency) with which we found a particular result would approach the probability of that result in the limit of an infinite number of trials. This is mathematically rigorous, but only a fantasy in the real world, where we have no idea whether we have an infinite amount of time to do the experiments. The other interpretation, often called Bayesian, is that probability gives a best guess at what the answer will be in any given trial. It tells you how to bet. This is how the concept is used by most working scientists: you do a few experiments, see how the finite distribution of results compares to the probabilities, and then assign a confidence level to the conclusion that a particular theory of the data is correct. Even flipping a completely fair coin, it’s possible to get a million heads in a row. If that happens, you’re pretty sure the coin is weighted, but you can’t know for sure.
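To make the frequentist-versus-Bayesian point concrete, here is a minimal Python sketch (not from Tom’s post; the trial counts and seed are arbitrary choices) that flips a simulated fair coin and watches the empirical frequency drift toward the assigned probability of 1/2:

```python
import random

# Frequentist picture as a finite-trial approximation: flip a fair coin
# many times and watch the empirical frequency of heads approach the
# assigned probability P(heads) = 0.5. Illustrative sketch only; the
# trial counts below are arbitrary.
random.seed(0)

for n_trials in (10, 100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n_trials))
    print(f"{n_trials:>9} flips: frequency of heads = {heads / n_trials:.4f}")

# A million heads in a row is possible, with probability 2**(-1_000_000).
# A Bayesian bettor who saw it would assign overwhelming odds to a
# weighted coin, without ever being mathematically certain.
```

No finite run of this program settles anything with certainty; it only sharpens how you should bet, which is exactly the Bayesian reading described above.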
Physical theories are often couched in the form of equations for the time evolution of a probability distribution, even in classical physics. One introduces “random forces” into Newton’s equations to “approximate the effect of the deterministic motion of parts of the system we don’t observe”. The classic example is the Brownian motion of particles we see under the microscope, where we think of the random forces in the equations as coming from collisions with the atoms of the fluid in which the particles are suspended. However, there’s no a priori reason why such equations couldn’t be the fundamental laws of nature. Determinism is a philosophical stance, a hypothesis about the way the world works, which has to be subjected to experiment just like anything else. Anyone who’s listened to a Geiger counter will recognize that the microscopic process of decay of radioactive nuclei doesn’t seem very deterministic.
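Since the argument appeals to stochastic evolution equations, here is a minimal Python sketch of the Brownian-motion example: many independent particles kicked by a Gaussian random force (an Euler–Maruyama step of a simple Langevin-type equation), whose mean-square displacement grows linearly in time. The time step, diffusion constant, and walker count are arbitrary illustrative values, not anything from the post:

```python
import random

# "Random forces" standing in for unobserved deterministic collisions:
# each walker takes Gaussian kicks of size sqrt(2 D dt), the standard
# Euler–Maruyama update for free diffusion. All parameters are arbitrary.
random.seed(0)

dt = 0.01          # time step
n_steps = 10_000
n_walkers = 200
D = 1.0            # diffusion constant setting the noise strength

final_sq = []
for _ in range(n_walkers):
    x = 0.0
    for _ in range(n_steps):
        # integrated random force over one step; deterministic drift
        # is dropped for simplicity
        x += (2 * D * dt) ** 0.5 * random.gauss(0.0, 1.0)
    final_sq.append(x * x)

# The Brownian signature: mean-square displacement grows linearly in
# time, <x^2> ~ 2 D t, even though each trajectory looks erratic.
msd = sum(final_sq) / n_walkers
print(f"<x^2> = {msd:.2f}  vs  2 D t = {2 * D * dt * n_steps:.2f}")
```

The point of the sketch is that the dynamical law here is itself an equation for a probability distribution; nothing in its mathematical structure requires a hidden deterministic layer underneath.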