Category: Science

  • arxiv Find: A Realistic Cosmological Model…

    The title is a bit misleading; what is being referred to is not a realistic cosmological model at all. But it’s interesting to see that not every professional astronomer believes in the Big Bang model; there are still some out there who are sticking with the Steady State theory. Seriously.

    A Realistic Cosmological Model Based on Observations and Some Theory Developed Over the Last 90 Years
    Authors: Geoffrey Burbidge

    Abstract: This meeting is entitled “A Century of Cosmology.” But most of the papers being given here are based on work done very recently and there is really no attempt being made to critically review what has taken place in the last 90 or 100 years. Instead, in general the participants accept without question that cosmology equates to “hot big bang cosmology” with all of its bells and whistles. All of the theory and the results obtained from observations are interpreted on the assumption that this extremely popular model is the correct one, and observers feel that they have to interpret its results in terms of what this theory allows. No one is attempting to seriously test the model with a view to accepting it or ruling it out. They are aware, as are the theorists, that there are enough free parameters available to fix up almost any model of the type.

The current scheme given in detail for example by Spergel et al (2006, 2007) demonstrates this. How we got to this stage is never discussed, and little or no attention is paid to the observations obtained since the 1960s on activity in the centers of galaxies and what they imply. We shall show that they are an integral part of a realistic cosmological model. In this paper I shall take a different approach, showing first how cosmological ideas have developed over the last 90 years and where mistakes have been made. I shall conclude with a realistic model in which all of the observational material is included, and compare it with the popular model. Not surprisingly I shall show that there remain many unsolved problems, and previously unexpected observations, most of which are ignored or neglected by current observers and theorists, who believe that the hot big bang model must be correct.

    For those with any lingering doubts, the Big Bang model — the idea that the universe has evolved from a hot, dense, smooth initial state — is correct, and the Steady State model should have been put to bed a long time ago. Evidence for the Big Bang is overwhelming. It’s a model that keeps making predictions, which keep turning out to be correct, while the Steady State theory made many predictions that turned out to be wrong.

But it’s an interesting case study in how science works. Reading Burbidge’s paper, the parallels with anti-evolutionists are striking. In both cases, one is repeatedly told that the establishment’s supporters can’t prove that their theory is correct. Which is undeniably true, as science never proves anything; it just accumulates evidence, and in the case of the Big Bang and natural selection, the evidence puts the case beyond reasonable doubt. Which doesn’t imply that there are no interesting questions remaining to be addressed. For both the Big Bang and natural selection, many of the details concerning the way in which the broad framework is specifically implemented in the real world remain to be answered. And in both cases, the skeptics like to pretend that open questions about the details are the same as open questions about the framework. But they’re not.

Nevertheless, one of the virtues of the tenure system is that a Big Bang skeptic can keep their position as a professor of physics, writing heterodox articles and submitting them to the arxiv. And this really is a virtue, not a flaw. Geoffrey Burbidge has done lots of respectable work in observational astronomy. Long ago, he and his wife Margaret collaborated with Fred Hoyle and Willy Fowler on an important paper that helped establish the theory of nucleosynthesis in stars. Part of the motivation for the paper was the realization that conditions in the Big Bang were not right for synthesizing elements much heavier than lithium — you could explain the universe’s helium abundance, but not the existence of carbon and iron and so forth. Hoyle, of course, was one of the originators of the Steady State theory, and that was certainly part of his motivation at the time. As it turns out, in the real world, some elements are synthesized in the early universe, and some in stars, and some in supernovae; the real world can be a messy place.

    Would a young cosmologist who didn’t believe in the Big Bang be offered a faculty job, or receive tenure, today? Probably not. Faculty jobs are scarce commodities, and a university is going to want to hire people who will do interesting and productive work that is of some use to the wider community. Believers in the Steady State model aren’t going to produce such work, any more than creationists or astrologers or experts in the plum-pudding model of the atom. And eventually support for the model will fade away entirely, opening the door for the next generation of heterodoxies.

  • Will NASA Rise from the Ashes?

NASA’s Phoenix Mars Lander, which some time back scraped up direct evidence of water on Mars’s surface, is slipping gently into the night. Not a surprise; the mission was always scheduled to last just a few months, and at this time of Martian year there just isn’t enough sunshine to keep the batteries charged.

    Mission engineers last received a signal from the lander on November 2, the space agency said.

    Rumor has it that the signal read “Yes We Can!”

    The future of NASA is going to be one out of approximately 50 million pressing challenges faced by the new President. Under the previous administration (what was that guy’s name again? I seem to have repressed it), the agency drifted, ranging from embarrassing ideological scandals to hopelessly inept planning to blatant censorship on climate change to a depressing de-emphasis of real science. Obama, and whoever he appoints as NASA administrator, will have a very difficult job balancing competing pressures: rebuilding a science program that has been devastated by funding cuts, while also restoring our capacity to send astronauts into space, and doing so in a time of tremendous budgetary pressures. Darksyde at Daily Kos has a good post about what some of these challenges are, and some of the struggles of current administrator Michael Griffin. It will be very interesting to see what direction the agency takes; in a multipolar world, the U.S. won’t be the only important player in space exploration and space science, but hopefully we won’t just sit on the sidelines, either.

    (Did you notice the link to an article on Discover at the beginning of that paragraph? That’s because, when I cut open a vein to sign our new blogging agreement in blood [don’t worry, it wasn’t my vein], part of the contract was that we would link back to the site in every single blog post we do. I’m sure nobody will notice.)

  • A Special Place in the Universe

    Cosmologists find themselves in this interesting situation where they have a set of hypotheses — dark matter, dark energy, inflation — that serve to make impressively precise predictions that have been tested against a wide variety of data, but presently lack a firm grounding in established physics. We don’t know what exactly the dark matter is, what the dark energy is, or how inflation happened, if indeed it happened at all. So it behooves us to push at the boundaries a bit — start with the simple models and tweak them in some way, and then check whether the new version still fits the data. How confident are we that the dark sector has the properties we think it does, or that inflation happened in a straightforward way?

    This was the philosophy that led Lotty Ackerman, Mark Wise and I to ask what the universe would look like if rotational invariance were violated during inflation — if there were a preferred direction in space, which left some imprint on the cosmological perturbations that currently show up as large-scale structure and temperature fluctuations in the cosmic microwave background. I talked about how that paper came to be in a series of posts: one, two, three. And now there is even tantalizing evidence that our model fits the data! I don’t get too excited about it, but it’s something to keep an eye on as the data improve (e.g. when the Planck satellite gets results).

    Ever since then, Mark and I have toyed with the idea that once you’ve broken rotational invariance, your next step is obvious: violate translational invariance! Instead of imagining a preferred direction in space, imagine there were a preferred place in the universe. Not because you have some good reason to think there is, but because you want to quantify the level of confidence we have in the assumption that there is not.

    So we have now teamed up with Chien-Yao Tseng, another grad student here at Caltech, to do exactly that. The result is this paper:

    Translational Invariance and the Anisotropy of the Cosmic Microwave Background
    Sean M. Carroll, Chien-Yao Tseng and Mark B. Wise

Primordial quantum fluctuations produced by inflation are conventionally assumed to be statistically homogeneous, a consequence of translational invariance. In this paper we quantify the potentially observable effects of a small violation of translational invariance during inflation, as characterized by the presence of a preferred point, line, or plane. We explore the imprint such a violation would leave on the cosmic microwave background anisotropy, and provide explicit formulas for the expected amplitudes $\langle a_{lm} a_{l'm'}^* \rangle$ of the spherical-harmonic coefficients.

    It took a while to put into equations what exactly was meant by “violating translational invariance” in an operational way. But once you figure it out, it’s obvious, and there are three ways to do it: imagining that there is a preferred point, line, or plane in the universe. Then you hypothesize that the density fluctuations are very slightly modulated in a way that depends on your distance from that preferred place. Once you have that, it’s just a matter of cranking out some monstrous equations. Thank goodness there are only three macroscopic dimensions of space, is all I can say.
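The "modulated by distance from a preferred place" idea can be cartooned in a few lines: take statistically homogeneous Gaussian noise and multiply it by a factor depending on distance from a hypothetical preferred point, and the variance of the fluctuations becomes position-dependent. The modulation profile and amplitude below are made up for illustration; this is only a schematic of the setup, not the calculation in the paper.

```python
import math
import random

random.seed(0)
eps, x0 = 0.5, 0.0   # modulation amplitude and hypothetical preferred point

def modulated_variance(x, n=2000):
    """Estimate the variance of the modulated field at position x."""
    w = 1.0 + eps * math.exp(-abs(x - x0))   # toy modulation profile
    samples = [w * random.gauss(0.0, 1.0) for _ in range(n)]
    return sum(s * s for s in samples) / n

var_near = modulated_variance(0.0)   # at the preferred point
var_far = modulated_variance(3.0)    # far from it

print(f"variance near x0: {var_near:.2f}, far from x0: {var_far:.2f}")
```

A position-dependent variance is precisely a breakdown of statistical homogeneity, which is what a better-targeted CMB analysis would look for.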

    So now we have some predictions to compare with data, so that we can understand exactly how well the cosmic microwave background really assures us that there is no special place in the universe. But aside from the general motivation of being careful to test all of our cherished assumptions, there is another reason for work like this: there are a handful of ways in which cosmological perturbations don’t look completely the same in every direction. As we say in the paper:

    There is another important motivation for studying deviations from pure statistical isotropy of cosmological perturbations: a number of analyses have found evidence that such deviations might exist in the real world. These include the “axis of evil” alignment of low multipoles, the existence of an anomalous cold spot in the CMB, an anomalous dipole power asymmetry, a claimed “dark flow” of galaxy clusters measured by the Sunyaev-Zeldovich effect, as well as a possible detection of a quadrupole power asymmetry of the type predicted by ACW in the WMAP five-year data. In none of these cases is it beyond a reasonable doubt that the effect is more than a statistical fluctuation, or an unknown systematic effect; nevertheless, the combination of all of them is suggestive. It is possible that statistical isotropy/homogeneity is violated at very high significance in some specific fashion that does not correspond precisely to any of the particular observational effects that have been searched for, but that would stand out dramatically in a better-targeted analysis.

    In other words, we have a handful of anomalies, each of which might easily go away, but perhaps when they are taken together they imply that something is going on. Maybe there is some incredibly strong signal out there, and we just haven’t been looking for it in the right way. We won’t know until we understand better how such anomalies would show up in the observations — and then go collect better data.

  • Dark Photons

    It’s humbling to think that ordinary matter, including all of the elementary particles we’ve ever detected in laboratory experiments, only makes up about 5% of the energy density of the universe. The rest, of course, comes in the form of a dark sector: some form of energy density that can be reliably inferred through the gravitational fields it creates, but which we haven’t been able to make or touch directly ourselves.

    It’s irresistible to imagine that the dark sector might be interesting. In other words, thinking like a physicist, it’s natural to wonder whether the dark sector might be complicated, with a rich phenomenology all its own. And in fact there is something interesting going on: over the last 15 years we’ve established that the dark sector comes in at least two different pieces! There is dark matter, 25% of the universe, which we know is like “matter” because it behaves that way — in particular, it clumps together under the force of gravity, and its energy density dilutes away as the universe expands. And then there is dark energy, 70% of the universe, which seems to be eerily uniform — smoothly distributed through space, and persistent (non-diluting) through time. So, there is at least that much structure in the dark sector.

    But so far, there’s no evidence of anything interesting beyond that. Indeed, the individual components of dark matter and dark energy seem relatively vanilla and featureless; more precisely, taking them to be “minimal” provides an extremely good fit to the data. For dark matter, “minimal” means that the particles are cold (slowly moving) and basically non-interacting with each other. For dark energy, “minimal” means that it is perfectly constant throughout space and time — a pure vacuum energy, rather than something more lively.

Still — all we have are upper limits, not firm conclusions. It’s certainly possible that there is a bushel of interesting physics going on in the dark sector, but it’s just too subtle for us to have noticed yet. So it’s important for us theorists to propose specific, testable models of non-minimal dark sectors, so that observers have targets to shoot for when we try to constrain just how interesting the darkness really is.

    Along those lines, Lotty Ackerman, Matt Buckley, Marc Kamionkowski and I have just submitted a paper that explores what I think is a particularly provocative possibility: that, just like ordinary matter couples to a long-range force known as “electromagnetism” mediated by particles called “photons,” dark matter couples to a new long-range force known (henceforth) as “dark electromagnetism,” mediated by particles known (from now on) as “dark photons.”

    Dark Matter and Dark Radiation
    Authors: Lotty Ackerman, Matthew R. Buckley, Sean M. Carroll, Marc Kamionkowski

We explore the feasibility and astrophysical consequences of a new long-range U(1) gauge field (“dark electromagnetism”) that couples only to dark matter, not to the Standard Model. The dark matter consists of an equal number of positive and negative charges under the new force, but annihilations are suppressed if the dark matter mass is sufficiently high and the dark fine-structure constant $\hat\alpha$ is sufficiently small. The correct relic abundance can be obtained if the dark matter also couples to the conventional weak interactions, and we verify that this is consistent with particle-physics constraints. The primary limit on $\hat\alpha$ comes from the demand that the dark matter be effectively collisionless in galactic dynamics, which implies $\hat\alpha < 10^{-3}$ for TeV-scale dark matter. These values are easily compatible with constraints from structure formation and primordial nucleosynthesis. We raise the prospect of interesting new plasma effects in dark matter dynamics, which remain to be explored.

Just to translate that a bit, here is the idea. We’re imagining there is a completely new kind of photon, which couples to dark matter but not to ordinary matter. So there can be dark electric fields, dark magnetic fields, dark radiation, etc. The dark matter itself consists half of particles with dark charge +1, and half of antiparticles with dark charge -1. Now you might say to yourself, “Why don’t the particles and antiparticles all just annihilate into dark photons?” That kind of thinking is probably why ideas like this weren’t explored twenty years ago (as far as we know). But if you think about it, there is clearly a range of possibilities for which the dark matter doesn’t annihilate very efficiently; for example, if the mass of the individual dark matter particles were sufficiently large, their density would be very low, and they just wouldn’t ever bump into each other. Alternatively, if the strength of the new force were extremely weak, it just wouldn’t be that effective in bringing particles and antiparticles together.

None of that is surprising; the interesting bit is that when you run the numbers, they turn out to be pretty darn reasonable, as far as particle physics is concerned. For DM particles weighing several hundred times the mass of the proton, there should be about one DM particle per coffee-cup-sized volume of space. The strength of the dark electromagnetic force is characterized, naturally, by the dark fine-structure constant; remember that ordinary electromagnetism is characterized by the ordinary fine-structure constant α = 1/137. It turns out that the upper limit on the dark fine-structure constant required to stop the dark matter particles from annihilating away is — about the same! I was expecting it to be $10^{-15}$ or something like that, and it was remarkable that such large values were allowed.
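The "one particle per coffee cup" figure is easy to check at the back-of-envelope level. Taking the standard rough value of ~0.3 GeV/cm³ for the local dark matter density (an assumption supplied here, not a number from the paper) and a particle mass of a few hundred GeV:

```python
# Back-of-envelope estimate of the local dark matter number density.
# rho_local is the conventional rough value for the Milky Way near the Sun;
# the particle mass matches "several hundred times the mass of the proton."
rho_local = 0.3    # GeV per cm^3 (assumed, standard rough value)
m_dm = 300.0       # GeV, candidate dark matter particle mass

n = rho_local / m_dm              # particles per cm^3
volume_per_particle = 1.0 / n     # cm^3 occupied per particle, on average

print(f"~1 dark matter particle per {volume_per_particle:.0f} cm^3")
```

That comes out to roughly one particle per liter, within an order of magnitude of a large coffee mug.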

However, we know a little more about the dark matter than “it doesn’t annihilate.” We also know that it is close to collisionless — dark matter particles don’t bump into each other very often. If they did, all sorts of things would happen to the shape of galaxies and clusters that we don’t actually observe. So there is another limit on the strength of dark electromagnetism: interactions should be sufficiently weak that dark matter particles don’t “cool off” by interacting with each other in galaxies and clusters. That turns into a more stringent bound on the dark fine-structure constant: about an order of magnitude smaller, at $\hat\alpha < 10^{-3}$. Still, not so bad.

    More interestingly, we can’t say with perfect confidence that the dark matter really is effectively non-interacting. If a model like ours is right, and the strength of dark electromagnetism is near the upper bound of its allowed value, there might be very important consequences for the evolution of large-scale structure. At the moment, it’s a little bit hard to figure out what those consequences actually are, for mundane calculational reasons. What we are proposing is that the dark matter is really a plasma, and to understand how structure forms, one needs to consider dark magnetohydrodynamics. That’s a non-trivial task, but we’re hoping it will keep a generation of graduate students cheerfully occupied.

    The idea of new forces acting on dark matter is by no means new; I’ve worked on it recently myself, and so have certain co-bloggers. (Strong, silent types who are too proud to blog about their own papers.) What’s exciting about dark photons is that they are much more natural from a particle-physics perspective. Typical models of quintessence and long-range fifth forces invoke scalar fields, which are easy and fun to work with, but which by all rights should have huge masses, and therefore not be very long-range at all. The dark photon comes from a gauge symmetry, just like the ordinary photon, and its masslessness is therefore completely natural.

    Even the dark photon is not new. In a recent paper, Feng, Tu, and Yu proposed not just dark photons, but a barrelful of new dark fields and interactions:

    Thermal Relics in Hidden Sectors
    Authors: Jonathan L. Feng, Huitzu Tu, Hai-Bo Yu

    Dark matter may be hidden, with no standard model gauge interactions. At the same time, in WIMPless models with hidden matter masses proportional to hidden gauge couplings squared, the hidden dark matter’s thermal relic density may naturally be in the right range, preserving the key quantitative virtue of WIMPs. We consider this possibility in detail. We first determine model-independent constraints on hidden sectors from Big Bang nucleosynthesis and the cosmic microwave background. Contrary to conventional wisdom, large hidden sectors are easily accommodated…

    They show that these models manage to evade all sorts of limits you might be worried about, from getting the right relic abundance to fitting in with constraints from primordial nucleosynthesis and the cosmic microwave background.

    Our model is actually simpler, because we have a different flavor of fish to fry: the possible impacts of this new long-range force in the dark sector on observable cosmological dynamics. We’re not sure yet what all of those impacts are, but they are fun to contemplate. And of course, another difference between dark electromagnetism and a boring scalar force is that electromagnetism has both positive and negative charges — thus, both attractive and repulsive forces. (Scalar forces tend to be simply attractive, and get all mixed up with gravity.) So we can imagine much more than a single species of dark matter; what if you had two different types of stable particles that carried dark charge? Then we’d be able to make dark atoms, and could start writing papers on dark chemistry.

    You know that dark biology is not far behind. Someday perhaps we’ll be exchanging signals with the dark internet.

  • Gravity is an Important Force

    Brad DeLong, in re Quantum Hyperion, wonders whether photons are really responsible for the decoherence of Saturn’s moon:

    But gravity works–presumably, at some level–by massive objects constantly bombarding each other with gravitons, so we are also averaging over all the possible states of gravitons that we are not keeping track of, aren’t we? That should cause decoherence too, shouldn’t it?

    This is an annoyingly good question. In fact, I’m probably not giving anything away if I reveal that my esteemed co-blogger Daniel and I once tried to figure out whether or not dark matter, if it truly interacts with ordinary matter only through gravity, would be in a coherent quantum state. Still don’t know the answer (although I strongly suspect it is “no,” I’m just not sure how to prove it).

The force due to gravity on Hyperion is much larger than the force due to electromagnetism on Hyperion. Particle for particle, gravity is a much weaker force, but it has the helpful quality of adding up rather than canceling out, which is why it tends to dominate over astrophysical distances.
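To put numbers on that claim, compare Saturn's gravitational pull on Hyperion with the push from solar radiation pressure (the dominant large-scale electromagnetic force on the moon). All the values below are rough textbook figures, supplied here as assumptions rather than taken from the post:

```python
import math

G = 6.674e-11            # gravitational constant, m^3 / (kg s^2)
M_saturn = 5.68e26       # kg
m_hyperion = 5.6e18      # kg (rough)
r_orbit = 1.48e9         # m, Hyperion's distance from Saturn

# Newtonian gravitational force from Saturn on Hyperion.
F_gravity = G * M_saturn * m_hyperion / r_orbit**2

# Radiation pressure: solar flux at Saturn (~15 W/m^2) absorbed over
# Hyperion's cross-sectional area.
flux = 15.0              # W / m^2 at Saturn's distance from the Sun
c = 3.0e8                # m / s
radius = 1.35e5          # m, rough mean radius of Hyperion
F_radiation = (flux / c) * math.pi * radius**2

print(f"gravity / radiation pressure ~ {F_gravity / F_radiation:.1e}")
```

Gravity wins by some thirteen orders of magnitude, which is the sense in which it "dominates over astrophysical distances" despite being weaker particle for particle.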

    However — it’s not always useful to think of the gravitational force on a planet as due to the exchange of gravitons. You can think of the static force between two objects as arising from the exchange of virtual particles, whether you are talking about gravity or electromagnetism. But it is also true that, in the limit where the bodies giving rise to the gravitational force are perfectly static, those gravitons add up to define a unique quantum state. (The Sun, Saturn, and Titan are not static, but probably good enough for these purposes.) So the state of Hyperion becomes entangled with the quantum states of the individual gravitational fields of those celestial bodies, not with a jillion separate gravitons from each source. When we ignore the quantum states of all the gravitons reflected off of Hyperion, we are ignoring a lot more than when we ignore the quantum states of the gravitational fields of the Sun, Saturn, and Titan.

    So I think it’s the photons, not the gravitons, that are primarily responsible for the decoherence, by a wide margin. But I wouldn’t bet my reputation on it. Maybe Daniel’s reputation.

  • Quantum Hyperion

One of the annoying/fascinating things about quantum mechanics is the fact that the world doesn’t seem to be quantum-mechanical. When you look at something, it seems to have a location, not a superposition of all possible locations; when it travels from one place to another, it seems to take a path, not a sum over all paths. This frustration was expressed by no less a person than Albert Einstein, quoted by Abraham Pais, quoted in turn by David Mermin in a lovely article entitled “Is the Moon There when Nobody Looks?”:

    I recall that during one walk Einstein suddenly stopped, turned to me and asked whether I really believed that the moon exists only when I looked at it.

The conventional quantum-mechanical answer would be “Sure, the moon exists when you’re not looking at it. But there is no such thing as ‘the position of the moon’ when you are not looking at it.”

Nevertheless, astronomers over the centuries have done a pretty good job predicting eclipses as if there really was something called ‘the position of the moon,’ even when nobody (as far as we know) was looking at it. There is a conventional quantum-mechanical explanation for this, as well: the correspondence principle, which states that the predictions of quantum mechanics in the limit of a very large number of particles (a macroscopic body) approach those of classical Newtonian mechanics. This is one of those vague but invaluable rules of thumb that was formulated by Niels Bohr back in the salad days of quantum mechanics. If it sounds a little hand-wavy, that’s because it is.

    The vagueness of the correspondence principle prods a careful physicist into formulating a more precise version, or perhaps coming up with counterexamples. And indeed, counterexamples exist: namely, when the classical predictions for the system in question are chaotic. In chaotic systems, tiny differences in initial conditions grow into substantial differences in the ultimate evolution. It shouldn’t come as any surprise, then, that it is hard to map the predictions for classically chaotic systems onto average values of predictions for quantum observables. Essentially, tiny quantum uncertainties in the state of a chaotic system grow into large quantum uncertainties before too long, and the system is no longer accurately described by a classical limit, even if there are large numbers of particles.

    Some years ago, Wojciech Zurek and Juan Pablo Paz described a particularly interesting real-world example of such a system: Hyperion, a moon of Saturn that features an irregular shape and a spongy surface texture.

    The orbit of Hyperion around Saturn is fairly predictable; happily, even for lumpy moons, the center of mass follows a smooth path. But the orientation of Hyperion, it turns out, is chaotic — the moon tumbles unpredictably as it orbits, as measured by Voyager 2 as well as Earth-based telescopes. Its orbit is highly elliptical, and resonates with the orbit of Titan, which exerts a torque on its axis. If you knew Hyperion’s orientation fairly precisely at some time, it would be completely unpredictable within a month or so (the Lyapunov exponent is about 40 days). More poetically, if you lived there, you wouldn’t be able to predict when the Sun would next rise.

    So — is Hyperion oriented when nobody looks? Zurek and Paz calculate (not recently — this is fun, not breaking news) that if Hyperion were isolated from the rest of the universe, it would evolve into a non-localized quantum state over a period of about 20 years. It’s an impressive example of quantum uncertainty on a macroscopic scale.
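The flavor of the Zurek–Paz estimate is that a chaotic system stays effectively classical only for a time of order the Lyapunov time multiplied by the logarithm of its action in units of ħ. Plugging in order-of-magnitude guesses for Hyperion (mass, size, and tumbling rate below are rough figures assumed for illustration, not numbers from their paper) lands in the right ballpark:

```python
import math

hbar = 1.05e-34             # J s
lyapunov_time = 40 * 86400  # ~40 days, in seconds

# Crude angular-momentum scale for the tumbling: I * omega.
mass = 5.6e18               # kg (rough)
radius = 1.35e5             # m (rough mean radius)
I = 0.4 * mass * radius**2            # moment of inertia, uniform sphere
omega = 2 * math.pi / (21 * 86400)    # tumbling rate ~ once per few weeks

# Classicality survives only logarithmically long in the action / hbar ratio.
action_ratio = I * omega / hbar
t_quantum = lyapunov_time * math.log(action_ratio)

print(f"classical-to-quantum timescale ~ {t_quantum / 3.15e7:.0f} years")
```

The logarithm is the whole story: the action is a stupendous ~10⁵⁷ ħ, but chaos converts that into a mere factor of ~130 Lyapunov times, i.e. a couple of decades rather than the age of the universe.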

    Except that Hyperion is not isolated from the rest of the universe. If nothing else, it’s constantly bombarded by photons from the Sun, as well as from the rest of the universe. And those photons have their own quantum states, and when they bounce off Hyperion the states become entangled. But there’s no way to keep track of the states of all those photons after they interact and go their merry way. So when you speak about “the quantum state of Hyperion,” you really mean the state we would get by averaging over all the possible states of the photons we didn’t keep track of. And that averaging process — considering the state of a certain quantum system when we haven’t kept track of the states of the many other systems with which it is entangled — leads to decoherence. Roughly speaking, the photons bouncing off of Hyperion act like a series of many little “observations of the wavefunction,” collapsing it into a state of definite orientation.
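The "averaging over untracked photons" step can be made concrete in a two-state cartoon. Suppose each scattered photon ends up in one of two states depending on the moon's orientation, with overlap eps between those states (so a single photon only partially records the orientation). Tracing out n photons then suppresses the interference term of the moon's reduced density matrix by eps per scattering. This is a schematic toy, not the Zurek–Paz calculation:

```python
# Toy decoherence model: moon starts in an equal superposition of two
# orientations, so its density matrix is |+><+| with off-diagonal 0.5.
# eps = <E_up|E_down> is the overlap between the photon states correlated
# with each orientation; eps < 1 means partial which-orientation info.
eps = 0.9
n_photons = 50

# After tracing out n independent photons, the off-diagonal (interference)
# element of the reduced density matrix is 0.5 * eps^n.
coherence = 0.5 * eps ** n_photons

print(f"interference term after {n_photons} scatterings: {coherence:.2e}")
```

Even photons that individually learn almost nothing (eps close to 1) drive the coherence exponentially to zero, which is why a macroscopic body bathed in sunlight ends up in a well-defined orientation.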

    So, in the real world, not only does this particular moon (of Saturn) exist when we’re not looking, it’s also in a pretty well-defined orientation — even if, in a simple model that excludes the rest of the universe, its wave function would be all spread out after only 20 years of evolution. As Zurek and Paz conclude, “Decoherence caused by the environment … is not a subterfuge of a theorist, but a fact of life.” (As if one could sensibly distinguish between the two.)

    Update: Scientific American has been nice enough to publicly post a feature by Martin Gutzwiller on quantum chaos. Thanks due to George Musser.

  • Broken Symmetries, Mixing Flavors

I’m traveling, so you will have to rely on some of the many other physics bloggers talking about this year’s Nobel Prize in Physics: to Yoichiro Nambu, Makoto Kobayashi, and Toshihide Maskawa. Nambu was awarded for his work in spontaneous symmetry breaking, while Kobayashi and Maskawa were recognized for their work on flavor mixing between quarks. We’ve spoken about spontaneous symmetry breaking before; maybe someday we’ll blog about flavor mixing? The basic idea is that, when a quark decays via the weak interactions, it doesn’t just turn into a single other quark, but a mixture of three different quark flavors. A top quark, for example, can emit a W boson and turn mostly into a bottom quark, but there are trace amounts of down quark and strange quark in there as well.
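The "mostly bottom, trace amounts of down and strange" statement is quantified by the magnitudes of the CKM matrix elements: the decay probabilities go as |V_tq|². The values below are approximate measured magnitudes, quoted from memory for illustration:

```python
# Probabilities for a top quark's weak decay t -> W + q, proportional to
# the squared CKM matrix elements |V_tq|^2. Magnitudes are approximate.
V_td, V_ts, V_tb = 0.0087, 0.040, 0.999

probs = {"b": V_tb**2, "s": V_ts**2, "d": V_td**2}
total = sum(probs.values())   # normalize (unitarity makes this ~1 anyway)
for flavor, p in probs.items():
    print(f"t -> W + {flavor}: {p / total:.4%}")
```

The bottom quark gets more than 99.8% of the decays, with strange and down sharing the tiny remainder — "trace amounts" indeed.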

    Sadly, this is going to be one of those prizes which causes controversy, as some worthy winners were left out — the Nobel charter places a strict limit of three Laureates per Prize. Nambu’s Prize could easily have been shared with Jeffrey Goldstone, another pioneer of spontaneous symmetry breaking. And the Prize for Kobayashi and Maskawa could easily have been shared with Nicola Cabibbo, who worked out the case of two quark generations before it was generalized to three generations by Kobayashi and Maskawa. (There are, to be sure, important differences in the case of three generations; but it’s not called the CKM matrix for nothing.)

“Sadly,” that is, because the three winners are richly deserving, and shouldn’t have their recognition sullied by bickering over who else should have won. It’s a downside of prizes in general: not everyone can win, not even everyone who deserves to. But hopefully it gets some people on the street excited about the world of broken symmetries.

  • Does Space Expand?

    There seems to be something in the air these days that is making people speak out against the idea that space is expanding. For evidence, check out these recent papers:

    The kinematic origin of the cosmological redshift
    Emory F. Bunn, David W. Hogg

    A diatribe on expanding space
    J.A. Peacock

    Expanding Space: the Root of all Evil?
    Matthew J. Francis, Luke A. Barnes, J. Berian James, Geraint F. Lewis

    Admittedly, my first sentence is unfair. The correct way to paraphrase the underlying argument here is to say that “space is expanding” is not the right way to think about certain observable properties of particles in general-relativistic cosmologies. These aren’t crackpots arguing against the Big Bang; these are real scientists attacking the “Does the Earth move around the Sun?” problem. I.e., they are asking whether these are the right words to be attaching to certain indisputable features of a particular theory.

    Respectable scientific theories are phrased as formal systems, usually in terms of equations. But most of us don’t think in equations, we think in words and/or pictures. This is true not only for non-specialists interested in science, but for scientists themselves; we’re not happy to just write down the equations, we want sensible ways to think about them. Inevitably, we “translate” the equations into natural-language words. But these translations aren’t the original theory; they are more like an analogy. And analogies tend to break under pressure.

    So the respectable cosmologists above are calling into question the invocation of expanding space in certain situations. Bunn and Hogg want to argue against a favorite cosmological talking point, that the cosmological redshift is not an old-fashioned Doppler shift, but a novel feature of general relativity due to the expansion of space. Peacock argues against the notion of expanding space more generally, admitting that while it is occasionally well-defined, it can often be exchanged for ordinary Newtonian kinematics by an appropriate choice of coordinates.

    They each have a point. And there are equally valid points for the other side. But it’s not anything to get worked up about. These are not arguments about the theory — everyone agrees on what GR predicts for observables in cosmology. These are only arguments about an analogy, i.e. the translation into English words. For example, the motivation of B&H is to do away with confusions in students caused by the “rubber sheet” analogy for expanding space. Taken too seriously, thinking of space as an expanding rubber sheet convinces students that the galaxy should be expanding, or that Brooklyn should be expanding — and that’s not a prediction of GR, it’s just wrong. In fact, they argue, it is perfectly possible to think of the cosmological redshift as a Doppler shift, and that’s what we should do.
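    The Bunn and Hogg viewpoint can be checked numerically in a toy model (my sketch, not their calculation): chain together many tiny Doppler shifts between nearby comoving observers along the photon’s path, and compare with the textbook answer 1 + z = a(observed)/a(emitted). I use a matter-dominated scale factor a(t) = t^(2/3) in arbitrary units, purely as an assumed example.

    ```python
    # Toy matter-dominated universe: a(t) = t**(2/3), so H = adot/a.
    a = lambda t: t ** (2.0 / 3.0)
    H = lambda t: (2.0 / 3.0) / t

    t_emit, t_obs, N = 1.0, 8.0, 100_000
    dt = (t_obs - t_emit) / N

    # In time dt the photon crosses proper distance c*dt; the comoving
    # observer at the far end recedes from the near one at dv = H*(c*dt),
    # handing the photon a tiny Doppler factor (1 + dv/c).  The c's cancel.
    factor = 1.0
    for i in range(N):
        t = t_emit + (i + 0.5) * dt  # midpoint of each step
        factor *= 1.0 + H(t) * dt

    print(f"accumulated Doppler shifts: 1+z = {factor:.5f}")
    print(f"scale-factor ratio:         1+z = {a(t_obs) / a(t_emit):.5f}")
    ```

    Both come out at 1 + z = 4 (here a grows by a factor of 8^(2/3) = 4), which is the point: the “stretching of space” answer and the “sequence of Doppler shifts” answer are the same prediction, described in different words.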

    Well, maybe. On the other hand, there is another pernicious mistake that people tend to make: the tendency, quite understandable in Newtonian mechanics, to talk about the relative speed between two far-away objects. Subtracting vectors at distinct points, if you like. In general relativity, you just can’t do that. And realizing that you just can’t do that helps avoid confusions along the lines of “Don’t sufficiently distant galaxies travel faster than light?” And reifying a distinction between the Doppler shift and the cosmological redshift is a good first step toward appreciating that you can’t compare the velocities of two objects that are far away from each other.
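    To see how the naive “relative speed of far-away objects” reading goes wrong, plug distances into the Hubble law v = H0 d. Beyond the Hubble distance c/H0, the number you get exceeds c; the arithmetic below uses an assumed round value of H0 and is only meant to exhibit the confusion, since that v is not a velocity anyone measures locally.

    ```python
    # Assumed round-number Hubble constant; not a precision value.
    H0 = 70.0        # km/s/Mpc
    c = 299_792.458  # km/s

    # Beyond the Hubble distance c/H0, the Hubble-law "velocity"
    # formally exceeds the speed of light.
    d_hubble = c / H0
    print(f"Hubble distance ~ {d_hubble:.0f} Mpc")

    for d in (1000, 4000, 6000):  # distances in Mpc
        v = H0 * d
        print(f"d = {d:4d} Mpc -> naive 'v' = {v / c:.2f} c")
    ```

    The lesson isn’t that galaxies break relativity; it’s that subtracting velocity vectors at widely separated points isn’t a well-defined operation in GR, so the superluminal-looking number doesn’t mean what the Newtonian intuition says it means.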

    The point is, arguments about analogies (and, by extension, the proper words in which to translate some well-accepted scientific phenomenon) are not “right” or “wrong.” The analogies are simply “useful” or “useless,” “helpful” or “misleading.” And which of these categories they fall into may depend on the context. Personally, I think “expanding space” is an extremely useful concept. My universe will keep expanding.

  • Templeton and Skeptics

    On the theory that it is good to mention events before they happen, so that interested parties might actually choose to attend, check out the upcoming Skeptics Society conference: Origins: the Big Questions. It will be at Caltech, and will take just one day, Saturday October 4, with a pre-conference dinner the previous night, Friday the 3rd. The day’s events are divided into two parts. In the morning you get a bunch of talks on the origins of big things — I’ll be talking on the origin of time, Leonard Susskind on the origin of the laws of physics, Paul Davies on the origin of the universe, Donald Prothero on the origin of life, and Christof Koch on the origin of consciousness.

    Then in the afternoon they change gears, and start talking about science and religion. Names involved include Stuart Kauffman, Kenneth Miller, Nancey Murphy, Michael Shermer, Philip Clayton, Vic Stenger, and Hugh Ross. It’s this part of the event that has stirred up a tiny bit of controversy, as it is co-sponsored by the John Templeton Foundation, famous appliers of lipstick to the pig that is the interface between science and religion. It’s legitimate to wonder why the Skeptics Society is getting mixed up with Templeton at all, and it’s been discussed a bit in our beloved blogosphere: see Bad Astronomy, Pharyngula, and Richard Dawkins.

    I am on the record as saying that scientists should be extremely leery of accepting money from organizations with any sort of religious orientation, and Templeton in particular. (Happily, in this case the speakers aren’t getting any money at all, so at least that temptation wasn’t part of the calculation.) But it’s by no means a cut-and-dried issue, as we’ve seen in discussions of the Foundational Questions Institute.

    Personally, I prefer not to have the chocolate of my science mixed up with the peanut butter of somebody else’s religion, and certainly not without clear labeling — peanut allergies can be pretty severe. But if someone wants to explicitly put on a peanut butter cup conference, that’s fine, and I don’t have any problem with participating. The problem with the Templeton Foundation is not that they coerce scientists into repudiating their beliefs through the promise of piles of cash; it’s that, by providing easy money to promote certain kinds of discussions, those discussions begin to seem more prominent and important than they really are. Perhaps, without any Templeton funding, the Origins conference would have devoted much less time to the science-and-religion questions, leaving much more time for interesting science discussions. This would have given outsiders a more accurate view of the role that religion plays in current scientific work on these foundational questions: to wit, none whatsoever.

    The Templeton Foundation has every right to exist, and sponsor conferences. And there is undoubtedly a danger among atheists that they get caught up in a “holier than thou” competition — “I’m so atheist that I won’t even talk to people if they believe in God!” Which gets a little silly. I don’t think there’s anything explicitly wrong with the Origins conference; the Templeton-sponsored part is clearly labeled and set off from the rest, and it might end up being interesting. (Also, the conference concludes with Mr. Deity — how awesome is that?) Michael Shermer’s own take is here. But I look forward to a day when discussions of deep questions concerning the origin of the universe and of life can take place without the concept of God ever arising.

  • The Domino Effect

    I gave a talk yesterday at the Center for Inquiry branch here in LA. It was a popular-level spiel on The Origin of the Universe and the Arrow of Time; click for slides. If I had been thinking, I would have advertised the existence of the talk before I had given it, rather than afterward. Either that, or I was trying to smoke out time-travelers.

    But the real reason I’m even bringing it up is to give credit to this great YouTube video, found via Swans on Tea.

    I was zipping through blogs yesterday morning while drinking coffee and preparing for the upcoming talk, when up popped this wonderful illustration of entropy and the arrow of time, which naturally I showed at the talk. And it features a kitty. (Schrödinger has his own cat, why shouldn’t Boltzmann?)