Dark Energy: Still a Puzzle

The arrow of time wasn’t the only big science problem garnering media attention last week: there was also a claim that dark energy doesn’t exist. See Space.com (really just a press release), USA Today, and a bizarre op-ed in the Telegraph saying that maybe this means global warming isn’t real either, so there.

The reports are referring to a paper by mathematicians Blake Temple and Joel Smoller, which is behind a paywall at PNAS but publicly available on the arXiv. (And folks wonder why journals are dying.) Now, some of my best friends are mathematicians, and in this paper they do the kind of thing that mathematicians are trained to do: they solve some equations. In particular, they solve Einstein’s equation of general relativity, for the particular case of a giant spherical “wave” in the universe. So instead of a universe that looks basically the same (on large scales) throughout space, they consider a universe with a special point, so that the density changes as you move away from that point.

Then — here’s the important part — they put the Earth right at that point, or close enough. And then they say, “Hey! In a universe like that, if we look at how fast distant galaxies and supernovae are receding from us, we can fit the data without any dark energy!” That is, they can cook up a result for distance vs. redshift in this model that looks like it would in a smooth model with dark energy, even though there’s nothing but ordinary (and dark) matter in their cosmology.

There are three things to note about this result. First, it’s already known; see e.g. Kolb, Marra, and Matarrese, or Clifton, Ferreira, and Land. In fact, I would argue that it’s kind of obvious. When we observe distant galaxies, we don’t see the full three dimensions of space at every moment in time; we can only look back along our own light cone. If the universe isn’t homogeneous, but is only spherically symmetric around our location, I can arrange the velocities of galaxies along that past light cone to do whatever I want. We could have them spell out “Cosmic Variance” in Morse code if we so desired. So it’s not very surprising we could reconstruct the observed distance vs. redshift curve of an accelerating universe; you don’t have to solve Einstein’s equation to do that.
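
For concreteness, here is a minimal numerical sketch (my own illustration, not from the paper) of the smooth-universe curve that any such construction has to mimic: luminosity distance vs. redshift in a flat universe with and without a cosmological constant, using round fiducial values (H0 = 70 km/s/Mpc, Omega_m = 0.3).

```python
import numpy as np
from scipy.integrate import quad

H0 = 70.0        # Hubble constant, km/s/Mpc (assumed fiducial value)
C  = 299792.458  # speed of light, km/s

def lum_dist(z, omega_m, omega_lambda):
    """Luminosity distance in Mpc for a flat FRW universe."""
    integrand = lambda zp: 1.0 / np.sqrt(omega_m * (1 + zp)**3 + omega_lambda)
    comoving, _ = quad(integrand, 0.0, z)
    return (1 + z) * (C / H0) * comoving

for z in (0.5, 1.0):
    lcdm = lum_dist(z, 0.3, 0.7)  # accelerating universe with dark energy
    eds  = lum_dist(z, 1.0, 0.0)  # matter-only (Einstein-de Sitter) universe
    print(f"z = {z}: d_L = {lcdm:.0f} Mpc (with Lambda) vs {eds:.0f} Mpc (matter only)")
```

At a given redshift the supernovae come out dimmer (farther away) than a matter-only universe allows; the inhomogeneous models reproduce that same curve by arranging the matter around us rather than by adding dark energy.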

Second, do you really want to put us right at the center of the universe? That’s hard to rule out on the basis of data — although people are working on it. So it’s definitely a possibility to keep in mind. But it seems a bit of a backwards step from Copernicus and all that. Most of us would like to save this as a move of last resort, at least while there are alternatives available.

Third, there are perfectly decent alternatives available! Namely, dark energy, and in particular the cosmological constant. This idea not only fits the data from supernovae concerning the distance vs. redshift relation, but a bunch of other data as well (cosmic microwave background, cluster abundances, baryon acoustic oscillations, etc.), which this new paper doesn’t bother with. People should not be afraid of dark energy. Remember that the problem with the cosmological constant isn’t that it’s mysterious and ill-motivated — it’s that it’s too small! The naive theoretical prediction is larger than what’s required by observation by a factor of 10^120. That’s a puzzle, no doubt, but setting it equal to zero doesn’t make the puzzle go away — then it’s smaller than the theoretical prediction by a factor of infinity.
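
Where does that number come from? A quick order-of-magnitude check (an illustrative sketch, not a derivation; the exact power of ten depends on conventions, which is why 10^120 is always quoted as a round figure):

```python
import math

G    = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34        # reduced Planck constant, J s
C    = 2.998e8          # speed of light, m/s
H0   = 70e3 / 3.086e22  # Hubble constant in s^-1 (assuming 70 km/s/Mpc)

rho_planck = C**5 / (HBAR * G**2)           # Planck density, ~5e96 kg/m^3
rho_crit   = 3 * H0**2 / (8 * math.pi * G)  # critical density today
rho_vac    = 0.7 * rho_crit                 # observed vacuum density, ~70% of critical

print(f"naive / observed ~ 10^{math.log10(rho_planck / rho_vac):.0f}")
```

With these conventions the ratio comes out near 10^123; “10^120” is shorthand for “ridiculously large,” not a precision statement.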

The cosmological constant should exist, and it fits the data. It might not be the right answer, and we should certainly keep looking for alternatives. But my money is on Λ.

55 Comments


  1. cybertraveller777

    What if an enormous energy field exists outside the void, with a density so great that all matter is pulled toward it, and it is the source of gravity throughout the void? A fraction of that energy was set into the center of the void billions of years ago, and all matter has since been impelled by it, drawn out toward it.

  2. There are many other theories about gravity, and maybe about space and time, that can also fit the data. Do they all include Λ, i.e., the prediction of dark energy?

  3. As far as I know, nobody has very seriously tried to fit the data for an _anisotropy_ of dark energy yet, so if we weren’t really at the precise center of such a “wave”, that would be consistent with the data as well. (I bug the ESSENCE and SNLS folks about this every time they give a talk — can’t they at least fit for a dipole moment, for goodness sakes…?!)

  4. Point 1 – My understanding of the significance of this paper is that it is the first time expanding wave solutions to the FRW metric have been derived from first principles (i.e. without having to provide an arbitrary acceleration parameter).

    Point 2 – Putting us at the center of the universe is an enormous drawback to their model, though it may satisfy the creationists. However, to be fair, the authors do note that there may be multiple expanding waves in the universe, and we could be at the center of one of those rather than one ‘centered’ on the big bang.

    Point 3 – Despite dark energy being a perfectly valid alternative, it remains unsatisfying for some (as does dark matter), simply because of the 10^120 factor that must be overcome with the cosmological constant. Philosophically, you’re correct and that shouldn’t necessarily be a barrier to its legitimacy, given that the discovery is so recent. Nevertheless, I don’t think this is a good criticism of their motivation to pursue an alternative explanation.

  5. Dark energy (or the Big Wave), dark matter, and inflation are all ad hoc. All of these hypotheses are necessary to account for observations that do not otherwise make sense. This does not bode well for the standard model. When the correct cosmological model comes along some day, there should be far fewer observational surprises and add-on hypotheses such as this one.

  6. Is it possible that dark matter is simply back-scatter of dark energy from within black holes, or at least some complementary manifestation of the dark energy within them?

    The snag is that to allow dark energy to leave the hole the first idea would require this energy to move at superluminal speeds, which seems a big no-no even if it could convey no information.

    Also, one would have to assume its speed could vary, so that dark energy exiting at just the right speed, out of a distribution of possible speeds, would “slow to a standstill” (or approach its minimum speed) in the vicinity of the black hole and tend to pile up there as dark matter.

    All the same, dark energy travelling at superluminal speed(s) would also be one possible explanation of its apparent uniformity outside black holes.

  7. @miller#5
    In quantum mechanics, the energy of a harmonic oscillator (or any simple system possessing a single stable equilibrium) cannot be lower than a certain positive number called the zero-point energy.

    A field theory (electromagnetism, for instance, is a field theory) is similar. The field equal to zero everywhere is an equilibrium, which classically has zero energy. When quantum effects are taken into account, the minimum energy can no longer be zero. There will now be a positive zero-point energy density. If this energy density is constant everywhere, it is indistinguishable from a cosmological constant.

    A first naive assessment of this energy density gives infinity. A slightly less naive assessment notices that the infinity can be eliminated if we discard the effects of field oscillations on Planck time and length scales (postulating some as yet unknown physics as the mechanism). The resulting number is on the order of a Planck mass per Planck volume, which happens to be about 10^120 times the observed value; a back-of-the-envelope version of this estimate is sketched below. Even more sophisticated assessments note that the previously described prescription for eliminating an infinite zero-point energy density is not unique, and that generalizations thereof (each postulating slightly different as yet unknown physics as mechanisms) can predict any value for this energy density, including the observed one.

    With this outlook, the cosmological constant problem can be formulated as the lack of constraints on the theoretical prediction of the zero-point energy density in the universe, other than the observed value of the cosmological constant.
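
    A back-of-the-envelope version of that estimate (the standard textbook form, under the assumptions of a massless field and a hard momentum cutoff; not spelled out in the comment above) is

    $$
    \rho_{\rm vac} \sim \int_0^{k_{\max}} \frac{d^3k}{(2\pi)^3}\,\frac{\hbar\,\omega_k}{2}
    = \frac{\hbar\,c\,k_{\max}^4}{16\pi^2}, \qquad \omega_k = c\,k ,
    $$

    and taking $k_{\max} \sim 1/\ell_P$ gives an energy density of order one Planck mass per Planck volume, i.e. roughly 10^120 times the observed value; a lower cutoff gives a smaller (but still enormous) number.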

  8. Noname and Albert,

    The arguments are correct, but this is one of the rare papers I found by searching the literature that make the arguments solid:

    Neutralino dark matter stars can not exist.
    De-Chang Dai, Dejan Stojkovic
    Published in JHEP 0908:052, 2009.
    e-Print: arXiv:0902.3662 [hep-ph]

    Best,
    Anna

  9. Copernicus gave us a first approximation. I can live with that. Galileo and Newton gave us a first approximation, and Special Relativity looked to the 2nd-order terms. Kepler gave us a first approximation, and I know the man at JPL who tweaks the solar system ephemeris with GR corrections. Maxwell’s equations (as redone by Heaviside) are a first approximation. The Schrödinger equation is a first approximation, with Dirac giving the more general case.

    Dark Energy data is not conclusive either for or against. Fine. Let the scientific method operate, without crackpottery or trollish digressions on religious institutions. Thank you, Sean, for yet another clear, level-headed survey of a controversial subject.

  10. What I don’t get is why a “naive prediction” gets any stock. QFT makes lots of predictions, and the size of the cosmological constant isn’t one of them. That’s the end of the story.

    Why is this “naive prediction” any better than anybody else’s wild speculation about what might be true, based on vague physical reasoning?

    Is there any analog, historically, of a “naive prediction” like this? Was it at all useful? I doubt it.

  11. QFT certainly predicts the value of the cosmological constant. It’s just a constant term in the action, which is renormalized up to a cutoff where new physics kicks in. So if you think we understand physics up to 1 TeV, the vacuum energy density should be at least 60 orders of magnitude larger than we observe; if we understand it up to the Planck scale, it should be at least 120 orders of magnitude bigger. It would be nice to know why the prediction is wrong, because it clearly is, but it’s certainly a prediction.
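
    For what it’s worth, both of those numbers check out on the back of an envelope. A quick illustrative sketch (round values of my choosing; the observed vacuum energy scale of roughly 2 x 10^-3 eV is the fourth root of the measured density):

    ```python
    import math

    E_OBS    = 2.3e-3   # observed vacuum energy scale, eV (rho_Lambda ~ E_OBS**4)
    E_TEV    = 1.0e12   # 1 TeV, in eV
    E_PLANCK = 1.22e28  # Planck energy, in eV

    for name, cutoff in (("TeV cutoff", E_TEV), ("Planck cutoff", E_PLANCK)):
        # vacuum energy density scales as the fourth power of the cutoff scale
        orders = 4 * math.log10(cutoff / E_OBS)
        print(f"{name}: ~10^{orders:.0f} times the observed vacuum density")
    ```

    This prints roughly 10^59 for the TeV cutoff and 10^123 for the Planck cutoff, matching the “at least 60” and “at least 120” orders of magnitude quoted above.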

  12. @1, As a Catholic, such a solution does make me happy. What would make me happier, however, is for theories to be explored regardless of how daft they may seem; don’t just discount them.

    Remember that Georges Lemaître was going against the trend of a static universe when he proposed the primeval atom theory, and he was a Catholic priest.

  13. @Ian#39:
    Is nothing too daft? Even the moon made of cheese theory? In the long run, physicists tend to be rather good at figuring out how well a theory works, regardless of its origin. So, if certain theories do get discounted, it’s usually for good reason. Or did you have a specific example in mind?

    Also, I’ll throw in my own non sequitur: Newton was an alchemist and a theologian much of the time, while Laplace was neither of those, nor a believer.

  14. This result reminds me a bit of Poincaré’s disc-world thought experiment. I’m not sure how that might illuminate things, though, other than reminding us of the ever-present underdetermination of theory by evidence.

  15. About Global Warming, see also here:

    Bottle says:
    9:32 AM
    Hey, can we stay on topic? Which is, “Global warming is caused by the cosmological constant.”

    🙂

  16. I don’t know about anyone else, but I have problems with the whole Dark Matter/Dark Energy idea.

    Correct me if I’m wrong, but the reason the term Dark is used is that this stuff has been (thus far) undetectable. Not just dark, but undetectable. We’re inferring the presence of DM/DE based upon observations that are fundamentally linked to gravity. The things we can directly detect aren’t behaving in the way we expect. Thus a clever soul develops DM, and then later DE, to explain away the discrepancies.

    Now, it’s not a fatal flaw at present. However, someone had better start coming up with DIRECT observations of this DM/DE sometime soon. Because as long as DM/DE remains truly dark, it remains in the realm of the speculative. It’s not enough to keep saying “oh well, we ruled out 19 things that were prospective explanations for the Dark things, leaving another 21 things it could be.”

    I’m not telling anyone here anything they don’t already know, but it’s worth restating: a theory worthy of the name has to be falsifiable. Eventually there has to be some direct observational evidence of DM/DE, or it doesn’t exist, and we’ll have to face the fact that some other mechanism is at work; for instance (and speculatively), a flaw in our understanding of gravity. Which is not such a crazy idea when you really consider that currently we are expected to believe in undetectable Dark Matter and undetectable Dark Energy.

    Well, if that’s the case, then I have some undetectable leprechauns that I want to introduce you to!

  17. Hi Sean,

    I’m no expert in QFT, but it’s a revelation to me that something cutoff-dependent could be called an observable. This means that when you formulate a QFT, you must give not only a Lagrangian but also a cutoff energy scale. I can’t stop you from calling this a “QFT”, but it seems silly to me. You’ve introduced a new constant of nature with a completely bizarre interpretation, and the only prediction it features in is wrong.

    I prefer the version of QFT I was taught, which adds only h-bar to the classical list of constants and makes no false predictions.

    -Sam

  18. QFT certainly predicts the value of the cosmological constant.

    That’s just not true. The cc is a superrenormalizable quantity, and you can set it to any value you want. The problem is that in order to make it small, you have to (just like the Higgs mass) tune the bare value to cancel the large quantum corrections. But there’s nothing in QFT to stop you from doing that; it’s only our philosophical bias against fine-tuning that makes us not want to.
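
    Schematically (my notation, not spelled out in the comment above), with an ultraviolet cutoff $\Lambda_{\rm UV}$ the statement is

    $$
    \rho_\Lambda^{\rm obs} = \rho_\Lambda^{\rm bare} + c\,\Lambda_{\rm UV}^4 ,
    $$

    so obtaining the tiny observed value with $\Lambda_{\rm UV} \sim M_P$ requires the bare term to cancel the quantum correction to roughly 120 decimal places. Nothing in the formalism forbids that; it just offends our sense of naturalness.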

  19. Actually, what are the chances that we live sufficiently close to the center of a nonhomogeneous universe, compared to the chance that we live in one of the 10^500 string vacua? I suppose the distance-redshift relation only really kicks in beyond our local group, so we would need to be within a radius of ~1 Mly, or ~10^-4 of the radius of the universe. Since the probability goes as the volume, the chances are of order (10^-4)^3 = 10^-12. Not that bad at all, and I’m sure some anthropic argument can easily be cooked up.

  20. Thanks very much, Anna (#33). While not able to follow the paper in as much detail as I would like at this point, I certainly found it instructive. This is going to be a steep learning curve for me, but a very enjoyable climb.

  21. Off topic, but on journals, it is still important in a lot of fields to publish in Nature, Science or PNAS. Of those three, PNAS actually has the most non-terrible Open Access policy. I believe all articles are freely available after six months, and authors can choose to pay for their articles to be immediately OA. Usually this would be paid out of a grant (now a standard expense for funding agencies if PIs choose to put it in proposals), and so it’s not as crazy as it might sound.

    Physics has been ahead of the OA game for a while, but I think that the kind of model used by PNAS might be more useful in the longer-term. The process of producing a high quality journal has costs—whereas the arXiv is free, but has some disadvantages. I believe that the most obvious difference, peer review, affects the process in a number of ways. First, there is the idea of quality control—or for a journal of PNAS’s perceived stature, a stamp of quality. If a paper is published in Nature/Science/PNAS, it will be taken very seriously (in most fields) and a lot of people will read it, even if the author is relatively unknown. In physics, my memory is that people look on the arXiv each morning for papers by (a) famous people (b) people they know or (c) papers with something relevant in the title. There are other papers out there which are potentially important and worth reading, but don’t fit these criteria, and the peer review process (ideally) helps you find those papers.

    Second, I think the importance of the review process in fields which don’t make use of the arXiv means that papers are better written. I don’t know the numbers, but I would bet that more time is spent preparing a paper for submission to PNAS (I mean, after the science is already done) than is spent preparing your upload to the arXiv. You might argue that this slows down science (it probably does keep you from getting on with the next project), but I think better-written papers contribute in an important way to the communication of scientific ideas.

    As a final point (again, I know it’s off-topic; a Cosmic Variance post on Open Access, perhaps?): is the physics arXiv model sustainable in the long term? At the moment physics has what I see as an uneasy balance, with peer review coexisting with automatic OA on the arXiv. That means you kind of get the benefits above for free: papers are immediately available, but in the medium term go through the review process. But won’t libraries stop buying those journals eventually, leaving physics with only the arXiv? Maybe that’s what a lot of physicists would be happy with, but I think there is a cost to pay. Certainly in other fields which don’t already have an arXiv, I think the PNAS model of authors paying for OA will be the way forward.

  22. Is there a possibility that dark matter doesn’t exist? You bet! Dark matter isn’t necessary if the universe only appears to be expanding and is actually in full contraction. Relativity supports the notion that two viewers on separate rockets cannot determine with any accuracy who is traveling and who is not. My suggestion is simply that after inflation the universe went into contraction. This may also explain why entropy is growing: the closer we get to the big crunch, the higher entropy will grow.
