Evolving dark energy?

Don’t be surprised if you keep reading astronomy stories in the news this week — the annual meeting of the American Astronomical Society is underway in Washington DC, and it’s common for groups to announce exciting results at this meeting. Today there was a provocative new claim from Bradley Schaefer at Louisiana State University — the dark energy is evolving in time! (Read about it also from Phil Plait and George Musser.)

Short version of my own take: interesting, but too preliminary to get really excited. Schaefer has used gamma-ray bursts (GRBs) as standard candles to measure the distance vs. redshift relation deep into the universe’s history — up to redshifts greater than 6, whereas ordinary supernova studies are lucky to get much past redshift 1. To pull this off, you want “standard candles” — objects that are really bright (so you can see them far away) and have a known intrinsic luminosity (so you can infer their distance from how bright they appear). True standard candles are hard to find, so we settle for “standardizable” candles — objects that may vary in brightness, but in a way that can be correlated with some other observable property, and therefore accounted for. The classic example is Cepheid variables, which have a relationship between their oscillation period and their intrinsic brightness.
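The “infer their distance from how bright they appear” step is just the inverse-square law, usually packaged as the distance modulus m − M = 5 log10(d / 10 pc). A minimal sketch (the function name and example magnitudes are illustrative, not from Schaefer’s analysis):

```python
def luminosity_distance_pc(apparent_mag, absolute_mag):
    """Invert the distance modulus m - M = 5*log10(d / 10 pc)
    to recover the luminosity distance d in parsecs."""
    return 10.0 ** ((apparent_mag - absolute_mag) / 5.0 + 1.0)

# A candle of absolute magnitude -19.3 (roughly typical of a Type Ia
# supernova at peak) seen at apparent magnitude 5.7 sits at
# 10^6 pc, i.e. one megaparsec.
print(luminosity_distance_pc(5.7, -19.3))
```

This is why a “standardizable” candle is enough: as long as the intrinsic luminosity M can be recovered from some other observable, the distance follows.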

Certain supernovae, known as Type Ia’s, have quite a nice correlation between their peak brightness and the time it takes for them to diminish in brightness. That makes them great standardizable candles, since they’re also really bright. GRBs are much brighter, but aren’t nearly so easy to standardize — Schaefer used a model in which five different properties were correlated with peak brightness (details). The result? The best fit is a model in which the dark energy density (energy per cubic centimeter) is gradually growing with time, rather than being strictly constant.

[Figure: GRB Hubble Diagram]

If it’s true, this is an amazingly important result. There are four possibilities for why the universe is accelerating: a true cosmological constant (vacuum energy), dynamical (time-dependent) dark energy, a modification of gravity, or something fundamental being missed by all of us cosmologists. The first possibility is the most straightforward and most popular. If it’s not right, the set of theoretical ideas that physicists pursue to help explain the acceleration of the universe will be completely different than if it is right. So we need to know the answer!

What’s more, the best-fit behavior for the dark energy density seems to have it increasing with time, as in phantom energy. In terms of the equation-of-state parameter w, it is less than -1 (or close to -1, but with a positive derivative w’). That’s quite bizarre and unexpected.
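The connection between w and the density history is the standard scaling law: a component with constant equation-of-state parameter w has density evolving as rho ∝ a^(-3(1+w)) with scale factor a, so w = -1 gives a constant density and w < -1 (“phantom”) gives a density that grows as the universe expands. A toy illustration (the function name is mine):

```python
def dark_energy_density(a, w, rho0=1.0):
    """Energy density of a component with constant equation-of-state
    parameter w, relative to its value rho0 today (scale factor a = 1):
    rho(a) = rho0 * a**(-3 * (1 + w))."""
    return rho0 * a ** (-3.0 * (1.0 + w))

# w = -1 (cosmological constant): density unchanged as space expands
print(dark_energy_density(2.0, -1.0))
# w = -1.2 (phantom): density grows with the expansion
print(dark_energy_density(2.0, -1.2))
# w = -0.9 (quintessence-like): density slowly dilutes
print(dark_energy_density(2.0, -0.9))
```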

[Figure: GRB w plot]

As I said, at this point I’m a bit skeptical, but willing to wait and see. Most importantly, the statistical significance of the finding is only 2.5σ (about 98.8% confidence), whereas the informal standard in much of physics for discovering something is 3σ (99.7% confidence). As a side worry, at these very high redshifts the effect of gravitational lensing becomes crucial. If the light from a GRB passes near a mass concentration like a galaxy or cluster, it can easily be amplified in brightness. I am not really an expert on how important this effect is, nor do I know whether it’s been taken into account, but it’s good to keep in mind how little we know about GRBs and the universe at high redshift more generally.
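For reference, the correspondence between σ-levels and confidence percentages can be computed directly from the Gaussian error function (quoted percentages vary slightly depending on one-sided vs. two-sided conventions); a minimal stdlib sketch:

```python
from math import erf, sqrt

def two_sided_confidence(n_sigma):
    """Fraction of a Gaussian distribution lying within
    +/- n_sigma of the mean."""
    return erf(n_sigma / sqrt(2.0))

print(round(two_sided_confidence(2.5), 4))  # about 0.9876
print(round(two_sided_confidence(3.0), 4))  # about 0.9973
```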

So my betting money stays on the cosmological constant. But the odds have shifted, just a touch.

Update: Bradley Schaefer, author of the study, was nice enough to leave a detailed comment about what he had actually done and what the implications are. I’m reproducing it here for the benefit of people who don’t necessarily dip into the comments:

Sean has pointed me to this blog and requested me to send along any comments that I might have. His summary at the top is reasonable.

I’d break my results into two parts. The first part is that I’m putting forward a demonstration of a new method to measure Dark Energy, using GRBs as standard candles out to high redshift. My work is all rather standard, with most everything I’ve done following what is already in the literature.

The GRB Hubble Diagram has been in print since 2003, with myself and Josh Bloom independently presenting early versions in public talks as far back as 2001. Over the past year, several groups have used the GRB Hubble Diagram to start putting constraints on cosmology. This prior work has always used only one GRB luminosity indicator (various different indicators for the various papers) and no more than 17 GRBs (neglecting GRBs with only limits).

What is new here is that I am using much more data and directly addressing the question of whether the Dark Energy changes. In all, I am using 52 GRBs, and each GRB has 3-4 luminosity indicators on average. So I’ve got a lot more data, and this allows for a demonstration of the GRB Hubble Diagram as a new method.

The advantages of this new method are that it goes to high redshift (that is, it looks at the expansion history of the Universe over redshifts 1.7-6.3) and that it is impervious to extinction. Also, I argue that there should be no evolution effects, as the GRB luminosity indicators are based on energetics and light travel time (which should not evolve). Another advantage is that we have the data now, with the size of the database to be doubled within two years by HETE and Swift.

One disadvantage of the GRB Hubble Diagram is that the GRBs are lower in quality than supernovae. Currently my median one-sigma error bar is 2.6 times worse when comparing a single GRB to a single supernova. But just as with supernovae, I expect that the accuracy of GRB luminosities can be rapidly improved. [After all, in 1996 I was organizing debates between the graduate students as to whether Type Ia SNe were standard candles or not.] Another substantial problem that is hard to quantify is that our knowledge of the physical processes in GRBs is not perfect (and certainly much worse than what we know for SNe). It is rational and prudent for everyone to worry that there are hidden problems (although I currently know of none). A simple historical example is how Cepheids were found to have two types with different calibrations.

So the first part of my talk was simply presenting a new method for getting the expansion history of the Universe from redshifts up to 6.3. For this part, I am fairly confident that the method will work. Inevitably there will be improvements, new data, corrections, and all the usual changes (just as for the supernovae).

The second part of my talk was to point out the first results, which I could not avoid giving. It so happens that the first results point against the Cosmological Constant. I agree with Sean that this second part should not be pushed, for various reasons. Foremost is that the result is only 2.5-sigma.

Both parts of my results are being cast onto a background where various large groups are now competing for a new dedicated satellite.


44 thoughts on “Evolving dark energy?”

  1. I’m betting on a phantom universe. Everything has turned out so weirdly since 1998 that it would be perverse for the Universe to start behaving reasonably. Come on, Mother Nature, go the whole hog!

  2. Yeah I remember that ‘cos I made some stupid ill-thought-out remarks in the early part of the thread. Was hoping nobody would remember those…. sigh.

    -cvj

  3. Moshe, our best hope right now is that detailed measurements of the way in which primordial fluctuations lead to later perturbations in the matter density will cast additional light on the question and perhaps even allow us to distinguish between dark energy models and modifications of gravity.

  4. Hey, you are in charge of the eraser…(and you are exaggerating…), the thread developed into quite an informative discussion once the real experts joined in. What are the chances of getting such up-to-date specific information, from experts you don’t know personally, w/o the help of a blog?

  5. Actually, I try to limit my use of the power of deletion. If I make a stupid remark as part of a discussion, I leave it there as a record of what actually took place in the discussion…just like non-host contributors have their remarks frozen for all to see. It seems only fair to leave it there, as long as it is not offensive to anyone….. (or just soooooooo stupid…..)

    Ok Moshe: We’d better get back to the physics. Recall what happened on that thread of JoAnne’s that time. 🙂

    Cheers,

    -cvj

  6. If I make a stupid remark on the blog, I generally wait a couple of weeks and edit it so that it appears under one of my co-blogger’s names.

    Daniel Holz informs me that Schaefer did at least consider the effects of lensing and claims they are under control. On the other hand, George Musser quotes anonymous sources (just like politics!) as saying “It’s flat wrong” and “Don’t waste your time.” As usual, more data will tell.

  7. Layman’s interlude….

    O.K., so let’s assume for a moment that the data holds up. Then from the above commentary we are left with:

    1. dynamical time dependent dark energy

    Or

    2. some type of change in gravity itself

    Or

    3. ????? Something else

    2 Questions

    Is one of these 3 remaining alternatives the lead candidate and why?

    If it turns out that dark energy is time-varying, would it be possible to experimentally determine when it “began”? I use that term in a very loose, naive way deliberately, since if it varies in time, then at some point it may have been close to or equal to zero. I guess what I am saying is: could it be traced back to inflation, or would it have “begun” at some later time?

    Thanks,

    Elliot

  8. Elliot–

    One would assume that, in a time-variant dark energy model, the only time it truly ‘began’ was the big bang. Essentially, the universe would start with this soup of particles whose density is dominated by radiation, evolve into the current cold-dark-matter-dominated era, with the particles frozen into protons, electrons, and neutrinos and the radiation component negligible, and then evolve into a dark-energy-dominated era.

    The weirdness of this whole thing is that the evolution from radiation domination to matter domination comes from the fact that if you take a box full of photons and then expand the walls by some factor, the energy density of the photons falls as the fourth power of that factor, while the density of cold dark matter falls only as the third power. If this prediction is correct, then the density of dark energy would actually increase.
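For what it’s worth, the standard dilution laws are easy to sketch numerically: radiation energy density falls as the fourth power of the scale factor (number dilution plus redshift), while pressureless matter falls only as the third power. A toy sketch (the function names are mine):

```python
def radiation_density(a, rho0=1.0):
    """Photon energy density: number density dilutes as a**-3 and each
    photon redshifts as a**-1, so rho_rad scales as a**-4."""
    return rho0 * a ** -4

def matter_density(a, rho0=1.0):
    """Cold matter: particle number is conserved, so rho_m scales as a**-3."""
    return rho0 * a ** -3

# Double the scale factor: radiation drops by 16x, matter by only 8x,
# which is why matter eventually overtakes radiation as the universe expands.
print(radiation_density(2.0), matter_density(2.0))
```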

    One question: Has anyone tried to model all of this by representing the universe as a 4-D “bubble” in a 5-D spacetime with matter? Cause then one could just think of this as the effects of the pressure from the 5-D matter pushing in on the “bubble.”

  9. Yes Elliot, with the caveat that it is quite consistent with observations to have an oscillating dark energy model (i.e. one in which dark energy was important at many epochs), so long as the oscillations behave the right way.

    About the 5D question. People certainly have tried to explore whether cosmic acceleration can be due to extra dimensional dynamics. I can’t say there’s a counter-example, but nothing compelling has resulted yet to the best of my knowledge.

  10. I’ve a different question about dark matter – does galactic dark matter share the rotation of the galaxy? (If it does, why doesn’t it also hit the Cooperstock and T. problem?)

  11. Thanks Mark/BGS,

    Then we are to suppose that, if this is true, something even weirder than we thought is likely going on. D. E. was already pretty weird, but the cosmological constant idea seemed to at least sort of make sense.

    So what does this do to the “anthropic” approach to “predicting” the value of the cosmological constant?

    (we hear the sound of giggling even from a naive layperson in the background)

    Elliot

  12. Plato, thanks for that link. What people don’t want to see is quantum field theory concepts applied to cosmology. Assume quantum gravity is right. Gauge bosons, gravitons, are exchanged between masses to produce gravitational force.

    This alone is very predictive, because we know the speed of the gauge bosons (light speed, from tests of general relativity), so they’re coming from time-past. There seems to be some kind of fact-blindness which says that calculating gravity this way is just speculation or a pet theory, when QFT implies it. Suppose people just don’t want to see the facts: http://feynman137.tripod.com

  13. Interesting. GRBs as luminosity candles have been bandied around for years…in fact I spent a year (before I jumped ship to work with Sean) working with Dan Reichert, Carlos Graziani and Don Lamb trying to do one of the things Schaefer used: Variability vs. Luminosity. At that time there were claims that the more variable a GRB is, the more luminous it is. But those claims were made using really poor statistics. We wanted to do it right with good statistics, and it turned out to be a red herring. I think Dan spent more time working on it after that, but I stopped following the issue.

    So I guess I am a bit surprised to see it rear its head again; it could be that he used more recent and better-quality data (specifically HETE and Swift data, which were not available to us). I wonder what Don has to say about this…

  14. Elliot–

    Don’t worry about this; most of this stuff was completely new to me when I went into grad school.

    Using it to predict the CC is probably an overreach, although people say they do this. What it does give is an acceptable range for what the CC could be.

    The idea is, if the cosmological constant is too large, then it is impossible for galaxies to form–the CC causes stuff to fly apart, and if it is larger than a certain value, then gravitationally bound states don’t really exist, for the most part. If it is impossible for galaxies to form, then we wouldn’t be here to see them. Therefore, the CC must lie between zero and a number of order 10^-120 (in Planck units) if we are to exist. A bunch of other versions of this reasoning exist, where people look to see whether or not atoms can form, amino acids can begin to coalesce, etc.

  15. JustAnotherGradStudent

    I remember a few years ago some Australian group (Webb et al.) found time variation in the fine structure constant by looking at distant quasars (astro-ph/0012419). I haven’t heard any recent news on this front, but was wondering if perhaps the variation observed by Schaefer could be at all related to this previous observation (or the converse, rather). I know the Aussies checked for a lot of systematic errors, but I’m pretty sure they would have missed a varying lambda!

  16. Sean has pointed me to this blog and requested me to send along any comments that I might have. His summary at the top is reasonable.
    I’d break my results into two parts. The first part is that I’m putting forward a demonstration of a new method to measure Dark Energy, using GRBs as standard candles out to high redshift. My work is all rather standard, with most everything I’ve done following what is already in the literature.
    For this, what is new is that I am using

  17. Hi Eugene,

    I wonder what Don has to say about this…

    I think he’s fairly skeptical. He’s quoted in the NYT article if you want to check out what he says. 🙂

    As a GRB guy, I’ll have to say that I’m skeptical also. I’ve done a bit of work on the “standardizing” relations that Schaefer is using, and I have some big reservations about his method. He’s essentially trying a “kitchen sink” method of luminosity estimators: he combines six different estimators of varying reliability and maturity. He also includes many GRBs multiple times using different estimators. It’s really hard to say anything about what systematic errors might be plaguing this sort of analysis. And only a 2.5 sigma result on top of that? Hmmmm.

    One question I have for the real cosmologists on this blog is the usefulness of parametrizing the dark energy with w-prime (first order Taylor expansion) given the wide range of redshifts of the GRB sample (z = 0.1 – 6.3) and the relatively recent importance of DE. Thoughts on whether this is legit or not?

    I think that the GRB Hubble diagram will someday be a contendah in the cosmology game, but only with a larger sample of bursts and a single physically motivated and better-understood luminosity estimator. With time we should be able to do this right.

  18. ****Oops, this message is broken up by my accidently hitting a return after a tab. The network link here at the AAS meeting is slow and balky. The message will now be continued****

    The GRB Hubble Diagram has been in print since 2003, with myself and Josh Bloom independently presenting early versions in public talks as far back as 2001. Over the past year, several groups have used the GRB Hubble Diagram to start putting constraints on cosmology. This prior work has always used only one GRB luminosity indicator (various different indicators for the various papers) and no more than 17 GRBs (neglecting GRBs with only limits).

    What is new here is that I am using much more data and directly addressing the question of whether the Dark Energy changes. In all, I am using 52 GRBs, and each GRB has 3-4 luminosity indicators on average. So I’ve got a lot more data, and this allows for a demonstration of the GRB Hubble Diagram as a new method.

    The advantages of this new method are that it goes to high redshift (that is, it looks at the expansion history of the Universe over redshifts 1.7-6.3) and that it is impervious to extinction. Also, I argue that there should be no evolution effects, as the GRB luminosity indicators are based on energetics and light travel time (which should not evolve). Another advantage is that we have the data now, with the size of the database to be doubled within two years by HETE and Swift.

    One disadvantage of the GRB Hubble Diagram is that the GRBs are lower in quality than supernovae. Currently my median one-sigma error bar is 2.6 times worse when comparing a single GRB to a single supernova. But just as with supernovae, I expect that the accuracy of GRB luminosities can be rapidly improved. [After all, in 1996 I was organizing debates between the graduate students as to whether Type Ia SNe were standard candles or not.] Another substantial problem that is hard to quantify is that our knowledge of the physical processes in GRBs is not perfect (and certainly much worse than what we know for SNe). It is rational and prudent for everyone to worry that there are hidden problems (although I currently know of none). A simple historical example is how Cepheids were found to have two types with different calibrations.

    So the first part of my talk was simply presenting a new method for getting the expansion history of the Universe from redshifts up to 6.3. For this part, I am fairly confident that the method will work. Inevitably there will be improvements, new data, corrections, and all the usual changes (just as for the supernovae).

    The second part of my talk was to point out the first results, which I could not avoid giving. It so happens that the first results point against the Cosmological Constant. I agree with Sean that this second part should not be pushed, for various reasons. Foremost is that the result is only 2.5-sigma.

    Both parts of my results are being cast onto a background where various large groups are now competing for a new dedicated satellite.

  19. How powerful is this medium? I think the fact that the author of the study is here commenting directly in this thread is a prime example of how effective it can be.

    Elliot

  20. Hey Tim,

    Good to hear from you!

    Thanks for the reply. I’ll go hunt down Don’s comment.

    On your question about w’ being the parameter for varying DE: it’s the standard thing that people do, but that does not mean it’s the only thing, nor the right thing. In fact, once you parameterize this way (say, by Taylor expanding it), you are secretly restricting the possible class of w(z) in the total parameter space. For example, if you parameterize it as w = w_0 + w’z, fit your observations to it, and find that w’ has to be very small, then if you conclude that rapidly evolving w(z) is ruled out, you are making a mistake. This is because you have ruled out rapidly evolving w(z) by your choice of models.

    It’s legit, but one has to be careful about what it means.
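The pitfall described in this comment can be demonstrated numerically: fit a hypothetical, rapidly oscillating w(z) with the linear form w = w_0 + w’z, and the best-fit w’ comes out close to zero even though w(z) is anything but constant. A self-contained sketch (the oscillating w(z) and all names here are invented for illustration):

```python
import math

def fit_linear_w(zs, ws):
    """Ordinary least-squares fit of w(z) ~ w0 + wprime * z."""
    n = len(zs)
    zbar = sum(zs) / n
    wbar = sum(ws) / n
    num = sum((z - zbar) * (w - wbar) for z, w in zip(zs, ws))
    den = sum((z - zbar) ** 2 for z in zs)
    wprime = num / den
    w0 = wbar - wprime * zbar
    return w0, wprime

# A made-up, rapidly oscillating equation of state sampled over
# roughly the GRB redshift range (z = 0 to 6.3):
zs = [0.1 * i for i in range(64)]
ws = [-1.0 + 0.5 * math.sin(10.0 * z) for z in zs]

w0, wprime = fit_linear_w(zs, ws)
# The linear fit averages over the oscillation: the best-fit wprime
# is close to zero even though w(z) swings between -1.5 and -0.5,
# so "wprime is small" does not by itself rule out rapid evolution.
print(round(w0, 3), round(wprime, 3))
```

The point is exactly the one made above: a small fitted w’ only constrains models within the chosen parameterization, not arbitrary w(z).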

