Guest Blogger: Joe Polchinski on the String Debates

You may have read here and there about the genteel discussions concerning the status of string theory within contemporary theoretical physics. We’ve discussed it on CV here, here, and even way back here, and Clifford has hosted a multipart discussion at Asymptotia (I, II, III, IV, V, VI).

We are now very happy to host a guest post by the man who wrote the book, as it were, on string theory — Joe Polchinski of the Kavli Institute for Theoretical Physics at UC Santa Barbara. Joe was asked by American Scientist to review Peter Woit’s Not Even Wrong and Lee Smolin’s The Trouble With Physics. Here is a slightly modified version of the review, enhanced by footnotes that expand on some more technical points.

————————————————————————————

This is a review/response, written some time ago, that has just appeared in American Scientist. A few notes: 1) I did not choose the title, but at least insisted on the question mark so as to invoke Hinchliffe’s rule (if the title is a question, the answer is ‘no’). 2) Am. Sci. edited my review for style; I have reverted figures of speech that I did not care for. 3) I have added footnotes on some key points. I look forward to comments; unfortunately I will be incommunicado on Dec. 8 and 9.

All Strung Out?

Joe Polchinski

The Trouble with Physics: The Rise of String Theory, the Fall of a Science, and What Comes Next. Lee Smolin. xxiv + 392 pp. Houghton Mifflin, 2006. $26.

Not Even Wrong: The Failure of String Theory and the Search for Unity in Physical Law. Peter Woit. xxi + 291 pp. Basic Books, 2006. $26.95.

The 1970’s were an exhilarating time in particle physics. After decades of effort, theoretical physicists had come to understand the weak and strong nuclear forces and had combined them with the electromagnetic force in the so-called Standard Model. Fresh from this success, they turned to the problem of finding a unified theory, a single principle that would account for all three of these forces and the properties of the various subatomic particles. Some investigators even sought to unify gravity with the other three forces and to resolve the problems that arise when gravity is combined with quantum theory.

The Standard Model is a quantum field theory, in which particles behave as mathematical points, but a small group of theorists explored the possibility that under enough magnification, particles would prove to be oscillating loops or strands of “string.” Although this seemingly odd idea attracted little attention at first, by 1984 it had become apparent that this approach was able to solve some key problems that otherwise seemed insurmountable. Rather suddenly, the attention of many of those working on unification shifted to string theory, and there it has stayed since.

Today, after more than 20 years of concentrated effort, what has been accomplished? What has string theory predicted? Lee Smolin, in The Trouble With Physics, and Peter Woit, in Not Even Wrong, argue that string theory has largely failed. What is worse, they contend, too many theorists continue to focus their efforts on this idea, monopolizing valuable scientific resources that should be shifted in more promising directions.

Smolin presents the rise and fall of string theory as a morality play. He accurately captures the excitement that theorists felt at the discovery of this unexpected and powerful new idea. But this story, however grippingly told, is more a work of drama than of history. Even the turning point, the first crack in the facade, is based on a myth: Smolin claims that string theorists had predicted that the energy of the vacuum — something often called dark energy — could not be positive and that the surprising 1998 discovery of the accelerating expansion of the universe (which implies the existence of positive dark energy) caused a hasty retreat. There was, in fact, no such prediction [1]. Although his book is for the most part thoroughly referenced, Smolin cites no source on this point. He quotes Edward Witten, but Witten made his comments in a very different context — and three years after the discovery of accelerating expansion. Indeed, the quotation is doubly taken out of context, because at the same meeting at which Witten spoke, his former student Eva Silverstein gave a solution to the problem about which he was so pessimistic. (Contrary to another myth, young string theorists are not so intimidated by their elders.)

As Smolin charts the fall of string theory, he presents further misconceptions. For example, he asserts that a certain key idea of string theory — something called Maldacena duality, the conjectured equivalence between a string theory defined on one space and a quantum field theory defined on the boundary of that space — makes no precise mathematical statements. It certainly does. These statements have been verified by a variety of methods, including computer simulations [2]. He also asserts that the evidence supports only a weak form of this conjecture, without quantum mechanics. In fact, Juan Maldacena’s theory is fully quantum mechanical [3].

A crucial principle, according to Smolin, is background independence — roughly speaking, consistency with Einstein’s insight that the shape of spacetime is dynamical — and Smolin repeatedly criticizes string theory for not having this property. Here he is mistaking an aspect of the mathematical language being used for one of the physics being described. New physical theories are often discovered using a mathematical language that is not the most suitable for them. This mismatch is not surprising, because one is trying to describe something that is different from anything in previous experience. For example, Einstein originally formulated special relativity in language that now seems clumsy, and it was mathematician Hermann Minkowski’s introduction of four-vectors and spacetime that made further progress possible.

In string theory it has always been clear that the physics is background-independent even if the language being used is not, and the search for a more suitable language continues. Indeed (as Smolin belatedly notes), Maldacena duality provides a solution to this problem, one that is unexpected and powerful. The solution is still not complete: One must pin down spacetime on the edges, but in the middle it is free to twist and even tear as it will, and black holes can form and then decay. This need to constrain the edges is connected with a property known as the holographic principle, which appears to be an essential feature of quantum gravity. Extending this principle to spaces with the edges free will require a major new insight. It is possible that the solution to this problem already exists among the alternative approaches that Smolin favors. But his principal candidate (loop quantum gravity) is, as yet, much more background-dependent than the current form of string theory [4].

Much of Smolin’s criticism of string theory deals with its lack of mathematical rigor. But physics is not mathematics. Physicists work by calculation, physical reasoning, modeling and cross-checking more than by proof, and what they can understand is generally much greater than what can be rigorously demonstrated. For example, quantum field theory, which underlies the Standard Model and much else in physics, is notoriously difficult to put on a rigorous foundation. Indeed, much of the interest that mathematicians have in physics, and in string theory in particular, arises not from its rigor but from the opposite: Physicists by their methods can obtain new results whose mathematical underpinning is not obvious. String theorists have a strong sense that they are discovering something, not inventing it. The process is sometimes messy, with unexpected twists and turns (not least the strings themselves!), and rigor is not the main tool.

Woit covers some of the same ground, although his interests are more centered on particle physics and on the connection with mathematics than on the nature of spacetime. His telling is more direct, but it is rather stuffed with detail and jargon, and his criticisms of string theory are simpler and somewhat repetitious.

A major point for Woit is that no one knows exactly what string theory is, because it is specified only through an infinite mathematical series whose sum is ill-defined. This assertion is partly true: With new physical theories there is often a long period between the first insight and the final mathematical form. For quantum field theory, the state of affairs that Woit describes lasted for half a century [5]. In string theory the situation is much better than he suggests, because for 10 years we have had tools (dualities) that give us in many cases a precise definition of the theory. These have led in turn to many new applications of string theory, such as to the quantum mechanics of black holes, and there are hints of a more complete understanding.

But what about the lack of predictions? This is the key question, for Woit, for Smolin and for string theory. Why have the last 20 years been a time of unusually little contact between theory and experiment? The problem is partly on the experimental side: The Standard Model works too well. It takes great time, ingenuity and resources to try to look beyond it, and often what is found is still the Standard Model.

A second challenge was set forth by Max Planck more than a century ago. When one combines the fundamental constants of special relativity, general relativity and quantum mechanics, one finds that they determine a distance scale at which these theories appear to come together: the Planck length of 10^-33 centimeters. To put this number in perspective, one would have to magnify an atom a billion times to make it the size of a coffee cup, and one would have to magnify the Planck length a trillion trillion times to make it the size of an atom. If we could probe the Planck length directly, we would be able to see the strings and extra dimensions, or whatever else is lurking there, and be done with it. But we cannot do that, and so instead we must look for indirect evidence. And, as was the case with atomic theory, one cannot predict how long such a leap will take.
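The arithmetic behind Planck’s scale is short enough to check directly; the following sketch (using standard SI values for ħ, G, and c) reproduces the number quoted above:

```python
import math

# Standard SI values of the fundamental constants
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newton's constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# The unique length one can build from hbar, G, and c:
# l_P = sqrt(hbar * G / c^3)
l_planck = math.sqrt(hbar * G / c**3)

print(f"Planck length: {l_planck:.2e} m")  # ~1.6e-35 m, i.e. ~1.6e-33 cm
```

Magnifying this by a trillion trillion (a factor of 10^24) gives roughly 1.6e-11 m, within an order of magnitude of atomic size, consistent with the comparison in the text.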

Smolin addresses the problem of the Planck length (“It is a lie,” he says). Indeed, Planck’s calculation applies to a worst-case scenario. String theorists have identified at least half a dozen ways that new physics might arise at accessible scales [6], and Smolin points to another in the theories that he favors [7], but each of these is a long shot. As far as experiment yet shows, Planck’s challenge stands.

Or it may be that string theory has already made a connection with observation — one of immense significance. Positive dark energy is the greatest experimental discovery of the past 30 years regarding the basic laws of physics. Its existence came as a surprise to almost everyone in physics and astronomy, except for a small number, including, in particular, Steven Weinberg.

In the 1980s, Weinberg had been trying to solve the long-standing puzzle of why the density of dark energy is not actually much greater. He argued that if the underlying theory had multiple vacua describing an enormous number of potential universes, it would not only explain why the density of dark energy is not high, but would also predict that it is not zero. Weinberg’s reasoning was contrary to all conventional wisdom, but remarkably his prediction was borne out by observation a decade later.

The connection between string theory and dark energy is still a subject of much controversy, and it may be that Weinberg got the right answer for the wrong reason. However, it may well turn out that he got the right answer for the right reason. If so, it will be one of the great insights in the history of physics, and the multivacuum property of string theory, seemingly one of its main challenges, will, in fact, be just what nature requires.

A second unexpected connection comes from studies carried out using the Relativistic Heavy Ion Collider, a particle accelerator at Brookhaven National Laboratory. This machine smashes together nuclei at high energy to produce a hot, strongly interacting plasma. Physicists have found that some of the properties of this plasma are better modeled (via duality) as a tiny black hole in a space with extra dimensions than as the expected clump of elementary particles in the usual four dimensions of spacetime. The prediction here is again not a sharp one, as the string model works much better than expected. String-theory skeptics could take the point of view that it is just a mathematical spinoff. However, one of the repeated lessons of physics is unity — nature uses a small number of principles in diverse ways. And so the quantum gravity that is manifesting itself in dual form at Brookhaven is likely to be the same one that operates everywhere else in the universe.

A further development over the past few years, as our understanding has deepened, has been the extensive study of the experimental consequences of specific kinds of string theory. Many of these make distinctive predictions for particle physics and cosmology. Most or all of these may well be falsified by experiment (which is, after all, the fate of most new models). The conclusive test of string theory may still be far off, but in the meantime, science proceeds through many small steps.

A central question for both Smolin and Woit is why so many very good scientists continue to work on an idea that has allegedly failed so badly. Both books offer explanations in terms of the sociology of science and the psychology of scientists. These forces do exist, and it is worth reflecting on their possible negative effects, but such influences are not as strong as these authors posit. String theorists include mavericks and contrarians, strong-willed individuals who have made major contributions — not just in string theory but in other parts of physics as well. The borders between string theory and other areas of physics are not closed, and theorists would emigrate if they did not believe that this was the most promising direction in which to invest their time and energies.

In fact, the flow of intellectual talent has been in the other direction: In recent years, leading scientists in particle phenomenology, inflationary cosmology and other fields have found ideas generated by string theory to be useful in their disciplines, just as mathematicians have long done. Many have begun to work with string theorists and have in turn contributed their perspectives to the subject and expanded the view of how string theory relates to nature.

This convergence on an unproven idea is remarkable. Again, it is worth taking a step back and reflecting on whether the net result is the best way to move science forward, and in particular whether young scientists are sufficiently encouraged to think about the big questions of science in new ways. These are important issues — and not simple ones. However, much of what Smolin and Woit attribute to sociology is really a difference of scientific judgment.

In the end, these books fail to capture much of the spirit and logic of string theory. For that, Brian Greene’s The Elegant Universe (first published in 1999) or Leonard Susskind’s The Cosmic Landscape (2005) do a better job. The interested reader might also look to particle-phenomenologist Lisa Randall’s Warped Passages (2005) and cosmologist Alexander Vilenkin’s Many Worlds in One (2006) for accounts by two scientists from other fields who have seen a growing convergence between string theory and their ideas about how the cosmos is put together.

Joseph Polchinski is a professor of physics at the University of California, Santa Barbara, and a permanent member of the Kavli Institute for Theoretical Physics. He is the author of the two-volume text String Theory (Cambridge University Press, 1998).

————————————————————————————

[1] It is obvious that there could have been no such prediction. From 1995 to 1998, string theorists were discovering a host of new nonperturbative tools: dualities, branes, black hole entropy counting, matrix theory, and AdS/CFT duality. These were at the time studied almost exclusively in the context of supersymmetry. The problem of moduli stabilization, necessary for any nonsupersymmetric compactification (and positive energy density states are necessarily nonsupersymmetric), was left for the future; there were no general results or predictions. Page 154 refers to no-go theorems. There was a prominent no-go theorem two years later due to Maldacena and Núñez. However, not only the timing but also the physics is misstated. This paper makes several restrictive assumptions, and gives a long list of well-known papers, some as early as 1986, to which its results simply don’t apply. So this was never a broad constraint on string theory.

[2] On the string theory side, all calculations of anomalous dimensions and correlators represent precise statements about the strong coupling behavior of the gauge theory. However, it is argued on page 282 that the gauge theory is not known to exist. For the purpose of this discussion it is sharpest to focus on the gauge theories in 1+1 and 2+1 dimensions, which were shown by Itzhaki, Maldacena, Sonnenschein, and Yankielowicz to also give background-independent constructions of quantum gravity. These theories are superrenormalizable (their couplings go to zero as powers at short distance), so they are even better-defined than QCD, and one can calculate to arbitrary accuracy on the lattice. Even the supersymmetry is no problem: the lattice breaks it, but because of the superrenormalizability one can calculate explicitly the counterterms needed to restore the symmetry in the continuum limit, and so all the predictions of AdS/CFT can be checked algorithmically.
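The power counting behind “superrenormalizable” here is standard dimensional analysis: in d spacetime dimensions the Yang-Mills coupling carries positive mass dimension when d < 4,

```latex
[\,g_{\mathrm{YM}}^2\,] = (\text{mass})^{4-d},
\qquad
\frac{g_{\mathrm{YM}}^2}{E^{\,4-d}} \;\longrightarrow\; 0
\quad \text{as } E \to \infty \quad (d = 2, 3),
```

so the effective dimensionless coupling vanishes as a power of the energy at short distance, which is the sense in which these gauge theories are better-defined than QCD, whose coupling vanishes only logarithmically.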

This has already been done, not by Monte Carlo but by using discrete light-cone quantization, which has the nice property of preserving SUSY and also not paying an extra numerical penalty for large N. The present results of Hiller, Pinsky, Salwen, and Trittmann are notable. The error bars are still large (but again, the issue is whether there are predictions in principle, not what can be done with today’s technology), but it does appear that the gauge theory Hilbert space, truncated to 3 × 10^12 states, is in fact describing a graviton moving in a curved spacetime. Possibly less algorithmic, but numerically impressive, is the four-loop calculation of Bern, Czakon, Dixon, Kosower, and Smirnov: the Padé extrapolation to strong coupling agrees with the prediction of AdS/CFT to one or two percent.

[3] The gauge theory is a consistent and fully quantum mechanical theory, so if it contains classical gravity then it is by definition a solution to the problem of unifying Einstein’s theory with quantum mechanics. Moreover, the gravitational field must itself be quantized, because the duality relates gauge theory states to correctly quantized graviton states.

It is very difficult to define a ‘weak form’ of the duality which accounts for all the successful tests and is not actually the strong form. I am taking the definition here from page 144, which refers to classical supergravity as the lowest approximation, and talks about the duality being true only at this lowest order.

However, to get more background I have looked at the relevant papers by Arnsdorf and Smolin and by Smolin. The central arguments of these papers are wrong. One argument is that AdS/CFT duality cannot describe the bending of light by a gravitational field because there is a dual description with a fixed causal structure. If true, of course, this would invalidate the duality, but it is not. The gauge theory has a fixed causal structure, but signals do not move on null geodesics: there is refraction, so signals slow down and bend, and it is this that is dual to the bending of light by a gravitational field. Indeed, this duality between ordinary refraction and gravitational lensing is one of the fascinating maps between gravitation and nongravitational physics that are implied by the duality.

The second argument is that the tests of AdS/CFT duality are consistent with a weaker notion of ‘conformal induction,’ whereby a boundary theory can be defined from any field theory in AdS space by taking the limit as the correlators approach the boundary. This misses an important point. In general this procedure does not actually define a self-contained field theory on the boundary. Consider a signal in the bulk, which at time t is moving toward the boundary so as to reach it at a later time t’. According to the definition of conformal induction, the existence of this signal is not encoded in the boundary theory at time t, so that theory has no time evolution operator: the state at time t does not determine the state at time t’. In AdS/CFT the boundary is a true QFT, with a time evolution operator, and the signal is encoded even at time t. As a rough model of how this can work, imagine that every one-particle state in the bulk maps to a two-particle state in the boundary, where the separation of the particles plays the role of the radial coordinate: as they come close together the bulk particle moves to the boundary, and as they separate it moves away. Something like this happens even in real QCD, in the contexts of color transparency and BFKL diffusion.

[4] I am referring here to the problem of the constraints. Until these are solved, one does not really have background independence: there is an enormous Hilbert space, most of which is unphysical. In AdS/CFT, not only the bulk spacetime but also the bulk diffeomorphism group are emergent: the CFT fields are completely invariant under the bulk diffeomorphisms (this is also what happens in the much more common phenomenon of emergent gauge symmetry). In effect the constraints are already solved. One of the lessons of duality is that only the physical information is common to the different descriptions, while the extra gauge structure is not; it is an artifact of language, not physics. (The CFT has its own SU(N) gauge invariance, but here it is straightforward to write down invariant objects.)

[5] I am counting from the mid-1920s, when the commutation relations for the electromagnetic field were first written down, to the mid-1970s, when lattice gauge theory gave the first reasonably complete definition of a QFT, and when nonperturbative effects began to be understood systematically.

[6] The ones that came to mind were modifications of the gravitational force law on laboratory scales; strings, black holes, and extra dimensions at particle accelerators; cosmic superstrings; and trans-Planckian corrections to the CMB. One might also count more specific cosmic scenarios like DBI inflation, pre-Big-Bang cosmology, the ekpyrotic universe, and brane gas cosmologies.

[7] I have a question about violation of Lorentz invariance, perhaps this is the place to ask it. In the case of the four-Fermi theory of the weak interaction, one could have solved the UV problem in many ways by violating Lorentz invariance, but preservation of Lorentz invariance led almost uniquely to spontaneously broken Yang-Mills theory. Why weren’t Lorentz-breaking cutoffs tried? Because they would have spoiled the success of Lorentz invariance at low energies, through virtual effects. Now, the Standard Model has of order 25 renormalizable parameters, but it would have roughly as many more if Lorentz invariance were not imposed; most of the new LV parameters are known to be zero to high accuracy. So, if your UV theory of gravity violates Lorentz invariance, this should feed down into these low energy LV parameters through virtual effects. Does there exist a framework to calculate this effect? Has it been done?

114 Comments


  1. I’ve posted something about this at my blog, but it seems like it would be best for discussion of this to be hosted here, so I’ve turned off comments there, and what follows is the bulk of my posting.

    First of all I should say that I was quite pleased to see Polchinski’s review. While I disagree with much of it, it’s a serious and reasonable response to the two books, the kind of response I was hoping that they would get, opening the possibility of a fruitful discussion.

    Much of Polchinski’s review refers specifically to Smolin’s arguments; some of it deals with the endless debate over “background independence”, and the “emergent” nature of space-time in string theory vs. loop quantum gravity. I’ll leave that argument to others.

    Polchinski notes that I make an important point out of the lack of a non-perturbative formulation of string theory and criticizes this, referring to the existence of non-perturbative definitions based on dualities in certain special backgrounds. The most well-known example of this is AdS/CFT, where it appears that one can simply define string theory in terms of the dual QFT. This gives a string theory with the wrong number of large space-time dimensions (5), and with all sorts of unphysical properties (e.g. exact supersymmetry). If it really works, you’ve got a precisely well-defined string theory, but one that has a low-energy limit completely different than the standard model in 4d that we want. This kind of string theory is well worth investigating since it may be a useful tool in better understanding QCD, but it just does not and cannot give the standard model. The claim of my book is not that string theories are not interesting or sometimes useful, just that they have failed in the main use for which they are being sold, as a unified theory of particle physics and gravity.

    The lack of any progress towards this goal of a unified theory over the past 32 years (counting from the first proposal to use strings to do unification back in 1974) has led string theorists to come up with various dubious historical analogies to justify claiming that 32 years is not an unusual amount of time to investigate a theory and see if it is going to work. In this case Polchinski argues that it took about 50 years to get from the first formulation of QED to a potentially rigorous non-perturbative version of the theory (using lattice gauge theory). The problem with this analogy is of course that in QED non-perturbative effects are pretty much irrelevant, with perturbation theory describing precisely the physics you want to describe and can measure, whereas with string theory the perturbative theory doesn’t connect to the real world. When QED was first written down as a perturbative theory, the first-order terms agreed precisely with experimental results, and if anything like this were true of string theory, we wouldn’t be having this discussion. For the one theory where non-perturbative effects are important, QCD, the time lag between when people figured out what the right theory was, and when its non-perturbative formulation was written down, was just a few months (Wilson was lecturing on lattice gauge theory in the summer of 1973, having taken up the problem earlier in the year after the discovery of asymptotic freedom).

    Polchinski agrees that the key problem for string theory is its inability to come up with predictions about physics at observable energies. He attributes this simply to the fact that the Planck energy is so large, but I think this is misleading. The source of the problem is not really difficulties in extrapolating from the Planck scale down to low energy, but in not even knowing what the theory at the Planck scale is supposed to be (back to that problem about non-perturbative string theory…).

    Weinberg’s anthropic argument for the size of the cosmological constant is described by Polchinski as a possible “prediction” of string theory, and he recommends Susskind’s book as a good description of the latest views of string theorists. I’ve been far too rude to Polchinski in the past in expressing my views about this “anthropic landscape” philosophy, so I won’t go on about it here. He neglects to mention in his review that many of his most prominent colleagues in the string theory community are probably closer in their views on this subject to mine and Smolin’s than to his, and that our books are the only ones I know of that explain the extremely serious problems with the landscape philosophy.

    Recently string theorists have taken to pointing to attempts to use AdS/CFT to say something about heavy-ion physics as a major success of string theory, and Polchinski also does this. I’m no expert on this subject, but those who are, like Larry McLerran, have recently been extremely critical in public of claims like the one here that “Physicists have found that some of the properties of this plasma are better modeled (via duality) as a tiny black hole in a space with extra dimensions than as the predicted clump of elementary particles in the usual four dimensions of spacetime.” My impression is that many experts in this subject would take strong exception to the “better” in Polchinski’s claim.

    Finally, about the “sociological” issues, Polchinski disagrees about their importance, believing they are less important than scientific judgments, but I’m pleased to see that he does to some extent acknowledge that there’s a serious question being raised that deserves discussion in the theoretical physics community: “This convergence on an unproven idea is remarkable. Again, it is worth taking a step back and reflecting on whether the net result is the best way to move science forward, and in particular whether young scientists are sufficiently encouraged to think about the big questions of science in new ways. These are important issues — and not simple ones.”

    Again, my thanks to him for his serious and highly reasonable response to the two books.

  2. Alejandro Rivero

    I am not so worried about the issue of having the right or the wrong dimension; what I *really* would like to see is, say, a prediction of 6 flavours in QCD with one of them so massive that it cannot bind into mesons. That should be realistic enough for me even if it were in any random number of dimensions.

  3. George,

    I don’t want to claim to be a “string guru”, but about one of your questions:

    1. In a supersymmetric theory, basically the Hamiltonian operator (which gives the energy), is the square of operators that generate the supersymmetry. If a state is supersymmetric, the supersymmetry generators applied to it will give zero, and thus so will their square, the Hamiltonian.
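    Schematically (suppressing spinor indices), the standard algebra being described is

```latex
\{Q, Q^\dagger\} = 2H
\quad\Longrightarrow\quad
\langle\psi| H |\psi\rangle
= \tfrac{1}{2}\left( \lVert Q|\psi\rangle\rVert^2 + \lVert Q^\dagger|\psi\rangle\rVert^2 \right) \ge 0,
```

    so every state has non-negative energy, and a state annihilated by the supercharges (Q|ψ⟩ = Q†|ψ⟩ = 0) has exactly zero energy, i.e. zero vacuum energy.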

  4. Peter’s answer to George is only the first half of the answer. N.B. that besides zero cosmological constant, a negative one (as in anti-de Sitter space) is also compatible with supersymmetry. The no-go can be traced back to the fact that there is no globally timelike future-directed Killing vector field in de Sitter space (the one that is timelike and future-directed over here is not on ‘the other side of dS’), and the supercharges would have to commute into the Hamiltonian, which generates a flow along this field.

  5. IIRC the way Lorentz violation is supposed to show up in loopy physics is that the dispersion relation is violated and the speed of light depends on energy (showing up in the early or late arrival of ultra-high-energy gamma-ray burst photons compared to ones of lower energy). The idea is that even if the relative effect is quite small, the absolute size could be measurable, as these photons have traveled across half the universe. Does anybody have an understanding of how this effect arises? Furthermore, the point seems to be (I think I read this in some abstract) that the loop prediction for this effect involves a smaller power of (E/M_pl) than string theory’s, making the loop prediction observable with the next generation of instruments while the stringy version is many orders of magnitude smaller.

    Which calculation does this refer to? What do I have to compute to get this energy-dependent speed of light?
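
    A minimal sketch of the kind of effect being described, assuming (and this is the model-dependent part) a correction linear in E/M_Pl with an order-one coefficient ξ:

```latex
% Modified dispersion relation with a Planck-suppressed, linear correction
% (xi is an assumed order-one, model-dependent coefficient):
E^2 \simeq p^2 c^2 \left( 1 - \xi\, \frac{E}{M_{\rm Pl} c^2} \right)
% The group velocity then acquires an energy dependence:
v(E) = \frac{\partial E}{\partial p} \approx c \left( 1 - \xi\, \frac{E}{M_{\rm Pl} c^2} \right)
% Arrival-time difference for photons of energies E_1 < E_2 traveling
% a distance D (ignoring cosmological expansion for simplicity):
\Delta t \approx \xi\, \frac{E_2 - E_1}{M_{\rm Pl} c^2}\, \frac{D}{c}
```

    For a ~10 GeV photon from a burst a few billion light-years away, this gives Δt of order a tenth of a second, which is why gamma-ray-burst timing could probe a linear correction but not a quadratic one.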

  6. Peter, is this related to what you say in the book about supersymmetry being a sort of square root of translation? How does this argument hold given that supersymmetry must be broken?

    Robert, forgive my laggardly brain, but might you be able to unpack your explanation? All I got is that the issue is related to the causal structure of spacetime.

    George

  7. Joe,

    Thanks for taking the time to write such a thoughtful review.

    Peter,

    Have a look at nucl-th/0604032 and you will find that Larry McLerran
    (or at least his collaborators) describes the calculation of the viscosity
    to entropy ratio of the Quark Gluon Plasma via AdS/CFT as
    “An amazing theoretical discovery…”

  8. I’ve promised myself I would take a class on String Theory. I have Polchinski’s books, have worked through a few problems from the first few chapters, and find it interesting. Thanks, Headrick, for that very helpful solutions manual!!!

    But I am still unsure what to think about string theory, since it has been around so long with nothing concrete. However, I want to take a course so I can see all the details for myself before I judge it. Posts like this keep my spirits up. Thanks for sharing it. 🙂

  9. #12.

    Arun, the classical limit on the gauge-theory side is the limit in which N_c (the number of colors) goes to infinity. This is “classical” in a sense (quantum loops are suppressed), but it’s not the same limit one would ordinarily think of as classical Yang-Mills theory. A quantum theory can have different classical limits.

    Is holography preserved in some classical limit?
    Is the bulk theory (though in 5 dimensions) anything like our classical world in this limit?
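
    For reference, the counting behind the statement that quantum loops are suppressed can be sketched as follows (this is ’t Hooft’s large-N argument, schematic and with normalizations suppressed):

```latex
% 't Hooft limit: N_c -> infinity with the 't Hooft coupling held fixed
\lambda \equiv g^2 N_c = \text{fixed}, \qquad N_c \to \infty
% A connected vacuum diagram of genus h contributes
\mathcal{A}_h \sim N_c^{\,2-2h}\, f_h(\lambda)
% so non-planar (higher-genus) contributions -- string loops on the
% dual side -- are suppressed by powers of 1/N_c^2, while lambda can
% stay large: classical on the gravity side, yet strongly coupled
% as a gauge theory.
```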

  10. Does anybody know some good research papers on this mentioned above:

    “A second unexpected connection comes from studies carried out using the Relativistic Heavy Ion Collider, a particle accelerator at Brookhaven National Laboratory. This machine smashes together nuclei at high energy to produce a hot, strongly interacting plasma. Physicists have found that some of the properties of this plasma are better modeled (via duality) as a tiny black hole in a space with extra dimensions than as the expected clump of elementary particles in the usual four dimensions of spacetime.”

    I would love to read the details of this. Thanks.

  11. Robert, afaicr, there is some hope of seeing stringy-inspired quantum-gravity signatures with the Planck satellite. The debate I seem to recall was whether the effect scaled like Mpl^-2 or Mpl^-4. The former, depending on the scale, could in principle be seen; the latter, otoh, would never be seen.

    If LQG predicts an Mpl^-1 scaling, that would be great in the sense that we would have falsifiable material.

  12. Joseph– have a look at this post at Backreaction, which has some references.

    George [24]– some attempts at brief answers.

    1. Keep in mind that almost every state is non-supersymmetric. Being supersymmetric is a very special property, just like being rotationally invariant. There’s a hand-wavy argument that works in flat spacetime: namely, that contributions to the energy from bosons and fermions exactly cancel, and the (vacuum) energy is zero. That’s not quite right in the presence of gravity, as you can also have negative-energy supersymmetric states.

    2. I think nobody knows what happens at the Planck scale. But it would be surprising if it were something simple like a lattice.

    3. That depends on how other things are allowed to vary; it’s certainly a sticky situation. On the one hand, the subspace of parameters in which we could live is certainly a small one. On the other, we don’t really know how big it is, nor what the measure on the space should be. I’m on the side of people who think we have no reason to believe that Weinberg’s assumptions describe the real world, and we need to understand much more before we can claim to have “understood” the value of the cosmological constant, even from anthropic arguments.

  13. There is no known phenomenon that could take strong Lorentz violations at high energies (i.e. a Lorentz-violating cutoff) and weaken them at low energies enough to be compatible with experimental bounds.

    The Feynman checkerboard model of the Dirac propagator in 1+1 dimensions does this, in a sense. There are various generalizations to 3+1 dimensions. Look for “Feynman checkerboard” on Google for more.

  14. There is no doubt that String Theory is an experimentally verified theory. In String Theory, if it can be reproduced as special effects it must be true. As you all know, most of the predictions of String Theory, such as time travel and extraordinary dimensions (which in string theory simply means the depth of analytical epicycles), are proved by the most famous Doctor of Philosophy, Doctor Greene, the string theory evangelist, in his movies. By experimentally duplicating the predictions made by String Theorists in his movies with state-of-the-art special effects, the most famous Doctor Greene experimentally proved the predictions of string theory. (Note also Polchinski’s reference to computer graphics as proof of stringy scenarios.) It is a total ignorance of the elegance of the string to state that string theory makes no experimental predictions.

  15. Sean, in flat space, does the cancellation still occur if supersymmetry is broken? Or to flip the question around, given the degree of supersymmetry-breaking we know must have taken place (or else we’d have seen the sparticles already), does the observed value of lambda make sense?

    Does the flat-space argument beg the question? I.e. if we’re already talking about flat space, doesn’t lambda have to be identically zero, or else space wouldn’t be flat?

    George

  16. George– by “flat space” I meant “with gravity turned off,” sorry for being unclear. The cancellation does not occur once supersymmetry is broken; breaking susy introduces a new, unambiguously positive contribution to the vacuum energy, roughly the susy-breaking scale to the 4th power. Which is wrong by at least 60 orders of magnitude, given that susy is broken at a TeV or above. If it were really true that unbroken supersymmetry implied zero vacuum energy, that would be a flat-out disaster. But we can imagine starting with a supersymmetric state with a large negative vacuum energy, and then breaking susy to contribute a large positive vacuum energy, so that they just about (but not quite) cancel. In the string landscape picture, that is purportedly the kind of state we find ourselves in today.
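
    The 60-orders-of-magnitude mismatch can be checked with rounded back-of-the-envelope numbers (the scales below are approximate):

```latex
% SUSY-breaking contribution, taking M_SUSY ~ 1 TeV = 10^{12} eV:
\rho_{\rm SUSY} \sim M_{\rm SUSY}^4 \sim (10^{12}\ \mathrm{eV})^4 = 10^{48}\ \mathrm{eV}^4
% Observed dark-energy density, corresponding to a scale ~ 10^{-3} eV:
\rho_{\Lambda} \sim (10^{-3}\ \mathrm{eV})^4 = 10^{-12}\ \mathrm{eV}^4
% Mismatch:
\rho_{\rm SUSY} / \rho_{\Lambda} \sim 10^{60}
```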

  17. Thanks to everyone for your comments; most questions seem already to have been ably answered. Just a few remarks:

    Brett #20,22: Thanks for the reference, this is certainly what I would expect. I understand that there is the hope for a `deformed algebra’ rather than a simple violation, but to an outsider it seems that what is being done in LQG is to return to pre-covariant methods of QFT, cut things off in that form, and hope for the best. It would be good to see some calculations.

    George #24: 1) As several people have noted, the supersymmetry algebra is H = sum_i Q_i^2, so the energy is nonnegative and vanishes precisely for supersymmetric states, for which all the Q_i annihilate the vacuum. In supergravity there is an additional term -|W|^2 on the RHS, so depending on the value of W, supersymmetric states have zero or negative energy but not positive. Robert #29 gives an alternate explanation. In our world, H must be near zero as a result of a near-perfect cancellation between +Q^2 and -|W|^2, because SUSY is badly broken.
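
    Spelled out slightly (schematically, with indices and normalizations suppressed), the algebra being invoked is:

```latex
% Global SUSY: the Hamiltonian is a sum of squares of Hermitian supercharges,
H = \sum_i Q_i^2 \;\Rightarrow\; \langle\psi|H|\psi\rangle
  = \sum_i \big\| Q_i|\psi\rangle \big\|^2 \ge 0,
% with equality exactly when Q_i |psi> = 0 for all i
% (i.e. for a supersymmetric state).
% Supergravity adds a superpotential term on the right-hand side,
H = \sum_i Q_i^2 - |W|^2,
% so supersymmetric states have E = -|W|^2 <= 0: zero (Minkowski) or
% negative (anti-de Sitter), but never positive (de Sitter).
```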

    2) An example that many people have pointed to is quantum mechanics, which cuts off the classical phase space at a scale hbar, but not by introducing any sort of rigid lattice.

    3) Weinberg was clever, in that you can vary lambda alone without changing anything in the early universe, because Lambda has no effect until recently. Thus he could formulate a well-posed question. When you vary anything else, like the density perturbations, then you also vary things like the amount of inflation and so you need to know the probability measure. There are many people with ideas about this, notably Linde, Vilenkin, Aguirre, Bousso, Easther et al. Different measures give different results. Indeed, even Weinberg’s original assumptions may be wrong. I expect that there is a meaningful probability measure (we already assume such for the inflationary perturbations) and that we need to figure it out: maybe there is some dual form in which it becomes an ordinary QM measure. Anyway, we cannot declare final victory over the c.c. until we have a framework for answering this question.

    Peter #26,35 Joseph #36: I was basing my comments largely on Rajagopal’s talk which seemed quite sober. I also note the very recent comments in Sabine Hossenfelder’s blog, where Bill Zajc, spokesperson for the PHENIX detector, has a great deal to say about the impact of stringy ideas.

    What is the moral? Strong-coupling field theory is hard. Nonequilibrium field theory is hard. Nonequilibrium strong-coupling field theory is hard^2, and yet here we have one we can solve exactly. It’s not the one we want, but it is not so completely different either, since supersymmetry and conformal invariance are broken at finite temperature. So it should be useful at least as a model, and possibly as a quantitative guide.

  18. Joe,

    “I was basing my comments largely on Rajagopal’s talk which seemed quite sober.”

    Is that enough to cite an unrefereed talk as evidence, without checking it?

  19. Sean :

    That depends on how other things are allowed to vary; it’s certainly a sticky situation. On the one hand, the subspace of parameters in which we could live is certainly a small one…..

    I am not even sure that I agree that the subspace of parameters in which we could live is small. Honestly, I don’t know if it is infinitely small or infinitely big. The problem is, we don’t know what the allowed ranges of values of these parameters are, given some theory (as opposed to what range we can live in), so how do we measure “smallness” or “bigness”?

    More generally, about Weinberg’s lambda prediction:

    I don’t think anthropic arguments à la Weinberg need to be invoked when deciding which theory/framework/potential/racehorse is “better” (in fact, I think the whole obsession with Weinberg’s anthropic “prediction” is too narrow a viewpoint). Each of these must make predictions for the probability distributions of fundamental parameters, and I am sure we can use observations and statistics to decide which is “better”.

    I am not saying anthropic arguments are bad. However, I am saying that anthropic arguments are not needed to make sense of probability distributions. We can happily make observations and rule out models/theories based on confidence levels.

  20. Alejandro Rivero

    Small rant about sociology: a thing that irritates me is the concept of a “free market of ideas”, when it happens that theoretical physics research (and HEP of course; but well, even education in general) is almost the quintessential example of a subsidized sector.
    People take “competition = free market” as a definition. Hey, but every individual player in a bureaucratic economy is also involved in a competition. Even the empire-wide examinations of the old Chinese system were a kind of competition.
    (I was going to claim that a free market of ideas could be possible under Kropotkinian conditions, but even the almost-Kropotkinian economic system pictured by Le Guin in “The Dispossessed” showed inflexibilities.)

  21. Very civilized and helpful response, Prof. Polchinski.

    Have to differ on your sentence `Positive dark energy is the greatest experimental discovery of the past 30 years regarding the basic laws of physics.’

    I’d say the observation of the W and Z in 1980 or so is greater; how easy it is to forget the confusion of that time, with atomic-parity-violation and weak-magnetism confusions casting doubt on electroweak unification. Non-zero neutrino mass differences are up there too, as they really are not a Standard Model effect.

    Also, dark energy is an observational discovery… no experiment done there. A subtle point, perhaps…

    But those are small potatoes. Even if all the stringy ideas are totally irrelevant to the LHC as I believe to be most probable, string theory is a useful extrapolation of earlier ideas.

  22. WeemaWhopper,

    By ‘positive dark energy’ Prof. Polchinski presumably means the secure experimental evidence from supernova redshifts, which show no slowing down of the expansion. But there are two possible explanations: (1) the gravitational retardation of distant galaxies etc. is being offset by acceleration due to dark energy, and (2) there is simply no gravitational slowing-down mechanism.

    Explanation (1) is mainstream (the lambda-CDM general relativity cosmology), but explanation (2) is championed by Nobel Laureate Philip Anderson, who wrote: ‘the flat universe is just not decelerating, it isn’t really accelerating’ – Philip Anderson, http://blogs.discovermagazine.com/cosmicvariance/2006/01/03/danger-phil-anderson/#comment-10901

    Explanation (2) suggests that Standard Model-type (Yang-Mills) quantum field theory is the theory of gravity, because you’d expect a weakening of gravitational attraction in any situation where the gravity charges (masses) are rapidly receding from one another, due to the “graviton” redshift. I.e., where the visible light from a galaxy is seriously redshifted by the recession of the galaxy, the gravitons being exchanged with it will also be severely redshifted (weakening the gravity coupling between the two charges), a mechanism totally omitted in general relativity. This was predicted ahead of Perlmutter’s observations, unlike explanation (1), which relies on the ad hoc invention of dark energy.

    nc, there is a subtle difference between an experiment and an observation. Experiments allow repeatability and some control over conditions. Unfortunately for the big bang, we’re stuck merely observing the one we’ve got, which of course is not Perlmutter’s or Polchinski’s or anyone else’s fault. Experiment and observation are both empirical, and both bring crucial evidence to the table. But I’d not call the evidence for dark energy experimental in origin… it is observational in origin.
