Blog

  • The Notorious Delayed-Choice Quantum Eraser

    Note: It is in the nature of book-writing that sometimes you write things that don’t end up appearing in the final book. I had a few such examples for Something Deeply Hidden, my book on quantum mechanics, Many-Worlds, and emergent spacetime. Most were small and nobody will really miss them, but I did feel bad about eliminating my discussion of the “delayed-choice quantum eraser,” an experiment that has caused no end of confusion. So here it is, presented in full. It’s a bit too technical for the book; I don’t know what I was thinking!

    Let’s imagine you’re an undergraduate physics student, taking an experimental lab course, and your professor is in a particularly ornery mood. So she forces you to do a weird version of the double-slit experiment, explaining that this is something called the “delayed-choice quantum eraser.” You think you remember seeing a YouTube video about this once.

    In the conventional double-slit, we send a beam of electrons through two slits and on toward a detecting screen. Each individual electron hits the screen and leaves a dot, but if we build up many such detections, we see an interference pattern of light and dark bands, because the wave function passing through the two slits interferes with itself. But if we also measure which slit each electron goes through, the interference pattern disappears, and we see a smoothed-out distribution at the screen. According to textbook quantum mechanics that’s because the wave function collapsed when we measured it at the slits; according to Many-Worlds it’s because the electron became entangled with the measurement apparatus, decoherence occurred as the apparatus became entangled with the environment, and the wave function branched into separate worlds, in each of which the electron only passes through one of the slits.

    An interference pattern is seen when electrons travel through two slits (left),
    unless a detector measures which slit each electron goes through (right).

    The new wrinkle is that we are still going to “measure” which slit the electron goes through, but instead of reading it out on a big macroscopic dial, we simply store that information in a single qubit. Say that for every “traveling” electron passing through the slits, we have a separate “recording” electron. The pair becomes entangled in the following way: if the traveling electron goes through the left slit, the recording electron is in a spin-up state (with respect to the vertical axis), and if the traveling electron goes through the right, the recording electron is spin-down. We end up with:

    Ψ = (L)[↑] + (R)[↓].

    Our professor, who is clearly in a bad mood, insists that we don’t actually measure the spin of our recording electrons, and we don’t even let them wander off and bump into other things in the room. We carefully trap them and preserve them, perhaps in a magnetic field.

    What do we see at the screen when we do this with many electrons? A smoothed-out distribution with no interference pattern, of course. Interference can only happen when two things contribute to exactly the same wave function, and since the two paths for the traveling electrons are now entangled with the recording electrons, the left and right paths are distinguishable, so we don’t see any interference pattern. In this case it doesn’t matter that we didn’t have honest decoherence; it just matters that the traveling electrons were entangled with the recording electrons. Entanglement of any sort kills interference.
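    This claim is easy to check numerically. Below is a minimal sketch (using numpy, and with states normalized, unlike the simplified notation in the text): build the entangled state Ψ, then trace out the recording spin. The off-diagonal “coherence” terms of the traveling electron’s reduced density matrix, which are what produce interference, come out exactly zero.

```python
import numpy as np

L = np.array([1.0, 0.0]); R = np.array([0.0, 1.0])      # slit basis
up = np.array([1.0, 0.0]); down = np.array([0.0, 1.0])  # recording-spin basis

# The entangled state Psi = (L)[up] + (R)[down], normalized:
psi = (np.kron(L, up) + np.kron(R, down)) / np.sqrt(2)

# Reduced density matrix of the traveling electron: trace out the spin.
rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
rho_travel = np.trace(rho, axis1=1, axis2=3)

# The off-diagonal (L-R coherence) entries are zero: no interference.
print(rho_travel)
```

    The result is just the diagonal matrix diag(1/2, 1/2): equal probability for each slit, and no cross term to interfere.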

    Of course, we could measure the recording spin if we wanted to. If we measure it along the vertical axis, we will see either [↑] or [↓]. Referring back to the quantum state Ψ above, we see that this will put us in either a universe where the traveling electron went through the left slit, or one where it went through the right slit. At the end of the day, recording the positions of many such electrons when they hit the detection screen, we won’t see any interference.

    Okay, says our somewhat sadistic professor, rubbing her hands together with villainous glee. Now let’s measure all of our recording spins, but this time measure them along the horizontal axis instead of the vertical one. As we saw in Chapter Four, there’s a relationship between the horizontal and vertical spin states; we can write

    [↑] = [→] + [←] ,

    [↓] = [→] – [←].

    (To keep our notation simple we’re ignoring various factors of the square root of two.) So the state before we do such a measurement is

    Ψ = (L)[→] + (L)[←] + (R)[→] – (R)[←]

    = (L + R)[→] + (L – R)[←].
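    If you want to verify the algebra (with the factors of the square root of two restored), a few lines of numpy confirm that the two ways of writing Ψ are the same vector:

```python
import numpy as np

L = np.array([1.0, 0.0]); R = np.array([0.0, 1.0])      # slit basis
up = np.array([1.0, 0.0]); down = np.array([0.0, 1.0])  # vertical spin basis

# Horizontal spin states, with the square-root-of-two factors restored:
right = (up + down) / np.sqrt(2)
left = (up - down) / np.sqrt(2)

psi_vertical = (np.kron(L, up) + np.kron(R, down)) / np.sqrt(2)
psi_horizontal = (np.kron(L + R, right) + np.kron(L - R, left)) / 2

print(np.allclose(psi_vertical, psi_horizontal))  # True
```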

    When we measured the recording spin in the vertical direction, the result we obtained was entangled with a definite path for the traveling electron: [↑] was entangled with (L), and [↓] was entangled with (R). So by performing that measurement, we knew that the electron had traveled through one slit or the other. But now when we measure the recording spin along the horizontal axis, that’s no longer true. After we do each measurement, we are again in a branch of the wave function where the traveling electron passes through both slits. If we measured spin-left, the traveling electron passing through the right slit picks up a minus sign in its contribution to the wave function, but that’s just math.

    By choosing to do our measurement in this way, we have erased the information about which slit the electron went through. This is therefore known as a “quantum eraser experiment.” This erasure doesn’t affect the overall distribution of flashes on the detector screen. It remains smooth and interference-free.

    But we have not only the overall distribution of electrons hitting the detector screen; for each impact we know whether the recording electron was measured as spin-left or spin-right. So, instructs our professor with a flourish, let’s go to our computers and separate the flashes on the detector screen into these two groups — those that are associated with spin-left recording electrons, and those that are associated with spin-right. What do we see now?

    Interestingly, the interference pattern reappears. The traveling electrons associated with spin-left recording electrons form an interference pattern, as do the ones associated with spin-right. (Remember that we don’t see the pattern all at once, it appears gradually as we detect many individual flashes.) But the two interference patterns are slightly shifted from each other, so that the peaks in one match up with the valleys in the other. There was secretly interference hidden in what initially looked like a featureless smudge.

    Adapted from Wikipedia

    In retrospect this isn’t that surprising. Looking at how our quantum state Ψ is written in terms of the spin-left and spin-right recording electrons, each measurement outcome is entangled with a traveling electron going through both slits, so of course it could interfere. And that innocent-seeming minus sign shifted one of the patterns just a bit, so that when combined together the two patterns could add up to a smooth distribution.
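    The cancellation is easy to see in a toy model. Suppose the amplitudes for reaching screen position x via the two slits differ only by a phase φ(x); the linear form of φ below is an assumption purely for illustration. Conditioning on spin-right picks out the (L + R) pattern, conditioning on spin-left picks out the (L – R) pattern, and the two fringe patterns are offset so they sum to a flat distribution:

```python
import numpy as np

# Toy model: amplitudes via the two slits differ by a phase phi(x);
# a linear phi is assumed purely for illustration.
x = np.linspace(-1.0, 1.0, 201)
phi = 6 * np.pi * x
amp_L = np.exp(1j * phi / 2)    # path through the left slit
amp_R = np.exp(-1j * phi / 2)   # path through the right slit

p_right = np.abs(amp_L + amp_R)**2 / 4  # spin-right branch: cos^2 fringes
p_left = np.abs(amp_L - amp_R)**2 / 4   # spin-left branch: shifted sin^2 fringes

# The peaks of one pattern fill in the valleys of the other:
print(np.allclose(p_right + p_left, 1.0))  # True
```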

    Your professor seems more amazed by this than you are. “Don’t you see?” she exclaims excitedly. “If we didn’t measure the recording electrons at all, or if we measured them along the vertical axis, there was no interference anywhere. But if we measured them along the horizontal axis, there secretly was interference, which we could discover by separating out what happens at the screen when the recording spin was left or right.”

    You and your classmates nod your heads, cautiously and with some degree of confusion.

    “Think about what that means! The choice about whether to measure our recording spins vertically or horizontally could have been made long after the traveling electrons splashed on the detection screen. As long as we stored our recording spins carefully and protected them from becoming entangled with the environment, we could have delayed that choice until years later.”

    Sure, the class mumbles to themselves. That sounds right.

    “But interference only happens when the traveling electron goes through both slits, and the smooth distribution happens when it goes through only one slit. That decision — go through both slits, or just through one — happens long before we measure the recording electrons! So obviously, our choice to measure them horizontally rather than vertically had to send a signal backward in time to tell the traveling electrons to go through both slits rather than just one!”

    After a short, befuddled pause, the class erupts with objections. Decisions? Backwards in time? What are we talking about? The electron doesn’t make a choice to travel through one slit or the other. Its wave function (and that of whatever it’s entangled with) evolves according to the Schrödinger equation, just like always. The electron doesn’t make choices; it unambiguously goes through both slits, but it becomes entangled along the way. By measuring the recording electrons along different directions, we can pick out different parts of that entangled wave function, some of which exhibit interference and some of which do not. Nothing really went backwards in time. It’s kind of a cool result, but it’s not like we’re building a frickin’ time machine here.

    You and your classmates are right. Your instructor has gotten a little carried away. There’s a temptation, reinforced by the Copenhagen interpretation, to think of an electron as something “with both wave-like and particle-like properties.” If we give in to that temptation, it’s a short journey to thinking that the electron must behave in either a wave-like way or a particle-like way when it passes through the slits, and in any given experiment it will be one or the other. And from there, the delayed-choice experiment does indeed tend to suggest that information had to go backwards in time to help the electron make its decision. And, to be honest, there is a tradition in popular treatments of quantum mechanics of making things seem as mysterious as possible. Suggesting that time travel might be involved somehow is just throwing gasoline on the fire.

    All of these temptations should be resisted. The electron is simply part of the wave function of the universe. It doesn’t make choices about whether to be wave-like or particle-like. But a number of serious researchers in quantum foundations really do take the delayed-choice quantum eraser and analogous experiments (which have been successfully performed, by the way) as evidence of retrocausality in nature — signals traveling backwards in time to influence the past. A form of this experiment was originally proposed by none other than John Wheeler, who envisioned a set of telescopes placed on the opposite side of the screen from the slits, which could detect which slit the electrons went through long after they had passed through. Unlike some later commentators, Wheeler didn’t go so far as to suggest retrocausality, and knew better than to insist that an electron is either a particle or a wave at all times.

    There’s no need to invoke retrocausality to explain the delayed-choice experiment. To an Everettian, the result makes perfect sense without anything traveling backwards in time. The trickiness lies in the fact that by becoming entangled with a single recording spin rather than with the environment and its zillions of particles, the traveling electrons only became kind-of decohered. With just a single particle to worry about observing, we are allowed to contemplate measuring it in different ways. If, as in the conventional double-slit setup, we measured the slit through which the traveling electron went via a macroscopic pointing device, we would have had no choice about what was being observed. True decoherence takes a tiny quantum entanglement and amplifies it, effectively irreversibly, into the environment. In that sense the delayed-choice quantum eraser is a useful thought experiment for contemplating the role of decoherence and the environment in measurement.

    But alas, not everyone is an Everettian. In some other versions of quantum mechanics, wave functions really do collapse, not just the apparent collapse that decoherence provides us with in Many-Worlds. In a true collapse theory like GRW, the process of wave-function collapse is asymmetric in time; wave functions collapse, but they don’t un-collapse. If you have collapsing wave functions, but for some reason also want to maintain an overall time-symmetry to the fundamental laws of physics, you can convince yourself that retrocausality needs to be part of the story.

    Or you can accept the smooth evolution of the wave function, with branching rather than collapses, and maintain time-symmetry of the underlying equations without requiring backwards-propagating signals or electrons that can’t make up their mind.

  • Spacetime and Geometry: Now at Cambridge University Press

    Hard to believe it’s been 15 years since the publication of Spacetime and Geometry: An Introduction to General Relativity, my graduate-level textbook on everyone’s favorite theory of gravitation. The book has become quite popular, being used as a text in courses around the world. There are a lot of great GR books out there, but I felt another one was needed that focused solely on the goal of teaching students general relativity. That might seem like an obvious goal, but many books also try to serve as reference books, or to put forward a particular idiosyncratic take on the subject. All I want to do is to teach you GR.

    And now I’m pleased to announce that the book is changing publishers, from Pearson to Cambridge University Press, complete with a new cover, shown above.

    I must rush to note that it’s exactly the same book, just with a different publisher. Pearson was always good to me, and I have no complaints there, but they are moving away from graduate physics texts, so it made sense to try to find S&G a safe permanent home.

    Well, there is one change: it’s cheaper! You can order the book either from CUP directly, or from other outlets such as Amazon. Copies had been going for roughly $100, but the new version lists for only $65 — and if the Amazon page is to be believed, it’s currently on sale for an amazing $46. That’s a lot of knowledge for a minuscule price. I’d rush to snap up copies for you and your friends, if I were you.

    My understanding is that copies of the new version are not quite in stores yet, but they’re being printed and should be there momentarily. Plenty of time for courses being taught this Fall. (Apologies to anyone who has been looking for the book over the past couple of months, when it’s been stuck between publishers while we did the handover.)

    Again: it’s precisely the same book. I have thought about doing revisions to produce an actually new edition, but I think about many things, and that’s not a super-high priority right now. Maybe some day.

    Thanks to everyone who has purchased Spacetime and Geometry over the years, and said such nice things about it. Here’s to the next generation!

  • True Facts About Cosmology (or, Misconceptions Skewered)

    I talked a bit on Twitter last night about the Past Hypothesis and the low entropy of the early universe. Responses reminded me that there are still some significant misconceptions about the universe (and the state of our knowledge thereof) lurking out there. So I’ve decided to quickly list, in Tweet-length form, some true facts about cosmology that might serve as a useful corrective. I’m also putting the list on Twitter itself, and you can see comments there as well.

    1. The Big Bang model is simply the idea that our universe expanded and cooled from a hot, dense, earlier state. We have overwhelming evidence that it is true.
    2. The Big Bang event is not a point in space, but a moment in time: a singularity of infinite density and curvature. It is completely hypothetical, and probably not even strictly true. (It’s a classical prediction, ignoring quantum mechanics.)
    3. People sometimes also use “the Big Bang” as shorthand for “the hot, dense state approximately 14 billion years ago.” I do that all the time. That’s fine, as long as it’s clear what you’re referring to.
    4. The Big Bang might have been the beginning of the universe. Or it might not have been; there could have been space and time before the Big Bang. We don’t really know.
    5. Even if the BB was the beginning, the universe didn’t “pop into existence.” You can’t “pop” before time itself exists. It’s better to simply say “the Big Bang was the first moment of time.” (If it was, which we don’t know for sure.)
    6. The Borde-Guth-Vilenkin theorem says that, under some assumptions, spacetime had a singularity in the past. But it only refers to classical spacetime, so says nothing definitive about the real world.
    7. The universe did not come into existence “because the quantum vacuum is unstable.” It’s not clear that this particular “Why?” question has any answer, but that’s not it.
    8. If the universe did have an earliest moment, it doesn’t violate conservation of energy. When you take gravity into account, the total energy of any closed universe is exactly zero.
    9. The energy of non-gravitational “stuff” (particles, fields, etc.) is not conserved as the universe expands. You can try to balance the books by including gravity, but it’s not straightforward.
    10. The universe isn’t expanding “into” anything, as far as we know. General relativity describes the intrinsic geometry of spacetime, which can get bigger without anything outside.
    11. Inflation, the idea that the universe underwent super-accelerated expansion at early times, may or may not be correct; we don’t know. I’d give it a 50% chance, lower than many cosmologists but higher than some.
    12. The early universe had a low entropy. It looks like a thermal gas, but that’s only high-entropy if we ignore gravity. A truly high-entropy Big Bang would have been extremely lumpy, not smooth.
    13. Dark matter exists. Anisotropies in the cosmic microwave background establish beyond reasonable doubt the existence of a gravitational pull in a direction other than where ordinary matter is located.
    14. We haven’t directly detected dark matter yet, but most of our efforts have been focused on Weakly Interacting Massive Particles. There are many other candidates we don’t yet have the technology to look for. Patience.
    15. Dark energy may not exist; it’s conceivable that the acceleration of the universe is caused by modified gravity instead. But the dark-energy idea is simpler and a more natural fit to the data.
    16. Dark energy is not a new force; it’s a new substance. The force causing the universe to accelerate is gravity.
    17. We have a perfectly good, and likely correct, idea of what dark energy might be: vacuum energy, a.k.a. the cosmological constant. An energy inherent in space itself. But we’re not sure.
    18. We don’t know why the vacuum energy is much smaller than naive estimates would predict. That’s a real puzzle.
    19. Neither dark matter nor dark energy are anything like the nineteenth-century idea of the aether.

    Feel free to leave suggestions for more misconceptions. If they’re ones that I think many people actually have, I might add them to the list.

  • Thanksgiving

    This year we give thanks for an historically influential set of celestial bodies, the moons of Jupiter. (We’ve previously given thanks for the Standard Model Lagrangian, Hubble’s Law, the Spin-Statistics Theorem, conservation of momentum, effective field theory, the error bar, gauge symmetry, Landauer’s Principle, the Fourier Transform, Riemannian Geometry, the speed of light, and the Jarzynski equality.)

    For a change of pace this year, I went to Twitter and asked for suggestions for what to give thanks for in this annual post. There were a number of good suggestions, but two stood out above the rest: @etandel suggested Noether’s Theorem, and @OscarDelDiablo suggested the moons of Jupiter. Noether’s Theorem, according to which symmetries imply conserved quantities, would be a great choice, but in order to actually explain it I should probably first explain the principle of least action. Maybe some other year.

    And to be precise, I’m not going to bother to give thanks for all of Jupiter’s moons. Seventy-eight Jovian satellites have been discovered thus far, and most of them are just lucky pieces of space debris that wandered into Jupiter’s gravity well and never escaped. It’s the heavy hitters — the four Galilean satellites — that we’ll be concerned with here. They deserve our thanks, for at least three different reasons!

    Reason One: Displacing Earth from the center of the Solar System

    Galileo discovered the four largest moons of Jupiter — Io, Europa, Ganymede, and Callisto — back in 1610, and wrote about his findings in Sidereus Nuncius (The Starry Messenger). They were the first celestial bodies to be discovered using that new technological advance, the telescope. But more importantly for our present purposes, it was immediately obvious that these new objects were orbiting around Jupiter, not around the Earth.

    All this was happening not long after Copernicus had published his heliocentric model of the Solar System in 1543, offering an alternative to the prevailing Ptolemaic geocentric model. Both models were pretty good at fitting the known observations of planetary motions, and both required an elaborate system of circular orbits and epicycles — the realization that planetary orbits should be thought of as ellipses didn’t come along until Kepler published Astronomia Nova in 1609. As everyone knows, the debate over whether the Earth or the Sun should be thought of as the center of the universe was a heated one, with the Roman Catholic Church prohibiting Copernicus’s book in 1616, and the Inquisition putting Galileo on trial in 1633. (more…)

  • Atiyah and the Fine-Structure Constant

    Sir Michael Atiyah, one of the world’s greatest living mathematicians, has proposed a derivation of α, the fine-structure constant of quantum electrodynamics. A preprint is here. The math here is not my forte, but from the theoretical-physics point of view, this seems misguided to me.

    (He’s also proposed a proof of the Riemann hypothesis; I have zero insight to give there.)

    Caveat: Michael Atiyah is a smart cookie and has accomplished way more than I ever will. It’s certainly possible that, despite the considerations I mention here, he’s somehow onto something, and if so I’ll join in the general celebration. But I honestly think what I’m saying here is on the right track.

    In quantum electrodynamics (QED), α tells us the strength of the electromagnetic interaction. Numerically it’s approximately 1/137. If it were larger, electromagnetism would be stronger and atoms would be smaller, and the reverse if it were smaller. It’s the number that tells us the overall strength of QED interactions between electrons and photons, as calculated by diagrams like these.
    As Atiyah notes, in some sense α is a fundamental dimensionless numerical quantity like e or π. As such it is tempting to try to “derive” its value from some deeper principles. Arthur Eddington famously tried to derive exactly 1/137, but failed; Atiyah cites him approvingly.

    But to a modern physicist, this seems like a misguided quest. First, because renormalization theory teaches us that α isn’t really a number at all; it’s a function. In particular, it’s a function of the total amount of momentum involved in the interaction you are considering. Essentially, the strength of electromagnetism is slightly different for processes happening at different energies. Atiyah isn’t even trying to derive a function, just a number.
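    To illustrate what “α is a function” means: at one loop, keeping only the electron in the loop (a real calculation would include every charged particle, so this is a deliberate simplification), the leading-logarithm running is α(Q) = α₀ / (1 − (2α₀/3π) ln(Q/mₑ)). A quick numerical sketch:

```python
import math

alpha0 = 1 / 137.036   # measured low-energy value of the coupling
m_e = 0.000511         # electron mass in GeV

def alpha(Q):
    """One-loop, electron-only running coupling at momentum scale Q (in GeV)."""
    return alpha0 / (1 - (2 * alpha0 / (3 * math.pi)) * math.log(Q / m_e))

# At the Z-boson mass the effective coupling is noticeably stronger:
print(1 / alpha(91.19))   # roughly 134.5, compared to 137 at low energies
```

    (With the full particle content of the Standard Model included, the measured value at the Z mass is closer to 1/128; the electron-only estimate is just to show the trend.)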

    This is basically the objection given by Sabine Hossenfelder. But to be as charitable as possible, I don’t think it’s an absolutely knock-down objection. There is a limit we can take as the momentum goes to zero, at which point α is a single number. Atiyah mentions nothing about this, which should make us skeptical that he’s on the right track, but it’s conceivable.

    More important, I think, is the fact that α isn’t really fundamental at all. The Feynman diagrams we drew above are the simple ones, but for any given process there are also much more complicated ones, e.g.

    And in fact, the total answer we get depends not only on the properties of electrons and photons, but on all of the other particles that could appear as virtual particles in these complicated diagrams. So what you and I measure as the fine-structure constant actually depends on things like the mass of the top quark and the coupling of the Higgs boson. Again, nowhere to be found in Atiyah’s paper.

    Most importantly, in my mind, is that not only is α not fundamental, QED itself is not fundamental. It’s possible that the strong, weak, and electromagnetic forces are combined into some Grand Unified theory, but we honestly don’t know at this point. However, we do know, thanks to Weinberg and Salam, that the weak and electromagnetic forces are unified into the electroweak theory. In QED, α is related to the “elementary electric charge” e by the simple formula α = e²/4π. (I’ve set annoying things like Planck’s constant and the speed of light equal to one. And note that this e has nothing to do with the base of natural logarithms, e = 2.71828.) So if you’re “deriving” α, you’re really deriving e.

    But e is absolutely not fundamental. In the electroweak theory, we have two coupling constants, g and g’ (for “weak isospin” and “weak hypercharge,” if you must know). There is also a “weak mixing angle” or “Weinberg angle” θW relating how the original gauge bosons get projected onto the photon and W/Z bosons after spontaneous symmetry breaking. In terms of these, we have a formula for the elementary electric charge: e = g sinθW. The elementary electric charge isn’t one of the basic ingredients of nature; it’s just something we observe fairly directly at low energies, after a bunch of complicated stuff happens at higher energies.
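    As a numerical sanity check of these relations (using rough measured values, which are assumptions for illustration, not anything from Atiyah’s paper):

```python
import math

alpha = 1 / 137.036    # fine-structure constant (low-energy value)
sin2_thetaW = 0.2312   # sin^2 of the weak mixing angle (rough measured value)

e = math.sqrt(4 * math.pi * alpha)   # from alpha = e^2 / (4 pi)
g = e / math.sqrt(sin2_thetaW)       # from e = g * sin(theta_W)

print(round(e, 3), round(g, 3))      # roughly 0.303 and 0.63
```

    In natural units the “elementary electric charge” comes out around 0.3, and the more fundamental weak-isospin coupling g around 0.63, underlining the point that e is a derived quantity.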

    Not a whit of this appears in Atiyah’s paper. Indeed, as far as I can tell, there’s nothing in there about electromagnetism or QED; it just seems to be a way to calculate a number that is close enough to the measured value of α that he could plausibly claim it’s exactly right. (Though skepticism has been raised by people trying to reproduce his numerical result.) I couldn’t see any physical motivation for the fine-structure constant to have this particular value.

    These are not arguments why Atiyah’s particular derivation is wrong; they’re arguments why no such derivation should ever be possible. α isn’t the kind of thing for which we should expect to be able to derive a fundamental formula; it’s a messy low-energy manifestation of a lot of complicated inputs. It would be like trying to derive a fundamental formula for the average temperature in Los Angeles.

    Again, I could be wrong about this. It’s possible that, despite all the reasons why we should expect α to be a messy combination of many different inputs, some mathematically elegant formula is secretly behind it all. But knowing what we know now, I wouldn’t bet on it.

  • Mindscape Podcast

    For anyone who hasn’t been following along on other social media, the big news is that I’ve started a podcast, called Mindscape. It’s still young, but early returns are promising!

    I won’t be posting each new episode here; the podcast has a “blog” of its own, and episodes and associated show notes will be published there. You can subscribe by RSS as usual, or there is also an email list you can sign up for. For podcast aficionados, Mindscape should be available wherever finer podcasts are served, including iTunes, Google Play, Stitcher, Spotify, and so on.

    As explained at the welcome post, the format will be fairly conventional: me talking to smart people about interesting ideas. It won’t be all, or even primarily, about physics; much of my personal motivation is to get the opportunity to talk about all sorts of other interesting things. I’m expecting there will be occasional solo episodes that just have me rambling on about one thing or another.

    We’ve already had a bunch of cool guests; check these out:

    And there are more exciting episodes on the way. Enjoy, and spread the word!

  • On Civility

    Alex Wong/Getty Images

    White House Press Secretary Sarah Sanders went to have dinner at a local restaurant the other day. The owner, who is adamantly opposed to the policies of the Trump administration, politely asked her to leave, and she did. Now (who says human behavior is hard to predict?) an intense discussion has broken out concerning the role of civility in public discourse and our daily life. The Washington Post editorial board, in particular, called for public officials to be allowed to eat in peace, and people have responded in volume.

    I don’t have a tweet-length response to this, as I think the issue is more complex than people want to make it out to be. I am pretty far out to one extreme when it comes to the importance of engaging constructively with people with whom we disagree. We live in a liberal democracy, and we should value the importance of getting along even in the face of fundamentally different values, much less specific political stances. Not everyone is worth talking to, but I prefer to err on the side of trying to listen to and speak with as wide a spectrum of people as I can. Hell, maybe I am even wrong and could learn something.

    On the other hand, there is a limit. At some point, people become so odious and morally reprehensible that they are just monsters, not respected opponents. It’s important to keep in our list of available actions the ability to simply oppose those who are irredeemably dangerous/evil/wrong. You don’t have to let Hitler eat in your restaurant.

    This raises two issues that are not so easy to adjudicate. First, where do we draw the line? What are the criteria by which we can judge someone to have crossed over from “disagreed with” to “shunned”? I honestly don’t know. I tend to err on the side of not shunning people (in public spaces) until it becomes absolutely necessary, but I’m willing to have my mind changed about this. I also think the worry that this particular administration exhibits authoritarian tendencies that could lead to a catastrophe is not a completely silly one, and is at least worth considering seriously.

    More importantly, if the argument is “moral monsters should just be shunned, not reasoned with or dealt with constructively,” we have to be prepared to be shunned ourselves by those who think that we’re moral monsters (and those people are out there).  There are those who think, for what they take to be good moral reasons, that abortion and homosexuality are unforgivable sins. If we think it’s okay for restaurant owners who oppose Trump to refuse service to members of his administration, we have to allow staunch opponents of e.g. abortion rights to refuse service to politicians or judges who protect those rights.

    The issue becomes especially tricky when the category of “people who are considered to be morally reprehensible” coincides with an entire class of humans who have long been discriminated against, e.g. gays or transgender people. In my view it is bigoted and wrong to discriminate against those groups, but there exist people who find it a moral imperative to do so. A sensible distinction can probably be made between groups that we as a society have decided are worthy of protection and equal treatment regardless of an individual’s moral code, so it’s at least consistent to allow restaurant owners to refuse to serve specific people they think are moral monsters because of some policy they advocate, while still requiring that they serve members of groups whose behaviors they find objectionable.

    The only alternative, as I see it, is to give up on the values of liberal toleration, and to simply declare that our personal moral views are unquestionably the right ones, and everyone should be judged by them. That sounds wrong, although we do in fact enshrine certain moral judgments in our legal codes (murder is bad) while leaving others up to individual conscience (whether you want to eat meat is up to you). But it’s probably best to keep that moral core that we codify into law as minimal and widely-agreed-upon as possible, if we want to live in a diverse society.

    This would all be simpler if we didn’t have an administration in power that actively works to demonize immigrants and non-straight-white-Americans more generally. Tolerating the intolerant is one of the hardest tasks in a democracy.


  • Intro to Cosmology Videos

    In completely separate video news, here are videos of lectures I gave at CERN several years ago: “Cosmology for Particle Physicists” (May 2005). These are slightly technical — at the very least they presume you know calculus and basic physics — but are still basically accurate despite their age.

    1. Introduction to Cosmology
    2. Dark Matter
    3. Dark Energy
    4. Thermodynamics and the Early Universe
    5. Inflation and Beyond

    Update: I originally linked these from YouTube, but apparently they were swiped from this page at CERN, and have been taken down from YouTube. So now I’m linking directly to the CERN copies. Thanks to commenters Bill Schempp and Matt Wright.

  • User-Friendly Naturalism Videos

    Some of you might be familiar with the Moving Naturalism Forward workshop I organized way back in 2012. For two and a half days, an interdisciplinary group of naturalists (in the sense of “not believing in the supernatural”) sat around to hash out the following basic question: “So we don’t believe in God, what next?” How do we describe reality, how can we be moral, what are free will and consciousness, those kinds of things. Participants included Jerry Coyne, Richard Dawkins, Terrence Deacon, Simon DeDeo, Daniel Dennett, Owen Flanagan, Rebecca Newberger Goldstein, Janna Levin, Massimo Pigliucci, David Poeppel, Nicholas Pritzker, Alex Rosenberg, Don Ross, and Steven Weinberg.

    Happily we recorded all of the sessions to video, and put them on YouTube. Unhappily, those were just unedited proceedings of each session — so ten videos, at least an hour and a half each, full of gems but without any very clear way to find them if you weren’t patient enough to sift through the entire thing.

    No more! Thanks to the heroic efforts of Gia Mora, the proceedings have been edited down to a number of much more accessible and content-centered highlights. There are over 80 videos (!), with a median length of maybe 5 minutes, though they range up to about 20 minutes and down to less than one. Each video centers on a particular idea, theme, or point of discussion, so you can dive right into whatever particular issues you may be interested in. Here, for example, is a conversation on “Mattering and Secular Communities,” featuring Rebecca Goldstein, Dan Dennett, and Owen Flanagan.

    Mattering and Secular Communities: Rebecca Goldstein et al

    The videos can be seen on the workshop web page, or on my YouTube channel, where they’re divided into categories.

    A lot of good stuff in there. Enjoy!

  • Stephen Hawking’s Scientific Legacy

    Stephen Hawking died Wednesday morning, age 76. Plenty of memories and tributes have been written, including these by me:

    I can also point to my Story Collider story from a few years ago, about how I turned down a job offer from Hawking, and eventually took lessons from his way of dealing with the world.

    Of course Hawking has been mentioned on this blog many times.

    When I started writing the above pieces (mostly yesterday, in a bit of a rush), I stumbled across this article I had written several years ago about Hawking’s scientific legacy. It was solicited by a magazine at a time when Hawking was very ill and people thought he would die relatively quickly — it wasn’t the only time people thought that, only to be proven wrong. I’m pretty sure the article was never printed, and I never got paid for it; so here it is!

    (If you’re interested in a much better description of Hawking’s scientific legacy by someone who should know, see this article in The Guardian by Roger Penrose.)

    Stephen Hawking’s Scientific Legacy

    Stephen Hawking is the rare scientist who is also a celebrity and cultural phenomenon. But he is also the rare cultural phenomenon whose celebrity is entirely deserved. His contributions can be characterized very simply: Hawking contributed more to our understanding of gravity than any physicist since Albert Einstein.

    “Gravity” is an important word here. For much of Hawking’s career, theoretical physicists as a community were more interested in particle physics and the other forces of nature — electromagnetism and the strong and weak nuclear forces. “Classical” gravity (ignoring the complications of quantum mechanics) had been figured out by Einstein in his theory of general relativity, and “quantum” gravity (creating a quantum version of general relativity) seemed too hard. By applying his prodigious intellect to the most well-known force of nature, Hawking was able to come up with several results that took the wider community completely by surprise.

    By acclamation, Hawking’s most important result is the realization that black holes are not completely black — they give off radiation, just like ordinary objects. Before that famous paper, he proved important theorems about black holes and singularities, and afterward studied the universe as a whole. In each phase of his career, his contributions were central.
