Humanity

The Future of Democratic Values

Hey, did you know we are having an election here in the United States? I think I saw it mentioned on TV. Whatever your preferences may be, everyone eligible should try to get out and vote.

This election has, without a doubt, been somewhat unique. I’m cautiously optimistic that Hillary Clinton will win, that we will celebrate the election of the first female President in the history of the republic, and that she will do a relatively good job — although as a good Bayesian I know that empirical predictions are never certain, and in an atmosphere like this uncertainty runs relatively high.

Even if Clinton wins and the U.S. avoids complete embarrassment, I’m still very worried about what this election has revealed about the state of the country. No matter who our next President might be, there are real reasons to be concerned that the U.S. is veering away from some of the foundational principles that are necessary to a functioning democracy. That may sound alarmist, but I don’t think it’s unwarranted. Historically, democracies don’t always last forever; we’d be foolish to think that it can’t happen here.

This isn’t a worry about the specific horrible wrongness of Donald Trump — it’s a worry about the forces that propelled him to the nomination of one of our two major political parties, and the fires he so willingly stoked along the way. Just as a quick and hopelessly incomplete recap:

  • Trump built his early political notoriety via “birtherism,” explicitly working to undermine the legitimacy of our elected President.
  • He has continually vilified immigrants and foreigners generally, promoting an us-against-them mentality between people of different races and ethnicities.
  • He has pledged to violate the Constitutional principle of freedom of religion, from banning Muslims from entering the country to tracking those who are here.
  • His campaign, and the Republican party more generally, has openly engaged in suppressing the vote from groups unlikely to support him. (“‘We have three major voter suppression operations underway,’ says a senior [Trump] official.”)
  • He has glorified violence against protesters who disagree with him.
  • He has lied at an unprecedented, astonishing rate, secure in the knowledge that his statements will be taken as true by a large fraction of his intended audience.
  • He has presented himself as a uniquely powerful strongman who can solve problems through his personal force of will, and has spoken admiringly of dictators from Vladimir Putin to Kim Jong-un to Saddam Hussein.
  • He has vowed that if he wins the election, he will seek vengeance on those who opposed him, including throwing his opponent into prison.
  • He has repeatedly cast doubt on the legitimacy of the election outcome, implying that he would refuse to accept the result if he lost.
  • He has pointed fingers at a shadowy global conspiracy in charge of world finance, often with explicitly anti-Semitic overtones.
  • Several Republican politicians have broached the prospect of refusing to confirm any Supreme Court nominees from a Democratic President.
  • A government agency, the FBI, has interfered in a Presidential election.
  • Republicans have accused Democratic officeholders of being traitors.
  • A number of Trump supporters have spoken of the prospect of violent resistance if Clinton is elected.

This is not a list of “why Donald Trump is a bad person who is disastrously unqualified for the Presidency”; that would be much longer. Rather, I wanted to highlight features of the campaign that are specifically attacks on (small-“d”) democratic norms and values. The assumptions, often unspoken, by which legitimate political opponents have generally agreed to operate over the course of the last two centuries and more. Not all of them, of course; there are glaring exceptions, authoritarians who have run roughshod over one or more of these norms in the name of personal glory. History generally looks down upon them, and we consider ourselves fortunate that they didn’t have greater success. But fortune can run out.

The most worrisome aspect of the situation is the very real prospect that these attacks on the foundations of liberal democracy will not simply disappear once Donald Trump rides off into the gold-plated sunset; that they will be seized upon and deployed by other politicians who couldn’t help but notice Trump’s success. If that’s the case, we will have a real reason to be concerned that American democracy will stop working, perhaps sooner rather than later. I don’t think it’s likely that such a disastrous scenario would come to pass, but one has to balance the small likelihood against the devastating consequences — and right now the probability seems closer to 0.05 than to 10⁻⁵.

Democracy is a curious and fragile thing. It’s not just “majority rules”; crucial to the project are the ideas that (1) minority rights are still respected, and (2) in return, losing minorities respect electoral outcomes. It’s the second of these that is under siege at the moment. Since the time of the Federalist Papers, it’s been understood that democracy is an attempt to provide common self-rule for people who don’t agree on everything, but who at least share the common values of democracy itself. Having strong, even extremely passionate, political disagreements is inevitable in a democratic system. The question is whether we cast those with whom we disagree as enemies, traitors, and cheaters who must be opposed in every measure at every turn; or as partners in a grand project with whom we can fiercely disagree and yet still work.

I don’t claim to have a complete understanding of how we got to this precarious point, though there are a number of factors that certainly have contributed. …

Youthful Brilliance

A couple of weeks ago I visited the Texas A&M Physics and Engineering Festival. It was a busy trip — I gave a physics colloquium and a philosophy colloquium as well as a public talk — but the highlight for me was an hourlong chat with the Davidson Young Scholars, who had traveled from across the country to attend the festival.

The Davidson Young Scholars program is an innovative effort to help nurture kids who are at the very top of intellectual achievement in their age group. Every age and ability group poses special challenges to educators, and deserves attention and curricula that are adjusted for their individual needs. That includes the most high-achieving ones, who easily become bored and distracted when plopped down in an average classroom. Many of them end up being home-schooled, simply because school systems aren’t equipped to handle them. So the DYS program offers special services, including most importantly a chance to meet other students like themselves, and occasionally go out into the world and get the kind of stimulation that is otherwise hard to find.

These kids were awesome. I chatted just very briefly, telling them a little about what I do and what it means to be a theoretical physicist, and then we had a free-flowing discussion. At some point I mentioned “wormholes” and it was all over. These folks love wormholes and time travel, and many of them had theories of their own, which they were eager to come to the board and explain to all of us. It was a rollicking, stimulating, delightful experience.

You can see from the board that I ended up talking about Einstein’s equation. Not that I was going to go through all of the mathematical details or provide a careful derivation, but I figured that was something they wouldn’t typically be exposed to by either their schoolwork or popular science, and it would be fun to give them a glimpse of what lies ahead if they study physics. Everyone’s life is improved by a bit of exposure to Einstein’s equation.

The kids are all right. If we old people don’t ruin them, the world will be in good hands.

Being: Human

Anticipation is growing — in my own mind, if nowhere else — for the release of The Big Picture, which will be out on May 10. I’ve finally been able to hold the physical book in my hand, which is always a highlight of the book-writing process. And yes, there will be an audio book, which should come out at the same time. I spent several days in March recording it, which taught me a valuable lesson: write shorter books.

There will also be something of a short book tour, hitting NYC, DC, Boston, Seattle, and the Bay Area, likely with a few other venues to be added later this summer. For details see my calendar page.

In many ways, the book is a celebration of naturalism in general, and poetic naturalism in particular. So to get you in the mood, here is a lovely short video from the Mothlight Creative, which celebrates naturalism in a more visual and visceral way. “I want to shiver with awe and wonder at the universe.”

Being: Human from Mothlight Creative on Vimeo.

From Child Preacher to Adult Humanist

One of the best experiences I had at last year’s Freedom From Religion Foundation convention was listening to this wonderful talk by Anthony Pinn. (Talk begins at 5:40.)

Anthony Pinn: The End of God-Talk

Pinn, growing up outside Buffalo NY, became a preacher in his local church at the ripe young age of 12. Now, there’s nothing an audience of atheists likes better than a story of someone who was devoutly religious and later does an about-face to embrace atheism. (Not an uncommon path, with many possible twists, as you can read in Daniel Dennett and Linda LaScola’s Caught in the Pulpit.) And Pinn gives it to us, hitting all the best notes: being in love with Jesus but also with thinking critically, being surprised to meet theologians who read the Bible as literature rather than as The Word, and ultimately losing his faith entirely while studying at Harvard Divinity School.

But there’s a lot more to his message than a congratulatory triumph of rationality over superstition. Through his life, Pinn has been concerned with the effect that ideas and actions have on real people, especially the African-American community. His mother always reminded him to “move through the world knowing your footsteps matter,” valuable advice no matter what your ontological orientation might be.

This comes out in the Q&A period — often not worth listening to, but in this case it’s the highlight of the presentation. The audience of atheists is looking for yet more self-affirmation, demanding to know why more Blacks haven’t accepted the truth of a secular worldview. Pinn is very frank: naturalism hasn’t yet offered African-Americans a “soft landing.” Too many atheists, he points out, spend a lot of time critiquing religious traditions, and a lot of time patting themselves on the back for being rational and fair-minded, and not nearly enough time constructing something positive, a system of networks and support structures free of the spiritual trappings. It’s a good message for us to hear.

It would have been fantastic to have Anthony at Moving Naturalism Forward. Next time! (Not that there are currently any plans for a next time.)

Life Is the Flame of a Candle

Last October I was privileged to be awarded the Emperor Has No Clothes award from the Freedom From Religion Foundation. The physical trophy consists of the dashing statuette here on the right, presumably the titular Emperor. It’s made by the same company that makes the Academy Award trophies. (Whenever I run into Meryl Streep, she just won’t shut up about how her Oscars are produced by the same company that does the Emperor’s New Clothes award.)

Part of the award-winning is the presentation of a short speech, and I wasn’t sure what to talk about. There are only so many things I have to say, but it’s boring to talk about the same stuff over and over again. More importantly, I have no real interest in giving religion-bashing talks; I care a lot more about doing the hard and constructive work of exploring the consequences of naturalism.

So I decided on a cheerful topic: Death and Physics. I talked about how modern science gives us very good reasons to believe (not a proof, never a proof) that there is no such thing as an afterlife. Life is a process, not a substance, and it’s a process that begins, proceeds along for a while, and comes to an end. Certainly something I’ve said before, e.g. in my article on Physics and the Immortality of the Soul, and in the recent Afterlife Debate, but I added a bit more here about entropy, complexity, and what we mean by the word “life.”

If you’re in a reflective mood, here it is. I begin at around 3:50. One of the points I tried to make is that the finitude of life has its upside. Every moment is precious, and what we should value is what is around us right now — because that’s all there is. It’s a scary but exhilarating view of the world.

Sean Carroll: Has Science Refuted Religion

How to Communicate on the Internet

Let’s say you want to communicate an idea X.

You would do well to simply say “X.”

Also acceptable is “X. Really, just X.”

A slightly riskier strategy, in cases where miscomprehension is especially likely, would be something like “X. This sounds a bit like A, and B, and C, but I’m not saying those. Honestly, just X.” Many people will inevitably start arguing against A, B, and C.

Under no circumstances should you say “You might think Y, but actually X.”

Equally bad, perhaps worse: “Y. Which reminds me of X, which is what I really want to say.”

For examples see the comment sections of the last couple of posts, or indeed any comment section anywhere on the internet.

It is possible these ideas may be of wider applicability in communication situations other than the internet.

(You may think this is just grumping but actually it is science!)

Troublesome Speech and the UIUC Boycott

Self-indulgently long post below. Short version: Steven Salaita, an associate professor of English at Virginia Tech who had been offered and accepted a faculty job at the University of Illinois Urbana-Champaign, had his offer rescinded when the administration discovered that he had posted inflammatory tweets about Israel, such as “At this point, if Netanyahu appeared on TV with a necklace made from the teeth of Palestinian children, would anybody be surprised? #Gaza.” Many professors in a number of disciplines, without necessarily agreeing with Salaita’s statements, believe strongly that academic norms give him the right to say them without putting his employment in jeopardy, and have organized a boycott of UIUC in response. Alan Sokal of NYU is supporting the boycott, and has written a petition meant specifically for science and engineering faculty, who are welcome to sign if they agree.

Everyone agrees that “free speech” is a good thing. We live in a society where individual differences are supposed to be respected, and we profess admiration for the free market of ideas, where competing claims are discussed and subjected to reasonable critique. (Thinking here of the normative claim that free speech is a good thing, not legalistic issues surrounding the First Amendment and government restrictions.) We also tend to agree that such freedom is not absolute; you don’t have the right to come into my house (or the comment section of my blog) and force me to listen to your new crackpot theory of physics. A newspaper doesn’t have an obligation to print something just because you wrote it. Biology conferences don’t feel any need to give time to young-Earth creationists. In a classroom, teachers don’t have to sit quietly if a student wants to spew blatantly racist invective (and likewise for students while teachers do so).

So there is a line to be drawn, and figuring out where to draw it isn’t an easy task. It’s not hard to defend people’s right to say things we agree with; the hard part is defending speech we disagree with. And some speech, in certain circumstances, really isn’t worth defending — organizations have the right to get rid of employees who are (for example) consistently personally abusive to their fellow workers. The hard part — and it honestly is difficult — is to distinguish between “speech that I disagree with but is worth defending” and “speech that is truly over the line.”

To complicate matters, people who disagree often become — how to put this delicately? — emotional and polemical rather than dispassionate and reasonable. People are very people-ish that way. Consequently, we are often called upon to defend speech that we not only disagree with, but whose tone and connotation we find off-putting or even offensive. Those who would squelch disagreeable speech therefore have an easy out: “I might not agree with what they said, but what I really can’t countenance is the way they said it.” If we really buy the argument that ideas should be free and rational discourse between competing viewpoints is an effective method of discovering truth and wisdom, we have to be especially willing to defend speech that is couched in downright objectionable terms.

As an academic and writer, in close cases I will almost always fall on the side of defending speech even if I disagree with it (or how it is said). Recently several different cases have illustrated just how tricky this is — but in each case I think that the people in question have been unfairly punished for things they have said. …

The Branch We Were Sitting On

In the latest issue of the New York Review, Cathleen Schine reviews Levels of Life, a new book by Julian Barnes. It’s described as a three-part meditation on grief, following the death of Barnes’s wife Pat Kavanagh.

One of the things that is of no solace to Barnes (and there are many) is religion. He writes:

When we killed–or exiled–God, we also killed ourselves…. No God, no afterlife, no us. We were right to kill Him, of course, this long-standing imaginary friend of ours. And we weren’t going to get an afterlife anyway. But we sawed off the branch we were sitting on. And the view from there, from that height–even if it was only an illusion of a view–wasn’t so bad.

I can’t disagree. Atheists often proclaim the death of God in positively gleeful terms, but it’s important to recognize what was lost–a purpose in living, a natural place in the universe. The loss is not irretrievable; there is nothing that stops us from creating our own meaning even if there’s no supernatural overseer to hand one to us. But it’s a daunting task, one to which we haven’t really faced up.

Billions of Worlds

I’m old enough to remember when we had nine planets in the Solar System, and zero outside. The news since then has been mixed. Here in our neighborhood we’re down to only eight planets; but in the wider galaxy, we’ve obtained direct evidence for about a thousand, with another several thousand candidates. [Thanks to Peter Edmonds for a correction there.] Now that we have real data, what used to be guesswork gives way to best-fit statistical inference. How many potentially habitable planets are there in the Milky Way, given some supposition about what counts as “habitable”? Well, there are about 200 billion stars in the galaxy. And about one in five are roughly Sun-like. And now our best estimate is that about one in five of them has a somewhat Earth-like planet. So you do the math: about eight billion Earth-like planets. (Here’s the PNAS paper, by Petigura, Howard, and Marcy.)
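You can redo that back-of-envelope math yourself. Here is a minimal sketch, using the rough figures quoted above (200 billion stars, one in five Sun-like, one in five of those with an Earth-like planet):

```python
# Back-of-envelope count of Earth-like planets in the Milky Way,
# using the rough figures quoted in the text.
stars_in_galaxy = 200e9   # about 200 billion stars
frac_sun_like = 1 / 5     # roughly one in five stars is Sun-like
frac_earth_like = 1 / 5   # about one in five of those hosts an Earth-like planet

earth_like = stars_in_galaxy * frac_sun_like * frac_earth_like
print(f"Earth-like planets: about {earth_like:.0e}")  # about 8e+09
```

Multiplying the three rough numbers gives the eight billion quoted in the text; the uncertainty in each factor means the answer is good to maybe a factor of a few, no better.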

“Earth-like” doesn’t mean “littered with human-esque living organisms,” of course. The number of potentially habitable planets is a big number, but to get the number of intelligent civilizations we need to multiply by the fraction of such planets that are home to such civilizations. And we don’t know that.

It’s surprising how many people resist this conclusion. To drive it home, consider a very simplified model of the Drake equation.

x = a · b.

x equals a times b. Now I give you a, and ask you to estimate x. Well, you can’t. You don’t know b. In the abstract this seems obvious, but there’s a temptation to think that if a (the number of Earth-like planets) is really big, then x (the number of intelligent civilizations) must be pretty big too. As if it’s just not possible that b (the fraction of Earth-like planets with intelligent life) could be that small. But it could be! It could be 10⁻¹⁰⁰, in which case there could be billions of Earth-like planets for every particle in the observable universe and still it would be unlikely that any of the others contained intelligent life. Our knowledge of how easy it is for life to start, and what happens once it does, is pretty pitifully bad right now.
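To see numerically how a huge a is helpless against a tiny b, here is a quick sketch. The values of b are made up purely for illustration; the real value is unknown:

```python
# x = a * b: expected number of intelligent civilizations.
# The trial values of b below are illustrative, not data.
a = 8e9  # Earth-like planets in the galaxy (the estimate quoted in the text)
for b in (1e-2, 1e-10, 1e-100):  # fraction with intelligent life: unknown
    x = a * b
    print(f"b = {b:.0e}  ->  expected civilizations: {x:.0e}")
```

With b = 10⁻¹⁰⁰, the expected number comes out around 10⁻⁹⁰: effectively zero, no matter how generous the planet count.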

On the other hand — maybe b isn’t that small, and there really are (or perhaps “have been”) many other intelligent civilizations in the Milky Way. No matter what UFO enthusiasts might think, we haven’t actually found any yet. The galaxy is big, but its spatial extent (about a hundred thousand light-years) is not all that forbidding when you compare to its age (billions of years). It wouldn’t have been that hard for a plucky civilization from way back when to colonize the galaxy, whether in person or using self-replicating robots. It’s not the slightest bit surprising (to me) that we haven’t heard anything by pointing radio telescopes at the sky — beaming out electromagnetic radiation in all directions seems like an extraordinarily wasteful way to go about communicating. Much better to send spacecraft to lurk around likely star systems, à la the monolith from 2001. But we haven’t found any such thing, and 2001 was over a decade ago. That’s the Fermi paradox — where is everyone?

It isn’t hard to come up with solutions to the Fermi paradox. Maybe life is just rare, or maybe intelligence generally leads to self-destruction. I don’t have strong feelings one way or another, but I suspect that more credence should be given to a somewhat disturbing possibility: the Enlightenment/Boredom Hypothesis (EBH).

The EBH is basically the idea that life is kind of like tic-tac-toe. It’s fun for a while, but eventually you figure it out, and after that it gets kind of boring. Or, in slightly more exalted terms, intelligent beings learn to overcome the petty drives of the material world, and come to an understanding that all that strife and striving was to no particular purpose. We are imbued by evolution with a desire to survive and continue the species, but perhaps a sufficiently advanced civilization overcomes all that. Maybe they perfect life, figure out everything worth figuring out, and simply stop.

I’m not saying the EBH is likely, but I think it’s on the table as a respectable possibility. The Solar System is over four billion years old, but humans reached behavioral modernity only a few tens of thousands of years ago, and figured out how to do science only a few hundred years ago. Realistically, there’s no way we can possibly predict what humanity will evolve into over the next few hundreds of thousands or millions of years. Maybe the swashbuckling, galaxy-conquering impulse is something that intelligent species rapidly outgrow or grow tired of. It’s an empirical question — we should keep looking, not be discouraged by speculative musings for which there’s little evidence. While we’re still in swashbuckling mode, there’s no reason we shouldn’t enjoy it a little.

Don’t Start None, Won’t Be None

[Final update: DNLee’s blog post has been reinstated at Scientific American. I’m therefore removing it from here; traffic should go to her.]

[Update: The original offender, “Ofek” at Biology Online, has now been fired, and the organization has apologized. Scientific American editor Mariette DiChristina has also offered a fuller explanation.]

Something that happens every day, to me and many other people who write things: you get asked to do something for free. There’s an idea that mere “writing” isn’t actually “work,” and besides which “exposure” should be more than enough recompense. (Can I eat exposure? Can I smoke it?)

You know, that’s okay. I’m constantly asking people to do things for less recompense than their time is worth; it’s worth a shot. For a young writer who is trying to build a career, exposure might actually be valuable. But most of the time the writer will politely say no and everyone will move on.

For example, just recently an editor named “Ofek” at Biology-Online.org asked DNLee to provide some free content for him. She responded with:

Thank you very much for your reply.
But I will have to decline your offer.
Have a great day.

Here’s what happens less often: the person asking for free content, rather than moving on, responds by saying

Because we don’t pay for blog entries?
Are you an urban scientist or an urban whore?

Where I grew up, when people politely turn down your request for free stuff, it’s impolite to call them a “whore.” It’s especially bad when you take into account the fact that we live in a world where women are being pushed away from science, one where how often your papers get cited correlates strongly with your gender, and so on.

DNLee was a bit taken aback, with good reason. So she took to her blog to respond. It was a colorful, fun, finely-crafted retort — and also very important, because this is the kind of stuff that shouldn’t happen in this day and age. Especially because the offender isn’t just some kid with a website; Biology Online is a purportedly respectable site, part of the Scientific American “Partners Network.” One would hope that SciAm would demand an apology from Ofek, or consider cutting their ties with the organization.

Sadly that’s not what happened. If you click on the link in the previous paragraph, you’ll get an error. That’s because Scientific American, where DNLee’s blog is hosted, decided it wasn’t appropriate and took it down.

It’s true that this particular post was not primarily concerned with conveying substantive scientific content. Like, you know, countless other posts on the SciAm network, or most other blogs. But it wasn’t about gossip or what someone had for lunch, either; interactions between actual human beings engaged in the communication of scientific results actually is a crucial part of the science/culture/community ecosystem. DNLee’s post was written in a jocular style, but it wasn’t only on-topic, it was extremely important. Taking it down was exactly the wrong decision.

I have enormous respect for Scientific American as an institution, so I’m going to hope that this is a temporary mistake, and after contemplating a bit they decide to do the right thing, restoring DNLee’s post and censuring the guy who called her a whore. But meanwhile, I’m joining others by copying the original post here. Ultimately it’s going to get way more publicity than it would have otherwise. Maybe someday people will learn how the internet works.

Here is DNLee. (Words cannot express how much I love the final picture.)

——————————————————–

(This is where I used to mirror the original blog post, which has now been restored.)
