Episode 1: Carol Tavris on Mistakes, Justification, and Cognitive Dissonance

For the first full episode of Mindscape, it's an honor to welcome social psychologist Carol Tavris. Her book with co-author Elliot Aronson, Mistakes Were Made (But Not by Me), explores the effect that cognitive dissonance has on how we think. We talk about the fascinating process by which people justify the mistakes that they make, and how that leads to everything from false memories to political polarization.

Carol Tavris received her Ph.D. in social psychology from the University of Michigan. She is the author of numerous books, covering topics such as gender, biology, and emotion, and is a frequent contributor to a variety of newspapers and magazines. She is a Fellow of the American Psychological Association, the Association for Psychological Science, and the Committee for Skeptical Inquiry.

Download Episode

0:00:00 Sean Carroll: Hello, everyone, and welcome to the Mindscape podcast. I'm your host, Sean Carroll, and I have a confession to make. Namely, I have made mistakes. Over the course of my career as a scientist and a writer, and for that matter over the course of my life as a human being, when faced with decisions I have sometimes made the wrong choice. Maybe you have too, and happily, we have on today's podcast the perfect person to talk about this idea that people make mistakes: Dr. Carol Tavris, who's one of the world's experts, not on making mistakes or the ways that we make mistakes, but on what happens after we make mistakes.

0:00:40 SC: In a masterful book that she wrote with her co-author Elliot Aronson called Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, Dr. Tavris talks about the idea of cognitive dissonance and how we are forced by cognitive dissonance to come up with excuses and justifications for the mistakes that we make. We might like to imagine that we are perfectly rational, reasonable beings, but the psychological truth is very much the opposite of that. We make up reasons why something was the right choice after all, or why we couldn't have had any other option but to make it.

0:01:19 SC: Cognitive dissonance theory explains why we do this, and in some sense, it's the mother of all cognitive biases. It helps us understand why we're not the perfectly rational, reasonable people that we pretend to be. And Carol Tavris is a perfect person to talk about this with. She's the author of several popular books and also a famous textbook on psychology that you might have used in your intro psych course way back in the day. Other books include Psychobabble and Biobunk: Using Psychology to Think Critically About Issues in the News; and The Mismeasure of Woman: Why Women Are Not the Better Sex, the Inferior Sex, or the Opposite Sex. And in September, she has a new book coming out called Estrogen Matters: Why Taking Hormones in Menopause Can Improve Women's Well-Being and Lengthen Their Lives, Without Raising the Risk of Breast Cancer. Now, I have to confess, I'm not an expert on estrogen or menopause or hormones, so that's not what we're talking about in this podcast. But then again, maybe that is a mistake that I'm making; after all, that's what we will be talking about: how we make mistakes. So maybe I should read up on estrogen and hormones, and then we can bring Dr. Tavris back and have a conversation about that. Today we're going to be talking about cognitive dissonance and why it makes us think that we never actually really make mistakes. So let's start.

[music]

0:02:58 SC: Carol Tavris, welcome to the Mindscape podcast.

0:03:00 Carol Tavris: Very happy to be talking with you.

0:03:02 SC: So you are a social psychologist, right? Is that the correct descriptor?

0:03:06 CT: It is.

0:03:06 SC: As opposed to cognitive psych... What are the different kinds of psychologist one could be?

0:03:10 CT: Alright, hardly anyone knows what a social psychologist is. They think you're a therapist who gives a lot of parties, you're just very sociable. [chuckle]

0:03:18 SC: You are social.

0:03:18 CT: You are social, that's what it must be. Social psychology was a field born at the intersection of sociology, the study of groups and society, and psychology, the study of the individual. And fundamentally, it is the study of every social influence on human beings, which is to say everything: influences on how we think, how we behave, influences of the environment on our behavior, influences of our culture on our thinking. The thing that's interesting about social psychology to me is that it's the field that really looks at something that many Americans don't like, which is the idea that we really are influenced by our environments, in ways that we don't realize and don't really want to realize because, after all, we're all individualists, we make up our own minds, we're very independent.

0:04:11 SC: The American dream. Yeah.

0:04:12 CT: The American dream. So I always found social psychology wonderfully interesting and rich because there's no topic it does not cover, from love to hate. My dear co-author and very good friend Elliot Aronson said that when he was a student, he was drawn to social psychology because clinical psychology is about repair, but social psychology is about change. It's a more optimistic approach to the world, because by understanding how we as individuals are influenced by what is happening in our world, we can actually take some steps to improve things.

0:04:53 SC: But you don't see patients?

0:04:54 CT: No.

0:04:56 SC: So you're not... That would be a clinical psychologist.

0:04:57 CT: That would be a clinical psychologist. In the field of psychology, we actually speak about the difference between the way clinical psychologists are trained, or I should say psychotherapists are trained, and the way psychological scientists are trained. This creates what we speak of as the scientist-practitioner gap, if you will. And "psychotherapist," by the way, is an unregulated word. You could put out a shingle tomorrow doing broccoli therapy, and it would be completely legal, but please don't do that.

0:05:25 SC: I might do that.

0:05:27 CT: Not broccoli, not broccoli, really. So, with psychotherapy, anybody could be anything and call themselves a psychotherapist. Clinical psychologists have a PhD in the field of clinical psychology, and their field is the study of psychological disorders and problems. Their training varies from very good scientific training to zero, okay. So the rest of psychology, the psychological sciences, if you will, specify particular areas of study: developmental psychologists study development from infancy to death, social psychologists do what I just described, cognitive psychologists study the mind, thinking, memory and so forth. So those are different specialties within psychological science. But most people to this day, when they hear "psychologist," they think you're a therapist.

0:06:19 SC: Yeah, they think that there's a couch in your house, in your...

0:06:20 CT: Absolutely. And not just a therapist, by the way, the couches have been out for decades, decades...

0:06:25 SC: Maybe you could bring the couches back.

0:06:26 CT: Well, there you go.

0:06:27 SC: I mean...

0:06:27 CT: Your shingle, your therapy can have the couch if you want.

0:06:30 SC: Right, I don't... How about chocolate instead of broccoli?

0:06:32 CT: Well...

0:06:32 SC: Chocolate therapy, couches, candles, the whole bit.

0:06:35 CT: In our textbook with Carole Wade... That Carole Wade and I did for so many years, we actually proposed chocolate immersion therapy as probably the most beneficial thing you could do.

0:06:45 SC: I like that.

0:06:45 CT: If I weren't so damned ethical, I could be really rich.

0:06:48 SC: We all have that problem, exactly. And so, you have various arrows in your quiver. I know you've written a number of books, but you've written one of my favorite books of all time, Mistakes Were Made (But Not by Me), and this is on the subject of cognitive dissonance. So why don't you tell us a little bit about, number one, your co-author, who you already mentioned, Elliot Aronson, and how the book came to be and what cognitive dissonance is.

0:07:14 CT: In one sentence or three?

0:07:15 SC: You get all the hours you want.

0:07:17 CT: I have many sentences. Well, so I guess first I should say what cognitive dissonance is; every social psychology student learns this. I did when I was in graduate school a million years ago. It's a simple mechanism, but it's developed over the decades into a really rich theory for understanding why it is that when you tell somebody in a kind and warm and friendly way, "Here's some information that shows why you're wrong," they rarely thank you. They tell you where you can go with your findings and what you can do with them, and they're just not inclined to be grateful.

0:07:49 SC: I have no idea what you're talking about. I've never experienced this.

0:07:50 CT: I have... No, you don't, you've never experienced this. So you know, my friends still say to me to this day, "Oh, Carol, you're so delusionally dear to think that just by explaining to people, 'Here's an important finding you need to know,' they will be grateful." So all that cognitive dissonance really means is the uncomfortable feeling we have when two ideas clash with each other, conflict with each other. It can be two ideas, two beliefs, or it can be, more commonly, an attitude and a behavior. So the classic example is smoking. A person who smokes knows that smoking is dangerous and stupid. So they will be in a state of dissonance: "I smoke, but I'm doing something dangerous and stupid." This discomfort is cognitive dissonance, and Leon Festinger, who developed this theory in the late '50s, described it as being as uncomfortable as being hungry or thirsty. It's so uncomfortable that you're really motivated to reduce it in whatever way you can. Well, if you're a smoker, you need to quit smoking or you need to justify smoking. You need to say, "Well, it's unhealthy, but I'll be thin, and being thin is good. And besides that, Myrtle lived to be 97."

0:09:09 SC: "I look cool."

0:09:10 CT: "And I look really cool." So cognitive dissonance... You know, I don't think I took it terribly seriously. I thought it was really an interesting theory, but I didn't have much appreciation of its depth of application. And then, after George Bush got us into the Iraq War, Elliot and I were sitting around talking about...

0:09:33 SC: Sorry, which George Bush and which Iraq War are we...

0:09:37 CT: Oh, Dubya. Dubya. Yeah, so this was 2003. So several years later, when it was abundantly clear that the justifications for going into Iraq were completely wrong, there were no weapons of mass destruction and so forth. And of course the whole country had noticed that Bush did not say, "Gee, we were wrong. So sorry, we made a really bad mistake about spending a trillion dollars in going into Iraq." Instead, he found other justifications for the war: "Well, we're bringing democracy to the region," and so forth. So, Elliot and I are having this conversation, and he said, "You know, I disagree with those Democrats who thought that Bush was lying to the country. I don't think he was lying to the country. I think he was lying to himself. He was doing what all of us do when we have made a decision to do something: we screen out discrepant, dissonant information that suggests our decision is wrong. We focus, we cherry-pick the information that tells us our decision is the right one, and we go forth. So, we make the decision and then we justify it."

0:10:50 CT: He said, "I think that George Bush had made the decision to go into Iraq and therefore ignored the arguments from his own intelligence community that that might not be a good thing to do." And I said, "My goodness." And from that conversation, because that war was so devastating to our country and so palpably wrong, we thought, "Well, how do we want to analyze this? How do we want to talk about this?" And from that conversation came the idea to do a book that would explore the many domains of our lives in which the need to reduce dissonance gets us into terrible, terrible trouble. Not just politically, but in our relationships, in our understanding of memory, in therapy, in really every aspect of our lives.

0:11:44 SC: Right. Right. Yeah, so one of the things... Maybe we could do an entire podcast on just this, but your book is filled with these wonderful juicy examples of people being in possession of an idea, faced with new information that conflicts with that idea, and coming up with a justification. I have many questions, but let's just give the reward to our podcast listeners by telling them some of the juiciest examples: in love, in the law, in... Like you said, politics already. Doomsday cults is one of the best.

0:12:17 CT: Doomsday cults. Well, doomsday cults... Yes, two filmmakers in England actually made a little film about a doomsday guy who predicted the end of the world. Doomsday is a particularly good story, because it was one of the stories that got Leon Festinger involved with his whole notion of cognitive dissonance, which was this: he got involved with a cult many years ago.

0:12:41 SC: I'm sorry, he was the originator of the idea, right?

0:12:43 CT: Yes, Leon Festinger. He was a social psychologist at Stanford.

0:12:46 SC: So, what years are we talking here?

0:12:48 CT: In the late '50s, late '50s. And Festinger and two colleagues infiltrated a doomsday cult, in which a woman, whom they called Marian Keech, predicted that the world would end on December 21st of that year. And they joined this group, and what they found was that...

0:13:05 SC: So when they joined the group, they were pretending to...

0:13:07 CT: Pretending to be believers. There were several people... There were many people who believed the world was going to come to an end. Now, if you believe the world is going to come to an end, you could do one of two things. You could sit quietly in your house and wait for the world to end, or you could proselytize. You could sell your house and your cow and go off and live with Marian Keech, go be with her on the night of the great event.

0:13:30 SC: Or chocolate immersion therapy.

0:13:33 CT: Chocolate... It wasn't invented yet.

0:13:34 SC: Alright, okay.

0:13:35 CT: What's the matter with you?

0:13:36 SC: If you invent a doomsday cult, I will offer them this...

0:13:39 CT: Fine, it's...

0:13:39 SC: Therapy.

0:13:39 CT: A lot healthier and safer, by the way. The doomsday cult. So they go to sit with her on the night of December 21st. Meanwhile, her husband slept soundly through the whole night, having not believed one single word of this. Fine. At 2:00 AM, there's no sign of any magic spaceship coming to whisk them away and save them. You see, they were going to be saved because they understood that the world was coming to an end. No spaceship, nothing is happening, what to do, what to do. And finally, very late, Mrs. Keech has a vision, and the vision is that the world has been saved because of the devout belief of her little band.

0:14:26 CT: This is a much better way to reduce the dissonance between "I'm a smart person" and "I just made the stupidest mistake I could ever have made; what kind of an idiot am I to have made this prediction?" This allows you to think of yourself as a smart, kind and wonderful person and have your prediction fail. This is the essence of what Elliot Aronson did in advancing the theory of dissonance, by turning it into a theory of self-justification. Two ideas can be dissonant. You like Woody Allen films and your friend doesn't like Woody Allen films, and so you might try to decide your friend is... It's okay to be stupid about Woody Allen films, or whatever it might be.

0:15:11 CT: But the dissonance that most stings is when information about a belief or something that we've done directly causes us to question our self-concept, something we feel is really important about ourselves. If you criticize something about me that I don't consider important or I don't think I'm good at, well, so what?

0:15:36 SC: You can deal with that.

0:15:37 CT: But if you show me how I have not been skeptical about something I really should have been skeptical about, I have let gullibility take over, I'm going to be really embarrassed. And I'm going to try to reduce that dissonance, either by defending myself or by dismissing the evidence that I've been a gullible fool. So that's why it's so important to understand how dissonance works. Because when the self-concept is threatened, I'm a good, worthy, smart person, and now you're telling me I've done something bad, cruel, harmful and wrong, well, piss off and take your information with you.

0:16:15 CT: To me, one of the most important findings of this theory is that it explains why people who are good, compassionate, caring people will often find themselves continuing practices that are wrong, harmful and outdated, precisely because they see themselves as good, kind, compassionate people. I'm a doctor; you're telling me that this thing I've been doing in my practice for 20 years is harming my patients? The hell it is. I'm a good, competent doctor so I'm going to keep doing this.

0:16:51 SC: And you give examples, for example, in the legal system, of people being found guilty by prosecutors, but then obviously exculpatory evidence comes along. Clearly, mistakes have been made, and yet the people who did the prosecutions or the detectives who arrested the person in the first place don't see it that way.

0:17:11 CT: They do not, because they're the good guys. By definition, prosecutors and police are the good guys. We're the ones in the white hats and those bad people are bad, "And now you're telling me that I, a good person, put another good person in prison? Well, that's not even possible. Therefore, okay, if he didn't commit this particular crime, he certainly committed other crimes. And so I'm not doing anything wrong anyway." That truly is their reasoning, the inability to... Of course, there are many venal reasons that people might not admit mistakes. That's not what we're talking about. People will fail to admit a mistake because they don't want to be fired, they don't want to be fined, they don't want to go to prison. Those are obvious reasons that people lie. Dissonance explains why we lie to ourselves, that's why it's so pernicious, and so dangerous, and so important to understand.

0:18:09 SC: And so, we've done politics, and the law, and doomsday cults, love lives, marriages.

0:18:18 CT: Oh, love lives.

0:18:18 SC: Has it ever happened that we are dissonant in a personal...

0:18:20 CT: Oh, I want to say... No, of course not. Okay. One more thing about the law, though, because this is so important: throughout the legal system and in the military, once people believe that they are interrogating a guilty person, nothing that person does or says will disconfirm their belief. This is true of the psychotherapists and social workers who believed that they could tell if a child had been sexually abused: if the child admitted it, well, the child had been abused, but if the child denied it, well, the child was in denial. So there was no way... If you are convinced that you know what has really happened in a situation with a child or a suspect or a convict or anybody, everything that person does will confirm your belief, to make your belief consonant with what you're doing. The way that many police officers and even trained detectives are trained in interviewing is this: if a suspect seems nervous, that's a sign of guilt. What if it's a sign of nervousness because you've just been arrested and you're innocent?

0:19:36 SC: Yeah. There might be alternative explanations.

0:19:38 CT: Exactly, which they are not trained to think of, that's right.

0:19:40 SC: And that parallels what you said about George W. and the Iraq War. But clearly it's not just high-stakes courtroom dramas; in our everyday lives, in our personal lives and marriages, this is going to be something where our partner accuses us of a certain kind of behavior, maybe there's a twinge of truth to that accusation, and we're able to justify it somehow.

0:20:03 CT: Yes, we are. Aren't we? Yes. [laughter] Yes, indeed. Well, Elliot puts it this way, he said, "Here's how it works," he said. "You fall in love and the person is the most wonderful person. This person is just adorable. Everything about this person is a lovable thing. You are in a state of blissful consonance because everything this person does conforms to what you think the ideal partner should be. Then a tiny blip occurs on the landscape of love." Wait, did I just mix a metaphor?

0:20:34 SC: That's a pretty... Yeah, blips on landscape, that's okay. Go with it.

0:20:39 CT: Crabgrass in the lawn of life, whatever it might be. And your beloved turns out to have a habit that you really don't like so much. You like getting to places 20 minutes early, and your beloved likes to get there two days late, right? Okay. Now, at first what people do is just dismiss this dissonant information about the perfection of the beloved and say it's really not important. And most of the time it's not, but what happens next is going to be key for how couples maintain their relationship, in the sense that all of us do things that the other person is annoyed by or doesn't agree with, and then the question is whether you come to see those things as behaviors that are too dissonant to live with or whether you reduce dissonance by saying, "You know what? This person is so wonderful in so many other ways that this particular problem is trivial and I can live with this."

0:21:46 CT: But those are the negotiations we do all through our lives. You can see what happens when people have made the decision to divorce: suddenly they have a little memory surgery and immediately forget everything they ever once loved about the person, and now everything about the person is demonic, terrible and wrong. And, "I always knew, forever, that this person was the worst possible person." Well, no, you didn't.

0:22:14 SC: A switch flips, right?

0:22:15 CT: A switch flips and the switch flips because now you are seeking to be consonant in your view that this person is too damn terrible to live with any further. And so, everything then gets rewritten to conform to the new story.

0:22:30 SC: In my book, The Big Picture, I talk about similar ideas because I'm trying to talk about some kind of coherence in one's view of the world. So, I introduced the idea of a planet of belief, that you have a set of beliefs that are kind of like rocks or asteroids, that come together under a mutual gravitational pull because they seem to fit together and you have... You develop a coherent view of how to look at the world and this is your planet of belief. And these planets develop their defense mechanisms, right? If something comes along, a stray rock from outer space, that threatens to undermine the integrity of the planet, you try to blast it out of space, right? You don't try to fit it in, necessarily. And it's very interesting why...

0:23:11 CT: Cosmological dissonance here, that's what this is, yes.

0:23:12 SC: I think so. Yes, cosmological dissonance, right. It's very interesting, you might say, given these examples, "Why don't people just change their minds?" Right?

0:23:20 CT: Right.

0:23:22 SC: Is it just a matter of it's too much work? Like, "Why don't they get a new world view?" Is it all about status and their self-image, or does it go deeper than that?

0:23:31 CT: Well, it's both of those things. It depends what the world view is about in particular, and how important it is to you, whether this is really a world view that governs how you see the world and how you behave in that world. And we all have our narratives, as the trendy word now is, our stories that govern our lives. As George Gerbner once said, "We're the only species that tells stories and lives by the stories we tell." That's really the key thing. So for example, we can see this with memories; everybody comes up with the story of their lives: I was raised this way, this is the kind of person I am, this is what happened to me that has made me what I am today. That story is often a very powerful one, because it explains to us who we are. So now I come along and say to you, "You know what, that memory you have of what your father did to you when you were seven? Turns out it didn't happen that way at all." And it's jarring, it's jarring to learn that a memory you have that seems to fit the story you tell about your life isn't so. That's why we get so upset to learn that our memories are fallible and wrong.

0:24:50 CT: So as I said earlier, in Elliot's view, the dissonance that hurts the most is the dissonance that most blows up the planet, if you will, that most questions something we've lived by, whether it's religion or atheism or a life story, a belief about why we are the way we are. And moreover, where it's not just changing one idea, but where changing one idea is going to pull out a thread that's going to unravel the whole thing.

0:25:27 SC: So these ideas of cognitive dissonance, it's slightly different, but clearly related to the cognitive biases that we have, the little bugs in our reasoning capacities that motivate us to think one way or the other: motivated reasoning... my brain is not giving me any cognitive biases off the top of my head... confirmation bias, things like that.

0:25:47 CT: Well, that's the key. Well, cognitive dissonance is based on... And I want to say that this is a theory of mental functioning that has more than 2,000 experiments supporting it in every aspect of social and cognitive psychology; it has a lot of evidence underlying it. And one reason is that the component cognitive biases really are subsumed under it, in a way. That is, cognitive dissonance rests on a number of particular biases. One is the confirmation bias, which we might call the consonance bias: I'm going to look for information that confirms what I believe and will keep me in a state of consonance, and I'm going to dismiss and throw out and tell you you're a jerk if you give me evidence that's dissonant, right. So the confirmation bias is key to how we keep our thoughts consonant. Then there's the bias that we're not biased; that is a very good bias. I really love that bias, Lee Ross's bias.

0:26:48 SC: I don't have that bias. I'm completely unbiased about that.

0:26:52 CT: You're completely unbiased about that. I think it is the cutest bias ever, but you see, think how it plays out. You and I are having an argument. I am an unbiased person. Therefore, if I can just sit here and tell you why you're wrong and I'm right, you will understand that you are wrong and I am right and that will be fine because you are biased and I'm not, and we'll have a lovely conversation.

0:27:11 SC: If only I understood that, progress would be made.

0:27:13 CT: There you go, and the fact that you do not accept what I have told you and instead have your own beliefs, oh, unthinkable; that means you're biased and I'm not. I mean, it's a central filter of how we get through our days. And it is the reason that quarrels and arguments not only continue but escalate, because the determination to show that other person why they're wrong and you're not continues. So many of the information biases that govern us, in a way, can be subsumed under this overarching need to be consonant, to keep our beliefs uniform and motivating. Motivated reasoning is exactly the more recent term for this. As Elliot says, every few years somebody renames dissonance theory something else so that they can get their publication and show why it's not cognitive dissonance at all, it's their... Well, that's the same thing.

0:28:22 SC: So in some sense, we'd rather be coherent than right; that's how our brains work anyway. You wouldn't say that out loud, but sort of under the hood, this is the machinery working to make everything smooth out, to remove the conflicts wherever we can.

0:28:36 CT: Yes, and we would add that that is, evolutionarily speaking, a very adaptive strategy. You don't want to keep changing your mind four times before breakfast; you don't want to sit there saying, "Hmm, should I brush my teeth?" Although then you read a study of flossing, and you think... I'm just being silly, but if we had to change our minds with everything that came along every day, we wouldn't get anything done. And so we choose what kind of package of beliefs we want to hold and, for the most part, spend our energy following those rather than investigating every new thing that might cause us to change.

0:29:18 SC: Presumably, there is an evolutionary pressure to be correct about the world also, but it's clearly a trade-off, right? Somehow, whether it's... There might be some physics explanation for this, I'm not sure. Some Bayesian reasoning, some efficiency, some thermodynamics explanation. But coherence, like you said, somehow the coherence of the world view has been so important as we've gone through the generations that it is almost paramount when we try to make sense of the world.

0:29:45 CT: Well, that's right, and of course, so many of the things that evolved to suit us when we lived in tribes, little bands, were effective ways of keeping us rooted to and connected to our tribe mates. And of course, the corollary of that is that anyone who was not in our tribe was an enemy who had to be stomped out immediately. And their beliefs were definitely wrong.

0:30:13 SC: So happy we evolved past that.

0:30:15 CT: Yes, indeed. So, those were adaptive strategies for thinking and behaving, but they don't always serve us well today.

0:30:24 SC: And you've pointed out that there's almost an inverse correlation between our ability to deal with these cognitive biases and how smart somebody is, their intelligence or their level of self-esteem. The better you are at reasoning, the better it turns out you are at reconciling these dissonances in a way that doesn't change your mind, in some sense.

0:30:43 CT: Oh, not reconciling dissonances, better at justifying.

0:30:46 SC: Justifying.

0:30:47 CT: Justifying yourself. Yeah, yeah, exactly. It's your most skeptical skeptics.

0:30:53 SC: The firmest entrenchment.

0:30:56 CT: Who, "Don't tell me I've been gullible." Or the people who think of themselves as being most competent in a particular sphere, who are going to resist being told they put an innocent guy in prison, or they've just killed a patient, and so forth. Of course, I want to make it clear there are many, many exceptions, people who really want to improve how they do their jobs and how they live in the world, but they are generally not the majority.

0:31:24 CT: I would like to mention, because I find so many people have found this metaphor to be really helpful in their lives, so I'll mention it now. We call it in the book "the pyramid of choice." And as I say, it's been very touching to me to find... People have written and said, "You know, this really works for me." It started because somebody did a study a long time ago of schoolchildren, and the question was, "What's your view of cheating?"

0:32:01 CT: So imagine a pyramid, or a triangle, if we're two-dimensional. You're at the top of this pyramid and your views... You and your pal have the same views of cheating. You don't think it's a great thing to do, but it's not the worst crime in the world. Okay. You're neutral about cheating. Now, you're taking a test, it's really an important test, it's the final exam. Your grade is going to depend on this exam. We all know how students are, "If I don't pass this, everything will go south. My cat will leave me."

0:32:33 SC: I've seen this in action.

0:32:34 CT: You've seen this in action. "My world will end." But you freeze on a crucial question, you do not know how to answer this question, your grade is going to... You're going to fail, this is horrible. And suddenly, the student next to you makes her answer abundantly clear. And so you have it in one second: cheat or don't cheat. So in this study, some of the kids were given a situation in which they were able to cheat, and some not, and so forth. And here's what they found: The second you make that decision, even impulsively, some cheat and some don't. You cheat because the grade is really important; you don't cheat because your integrity is more important.

0:33:09 CT: Now, the minute you make that decision, you must make your behavior and your belief about cheating consonant. So if you cheat, your attitude will now shift to be, "Cheating is really not such a bad thing. Everybody cheats, it's not important, and I'll never cheat again." If you resist the impulse to cheat, you say, "Wait a minute, cheating is not a victimless crime. We all suffer from it, it's a terrible thing, cheaters are bad."

0:33:37 CT: So, in our metaphor of this pyramid, you make that first step down the pyramid. You cheat or you don't cheat. Over time, what happens is you become more and more entrenched in your view of cheating as a good thing or a bad thing, until over time, you are both at the bottom of this pyramid standing far apart from each other in your views about cheating. The one who cheated will think it's completely trivial and not important at all, and the one who resisted will want to hang you by your ankles outside the school door.

0:34:08 SC: Having started with almost the same belief system.

0:34:11 CT: Having started... Exactly. And what's really interesting about this, when you visualize it, is that you can see how difficult it is to go back up and say, "Wait a minute, I was wrong to cheat." And that's why... There are many metaphors for this, the slippery slope and so forth. But by seeing it in a dissonance way, in a self-justification way, you can see how hard it becomes to go back and say, "You know, that little jump, that little random thing I did, has led me far from what my original views of cheating were." So that metaphor applies to so many decisions that we make in our everyday lives, starting from an impulsive decision. Somebody's accused of rape, or accused of some other horrible act, and what do we do as citizens reading the newspaper or the... Did I just say reading the newspaper? Please forgive me, I'm so sorry.

0:35:09 SC: Your age is showing, I didn't want to say anything.

0:35:12 CT: Oh, God, don't want to say anything, but... Right.

0:35:13 SC: Reading the website.

0:35:18 CT: What is a newspaper? Okay. But we all...

0:35:19 SC: You mean, listening to the podcast.

0:35:20 CT: Listening to the podcast. We all make impulsive decisions: that person's guilty or that person's innocent. And then we don't want to hear, we don't want to wait for, God forbid, evidence suggesting that we're right or wrong. And so we see this in today's world, where everybody instantly jumps to an opinion about everything, without necessarily waiting for evidence or further information. And that's how quickly you can get enmeshed in justifying that initial jump off the pyramid. And when you turn out to be wrong, not so easy to say so.

0:35:53 SC: I love the Pyramid of Choice. I think I'm in the crowd who, once you explained it to me, was like, "Oh, my goodness, that just explains so many things."

0:36:01 CT: The faculty at Duke University, when that lacrosse team was accused of raping the stripper that they'd hired, and so forth... 80 members of the faculty took out a full-page ad saying the rape culture of our athletic teams is disgraceful, duh-duh-duh-duh-duh. In other words, assuming that the allegations against the team were accurate. And when they turned out not to be, and when that...

0:36:28 SC: They were false, yeah.

0:36:29 CT: Yeah, they were false, and when the district attorney was disbarred for not sharing appropriate information with the defense, was there a letter from the faculty saying, "Gee, we made a mistake, we shouldn't have been so quick to judgment, maybe we should have waited for the evidence"?

0:36:45 SC: There haven't been that many letters like that in human history.

0:36:48 CT: Not so many, no.

0:36:49 SC: This is related to what I'm sure is a cognitive bias. I've not seen it talked about in psychology or whatever, but when dealing with probabilities, human beings are very bad at estimating probabilities, but they're also very bad at accepting that probabilities go from zero to 100%. I think that when faced with something that might happen, and you're not sure it's going to happen, it seems to me people will only really deeply accept three possibilities: that the chance of this thing happening is 0%, it's not going to happen; that it's 100%, it's definitely going to happen; or that it's 50%, we have no idea whether it will happen or not.

0:37:29 SC: The idea that it's 70%, and that sometimes the 30% thing happens, is hard to accept. I think that Nate Silver got a lot of flak after Donald Trump got elected, because he gave something like a 70% chance that Hillary Clinton would get elected, and afterward he was like, "That wasn't wrong, 30% things happen all the time." But we have this inability, I think, to correctly judge probabilities that are not 50-50 or one or zero. Does that make sense?

0:38:00 CT: Absolutely. Well, it's completely true, and of course, as we know, when people make decisions, medical decisions: Do I want this treatment? It has a 10% chance of success and a 90% chance of failure. I mean, we are asked to make medical decisions often on the basis of likelihoods and risks. And it's very hard very often to make those decisions, or we look at the chance of something happening rather than the chances of it not happening.

0:38:36 SC: Yep. [chuckle]

0:38:36 CT: So, that's a difficult skill to acquire, but truly an important one.

0:38:41 SC: A lot of poker players have become very wealthy off of people's inability to correctly judge these probabilities. And probably also... The Pyramid of Choice also feeds into the idea of not only how we justify our own beliefs, but how we view others. This goes back to the couples divorcing. You were in love with this person at one point; now you decide you don't want to be with them anymore, and they're now Satan. And talking to other people about them afterward must lead those people to say, "Well, why were you ever in love with this person?" But you change how you view the other person because you're not on their side anymore.

0:39:17 CT: That's right. Well, what you say is... We have this guy in our book, a friend of Elliot's, who, having left his wife of 37 years, said to her, "I never really loved you." Well, really? No, for the first 35 years, he did, right?

0:39:35 SC: So I just found online, this thing has just been going around, how Democrats and Republicans view each other. So, not how they view themselves, but how they think... Actually, I think the study did ask how they viewed themselves, but who cares about that? Those are more or less accurate. But they asked questions; they asked Republicans what percentage of Democrats are agnostics or atheists. The correct answer is about 9%, and the answer given by Republicans was 36%.

0:40:02 CT: Oh, 36%?

0:40:02 SC: 36%, yeah. There's some reasonability there. The one I liked was how many Democrats are gay or lesbian, [laughter] and the answer was 38%; the average Republican thinks over a third of Democrats are gay or lesbian, and the correct answer is closer to 6%. And the same thing for blacks and union members. But to be very, very honest, the Republicans' bad ideas about Democrats were less crazy than Democrats' bad ideas about Republicans. So, how many Republicans are 65 or older? Democrats thought 44% of Republicans are over the age of 65, which, with even a small amount of thinking, [laughter] you'd realize the numbers probably don't really work out that way. It's closer to 20%.

0:40:49 SC: And my favorite: they also thought Republicans tended to be evangelical Southerners, but they also thought Republicans earned over $250,000 a year. The average Democrat thinks that 44% of Republicans earn more than $250,000 a year, and the real number is 2%, not 44%. It's partly misinformation, but it has to be partly the cognitive dissonance story: we tell a caricaturing story of the people we disagree with that makes us look better. You're not sure?

0:41:30 CT: Yes. Well, yes. I mean, you've packed a lot into that. Those answers stem from stereotypes that we have of another group, a group that we don't know much about. I remember reading a story about some New Yorker who had moved to Idaho, and everybody in Idaho thought he was... Well, you can imagine what they thought of him, sort of a crazy liberal, Jewish, communist, pinko, gay guy, I don't know. But the people in Idaho had no experience with New Yorkers; they knew nothing about New Yorkers. And he goes to Idaho with plenty of assumptions and expectations about people in Idaho, they're all gun-toting idiots or whatever the hell he thought they were. But what happens when you have people living in the same community who have different opinions is that you see diversity in attitudes and behavior amongst them. My good friend and co-author Carole Wade lives in Northern California in a very gun-toting conservative area, and she said, "We disagree politically, but we see each other as the real human beings we are. I see Republicans I disagree with, but they are caring members of their community, they are active in feeding the poor and the hungry, and they would stop by the side of the road if you were in any kind of trouble to help you."

0:42:57 CT: And so, that's not stereotyped thinking. Asking questions like this of Democrats and Republicans is asking about a stereotype, something about that group that you don't like: they're rich, they're gay, they're crazy, whatever it might be.

0:43:12 SC: I was just amused that Democrats thought that so many Republicans were at once really rich and also evangelical Southerners. Not quite sure how that all fits together.

0:43:20 CT: Logic and coherence do not enter into stereotyping, you should know that by now.

0:43:24 SC: But yeah, and I think that maybe, does this help also explain... I don't even want to open the Pandora's box, but social media and how vitriolic things are and how we respond viscerally and badly to the slightest amount of disagreement online because we don't appreciate the humanity of the people behind the words. It's not a face-to-face conversation. All you're seeing is this one little snippet of belief or opinion and you can react instantly and badly to it in a kind of safe way.

0:43:57 CT: Nobody counts to 10, let alone 150. The first book I wrote, Anger: The Misunderstood Emotion, a very long time ago, was an examination in part of the catharsis hypothesis, the idea that it's really good and healthy to express your anger when you feel angry, and it will reduce your heart rate, and it'll prevent you from getting ulcers and all kinds of other good things. Wrong. This was my first experience with explaining kindly to people why these clinical assumptions and Freudian notions about catharsis not only were wrong, but harmful, because the more people express anger, the angrier they get, generally speaking. And it doesn't calm you down, it riles you up and so forth. And in that book...

0:44:40 SC: And everyone instantly agreed because you had evidence?

0:44:43 CT: Yes, I had wonderfully interesting evidence and, of course, as you can see, thanks to the diminution of anger in our society that everybody...

0:44:51 SC: That was your fault, thanks. Good.

0:44:53 CT: So, no, they told me what I could do with my ideas about catharsis. One guy wrote to me and said, "When my wife dents the car, I want to yell at her. I want to yell at her. I want her to feel even worse than she does."

0:45:09 SC: Therefore...

0:45:11 CT: Therefore... Really?

0:45:11 SC: Obviously, the right strategy is to yell at somebody.

0:45:13 CT: Oh, please. So, one researcher had identified the conditions under which the expression of anger is likely to be cathartic, that is, to make you feel good. The conditions? Okay. The first is no retaliation from the target. So you say to your sweetie...

0:45:35 SC: I agree with that one.

0:45:36 CT: "You are a toad." And the target says, "You're right. I really am a toad." So if there's no retaliation, that's good. Second, you are anonymous, so the target can't respond to you. Third, your anger is proportional to the offense. So if somebody does something trivial and you hit them with a howitzer, you're going to be embarrassed; it's sort of overkill. But if you feel your response is the appropriate, righteous thing, you're going to feel good.

0:46:12 CT: So, that's the gist of these conditions, all of which are met by the internet. You get to just rant immediately. You get to zap somebody, you get to be self-righteously satisfied, "I showed that person what's what." And once again, people just failed to take my advice about this. I remember writing a letter to a columnist, Russell Baker, who was a funny guy, and he wrote something that I disliked about the women's movement. I don't even remember what it was. And I wrote him an angry letter. I was terrific, let me tell you. I was funny and smart and stupid, because you know how you feel when you're writing something in anger.

0:46:56 CT: So I sent him this letter, and he sent it back to me edited. He had written in the margins and commented, and basically, of course, I get this thing back and I realize how really rude I'd been and how I was not going to persuade him of anything with my rudeness. And so then I replied, "I'm sorry, this is not what I meant to say. Let's try again." I sent him that one, and he wrote back and said, "I think this could be the beginning of a lovely and civilized correspondence." It was a great lesson for me because with anger, there are two reasons to be enraged: one is you want to punish the other person, and the other is you'd like to change the other person or fix the problem between you. And a lot of people confuse those motives, I think.

0:47:45 SC: And this brings us to the question of, what could we learn from cognitive dissonance theory about how to actually change people's minds, if that were to be our goal? I think as you're implying, sometimes that's not our goal. Sometimes we just want to let loose a little bit, right? And maybe we should squelch that impulse, but sometimes we want to reach people, knowing that people are defensive about their existing views, and knowing that they want to justify themselves. Does that give us a clue as to how to be better persuaders?

0:48:12 CT: It gives us a very important clue, one that's not always successful. Of course, we can do better with our own views, perhaps, than trying to change the other guys. But the rule, the guiding rule, number one, is, do not make the other person feel stupid for what they believe. And this is, of course, what people do. "What's the matter with you? What were you thinking when you did that? How can you spend $5,000 on magazine subscriptions? You know what? Are you an idiot?" Well, what is the person going to say? You have now exquisitely put them in a terrible state of dissonance between, "I'm a smart person, I spent this money on magazine subscriptions because I'm hoping I'll win the lottery and my children will have money forever. My motive was a good one."

0:48:56 CT: So, when you make people feel stupid for believing what they believe, you guarantee that they will hold on to that belief even more strongly, you've just guaranteed it. So, instead, you look to understand or talk to the person about why they believe what they do, what that belief serves for them, how it helps them, whether they themselves see any problems with it, whether they see a better way of getting what it is they want, so that you don't make it an adversarial I'm right and you're wrong, because that one's going to go nowhere. For me...

0:49:31 SC: Anthony Penn, who is another podcast guest, talks about giving people a soft landing. He talks about why certain communities have the beliefs that they do, and he says, "Well, all you're telling them is that they're idiots, and that they're irrational and they don't understand reason and evidence. And no one has ever been really persuaded by that."

0:49:49 CT: That's exactly right. He's put it exactly in dissonance terms: if you're going to make me feel stupid, I'm out of here. And by the way, it may be that the accuser, we, the smart questioner, really don't understand what that belief is serving for the other person, what they gain by believing as they do, how it comforts them, how they've lived by it, how much investment of time and emotional energy they've put into that belief. One of Elliot's classic experiments is "The Effect of Severity of Initiation on Liking for a Group." It's a famous, funny study, but basically what he found is that the more work and pain and effort you put into joining a group, the more you're going to like that group, even if it turns out to be a really stupid and uninteresting group. So, all of us have beliefs that we have spent a lot of time, and sometimes pain as well, investing in.

0:50:50 CT: I think for ourselves, in understanding dissonance, the story we tell in our book, which I just love because it's such a good guideline for us all, is... This is a true story. When Ronald Reagan went to Bitburg, Germany to lay a wreath at the cemetery there, where it turned out that 47 Nazi officers had been buried, there was an enormous protest over this. He's going to this cemetery and honoring these Nazi war dead? Reagan's good friend Shimon Peres, then Prime Minister of Israel, was of course particularly outraged, as were many others around the world, but Reagan didn't back down.

0:51:33 CT: So, someone asked Peres, "How do you feel about the fact that your good friend, Ronald Reagan, went to Bitburg to do this ceremony?" And Peres said, "When a friend makes a mistake, the friend remains a friend and the mistake remains a mistake." That is the goal in how we ought to think about cognitive dissonance, because the usual impulse, when a friend has made a mistake, is to drop the friend or minimize the mistake. You can see this in news reports all the time. Somebody we really think of as a fabulous person has been convicted of some crime or accused of something, and people don't know which way to go.

0:52:25 SC: I know that guy, he's been so sweet. Doesn't sound like him at all.

0:52:29 CT: Exactly. Or the mistake: "Please, what he did? Oh, it's just trivial."

0:52:35 SC: It's a misunderstanding.

0:52:38 CT: 43 children, really?

0:52:41 CT: So, that's the usual thing that we do, and we usually do it unconsciously. We make those reductions of dissonance under the radar to allow ourselves to feel comfortable, because living with dissonance is painful. I want to go back to that point: it's really uncomfortable. Sarah Silverman has a fantastic three-minute video that she did when her good close friend Louis CK was discovered to have been committing these awful sexual acts, and her video is a perfect statement of a person in dissonance. She says, "What he did was wrong. It disgusts me, and I hate it, but he's been my very good friend for many years. I know what a great humanitarian he is, what a great father he is, what a great friend he's been." So, she's able to separate these two things, and she said, "I don't know how I'm going to resolve this." But that's the struggle that's worth having, rather than throwing him under the bus or minimizing his actions. It's harder, it's harder.

0:53:49 SC: It seems to me like empathy is important here. I know Paul Bloom, the psychologist, who I'm hoping to get as a future podcast guest, wrote a book. He had a brilliant idea: let's write a book called Against Empathy; empathy is bad. And very, very roughly speaking, his idea is that having empathy tends to be something that we do for people who are like us, and therefore too much empathy pushes us away from the direction of reason and rationality and thoughtfulness, because it just makes us instantly identify with people like us. But it sounds like if we're really trying to understand people who seem to be doing something we don't understand ourselves, putting ourselves in their place, understanding where they came from, like you said, understanding the reasons why those beliefs are so valuable to them, would be useful to us.

0:54:37 CT: Well, it is... The question is the consequences of understanding. You know the French say, "To understand all is to forgive all"? No, it isn't.

0:54:46 SC: No.

0:54:47 CT: To understand all is to understand all, period. That's all it is. And then once you understand, then you can decide: "Do I want to forgive? Is this important to forgive? Does my understanding give me greater compassion for why this person has behaved as they have, and make me more forgiving of it, or not?" That's a separate question, and that's a moral issue, a moral choice for each of us. But we all live by the attributions we make of other people's behavior. This is another big part of social psychology: how the explanations we make of why people do what they do affect everything. So if you think your beloved has been rude to you or thoughtless because your beloved has had a horrible day at work and, in Los Angeles, a 19-hour commute home, okay, you'll be more forgiving than if your attribution is to their personality: they're just the way they are.

0:55:52 SC: And even talking to people in public, if we want to say, "This person disagrees with me, is it because they have a principled objection to what I'm saying, or just because they have bad motives and are not acting in good faith and clearly don't understand the force of my reasoning?"

0:56:06 CT: Exactly. Well, it's why the goal really of what it means to be open-minded is to make ourselves open to other people's points of view, without the... The dissonance reduction reaction is, "Get that other point of view out of here. I don't want it in the room." But when you can, if you have to make important decisions, if you can surround yourself with people who aren't yea-sayers, who aren't just going to agree with you, but who might show you where you might be wrong, that's extremely useful.

0:56:39 SC: And then we have the idea of persuading others, we also have the idea of fixing ourselves, right? I've been in the crowd when you've given talks to audiences, at TAM, for example, and it's amazing, you explain how cognitive dissonance makes people justify their mistakes and things like that, and everyone in the audience is going, "Yes, the people I disagree with do justify the mistakes that way." Right?

0:57:04 CT: Yeah, they do.

0:57:04 SC: And almost no one says, "Maybe I am justifying my mistakes." Like, what... Is there a toolkit or a checklist or some set of strategies we can use to notice when we're doing this, when we're doing this self-justifying behavior?

0:57:19 CT: Sure, you notice that you're justifying your behavior, it's a really simple tool...

0:57:23 SC: So awareness? I mean...

0:57:25 CT: Well, that's...

0:57:26 SC: But we're so good, right, at coming up with the excuse. So when is it an excuse, and when is it just a correct rational reconstruction of why we're thinking in such a way?

0:57:34 CT: Oh, well, that's a very fine question. One of the things that's been interesting in reading the Amazon reviews of our book is how many people say, "Well, I started out reading this book and I was, 'This explains all my friends.' Then halfway through, 'Oh, this explains the government too.' And then at the end it's, 'Oh, it explains me.'"

0:57:54 CT: I see science in general as a form of arrogance control, in the sense that it's one of the most organized methods we have of forcing us to put our beliefs to the test and forcing us to face dissonance if the test does not confirm what we believe. That's the method we have set up to test our beliefs. And you don't have to do a scientific experiment to do that, to take some belief that you hold and say, "Well, really, what is the evidence for this? Is the evidence changing what I once thought to be true?" I look back on my writing in social psychology and I'm... It's interesting to me how many ideas have changed, irritatingly, because the evidence has changed or has become more persuasive.

0:58:54 SC: I do, I personally agree that acting as an informal scientist is a good way to correct yourself against some of these biases. Like you say, much of the scientific method, informal and crazy and non-systematic as it is, is about heading off the possibility that we will fool ourselves one way or the other. I'm always very amused, as a cosmologist, as a physicist, that if I say something about the Big Bang or Einstein or relativity, there's some fraction of people out there in the world who will say, "Oh, you're just defending the establishment views of the cosmos." And I want to say, "All of my incentives are in the other direction. You don't become a famous physicist by saying that Einstein was right."

0:59:38 SC: Everyone knows Einstein was right. The whole point, the whole incentive structure in science is to say, "The great people of the past were wrong and we can do better than them." And that's a wonderful sort of self-corrective to try to do. Now, on the other hand, at the same time, I realize I'm telling a flattering story about myself, because I'm saying that scientists are the best at correcting their false impressions, 'cause they have their scientific training in them.

1:00:02 CT: Well, no, scientists are human like everybody else, I think. Is that true? I mean, aren't you guys pretty...

1:00:08 SC: Very flawed human beings, most of us.

1:00:09 CT: Very flawed. [laughter]

1:00:10 SC: There's a couple of outliers, but yes.

1:00:12 CT: Well, so even scientists are not always happy that the results... When results don't come out the way they would like them to. I mean, we all... They'll say, "Well, that... I guess I didn't do that study well enough. I'd better redo that another 45 times 'til it comes out right." The ideal scientist, we might all aspire to, but everybody knows the feeling of thinking, "How can I re-do this so it will come out the way I want it to come out?"

1:00:46 SC: Yeah, and certainly empirically, there's a million examples of people doing exactly that, right?

1:00:49 CT: Oh, yeah, yeah. Yeah. But indeed, science is self-correcting, or at least it moves forward in certain lurching ways.

1:00:58 SC: Fits and starts.

1:01:00 CT: In fits and starts. And you're right, people do have an incentive to find something new and different to say. In my field, it's always so interesting to me that all the studies that show similarities between women and men, in personality traits or abilities and skills and stuff, those never get published. It's the differences that are sexy. We want to know what differences there are, and the news that there aren't differences isn't news, and it isn't interesting. But the goal is to feel free enough to investigate questions wherever they might lead. And one of the problems, of course, we're facing in our polarized society is that it's getting harder for people to do that. To question received wisdom, certainly in the social sciences. Oh, my God, to ask provocative questions, to be a nay-sayer, to...

1:01:52 SC: Right. You mentioned the brain. We've been talking about...

1:01:55 CT: Did I mention the brain? Oh.

1:01:55 SC: I think you mentioned the brain, I think you just did. Maybe I projected it onto you, I don't know. We've been talking in psychological terms, of course. There's a sexy thing going on right now, which is the neuroscientific way of talking about these things: actually looking in the brain, not just talking to the person, but doing fMRI or MEG studies of what's going on inside their head. Do these concepts of dissonance and justification map onto parts of the brain? Is that something that we aspire to, or that we can say now, or is it just crazy?

1:02:26 CT: Yes and no. Yes, a number of researchers have tracked cognitive dissonance into the brain. They found out which parts of the brain are...

1:02:36 SC: Lighting up?

1:02:37 CT: Lighting up. I love that language. Lighting up. It's an artificial light up, but okay. When the brain is in a state of dissonance. One of the cutest experiments was to have Democrats and Republicans process information about their favorite political candidate; positive information about their favorite candidate, positive information about the opponent. Oh, that's dissonant.

1:02:57 SC: That's bad.

1:03:00 CT: Bad information about your guy. And when they were in a state of dissonance, "My guy did something bad, the other guy did something good," you could see it in their brains. And they said that when consonance was restored, it was like twirling a mental kaleidoscope to get the pieces to fit right. People really were happy when the pieces fit right. That said, I don't know what we've learned from this that we didn't know otherwise. And I'm very interested in brain research, of course, but I've also written about pseudo-neuroscience, which is the notion that if we just have a brain in there, we're really doing science. And very often a lot of those studies are just, somebody's got a scanner, and "I'm going to do what I can with it."

1:03:47 SC: I love the term "neuro-bollocks" for...

1:03:52 CT: There you go.

1:03:53 SC: The temptation to find things going on in the fMRI image.

1:03:56 CT: Absolutely.

1:03:57 SC: And saying, "Oh, yes, there it is."

1:03:57 CT: There it is.

1:03:58 SC: "That's the part of the brain that likes rutabagas, right there."

1:03:58 CT: And what do you know now that you didn't know before?

1:04:01 SC: Now, of course, neuroscience is very important, but it's also hard to do right. And maybe it's a little bit easy to overclaim what you're getting.

1:04:08 CT: Oh, yes, of course it is. Of course it is. And it's really like all new technologies. Give a boy a hammer and everything needs pounding. Give researchers a scanner and everything needs scanning, but what am I looking for?

1:04:20 SC: And it's very early days. We don't... We're pre-Galileo, we're trying to collect data, understand the... The brain is way more complicated than the solar system ever was.

1:04:28 CT: Well, there you are. Absolutely.

1:04:29 SC: But you also mentioned, or talked about, the idea of memories and how memories aren't always accurate. And this is something that I first came to from neuroscience, from studies. Actually, from when I was studying the arrow of time, the difference between the past and the future. And there were these wonderful studies, and I don't pretend to be able to judge the scientific credibility of various neuroscience studies, maybe it is neuro-bollocks, but here's what the studies said: if you put someone in an fMRI machine, so you're basically taking pictures of where the blood is zooming around in their brain, and you ask them to imagine a past experience, remember what it was like at your birthday last year, where you were having dinner, whatever, certain parts of the brain light up in excitement, 'cause you're turning that memory around, remembering the scene. And then you ask them to imagine a future scene: imagine what your next birthday will be like, and you're somewhere else, and so forth. The claim was that it's exactly the same parts of the brain that are doing the same thing.

1:05:33 SC: Of course, they're doing something different, but evolution is sort of cheap. Evolution reuses things that are already there. Maybe the hypothesis goes, the way that we learned to imagine the future is by taking advantage of the fact that we had a part of the brain that could remember the past, and just putting it to use again. And the idea being that the memory of the past is not a videotape, it's not a recording that is stored as a JPEG file or an MPEG file somewhere. It's more like a screenplay. And when we remember the past, we download the screenplay and put the play on again, just like we do when we remember the future. When we imagine the future.

1:06:13 CT: No, no, it's more like...

1:06:15 SC: No?

1:06:15 CT: It's more like frames of a film, where you have a couple of frames and you are interpolating the connecting elements in the filmstrip. I didn't mean to interrupt you, but I would say something a little different, which is that when you're asking them to remember a past birthday, a childhood one, or to imagine a future one, you're imagining both of those things.

1:06:41 SC: Right.

1:06:42 CT: You don't know that the recall of the birthdays is accurate. What you're asking for is, "Tell me a story about what that birthday is that you think you remember." You're not actively or accurately retrieving every detail about that fifth birthday party. You have a few details about it. You've seen pictures of it. Your mother has told you what the color of the cake was, or whatever it might be. Mostly you've seen pictures of it and heard stories about it. And so, you've made up a little memory of your fifth birthday party that may or may not be true. And by the way, every time you talk about your fifth birthday party, you're going to embellish it. And with every embellishment, your memory of it will change.

1:07:19 SC: And you reinforce things that might not have been there.

1:07:21 CT: You reinforce the newer story. Yes, exactly. It's why it's so dissonant for people to accept the evidence that memory is fallible in these ways. We keep our memories consonant with who we think we are now. My brother... I had an older half-brother who had a terrible stepfather who was really pretty mean to him, and Bill's memory of his stepfather was how mean his stepfather was to him all the time, and he could tell you these stories. But years later, when he confronted his stepfather about it... The stepfather was a disciplinarian, and cold. The stepfather said, "But you were a difficult kid. You didn't listen. You didn't want to talk to me about reasons for things. I had to be a strict disciplinarian with you, because otherwise you'd go head first out the window and get into more trouble." My brother had written his own part in the story out of his memories. And that's a common thing that we do. So in that way, he created a story about how his stepfather was mean to him and made him do this, that, and the other thing. So past and future is a lovely arrow, but what we know about memory is... Well, with all the research on how unreliable memory is, I think it's important to say it's also pretty damn good, too.

1:08:50 SC: Pretty damn good. Right. Overall.

1:08:52 CT: Overall. Overall, I'm glad I have one.

1:08:53 SC: Not completely random.

1:08:54 CT: I'm losing it rapidly, and so I'm glad for what remains. I'm happy for it. But that is a hard thing for us, I think, to understand: that it's more efficient for the brain to jettison elements of an event or a memory that we really don't need to have, and to give us the gist, the centerpiece of it. But it's also what allows us to rethink events that happened, and to reconsider their causes or other things that were happening at the time.

1:09:25 SC: And it was interesting to me to realize that utterly false memories can be just as vivid and just as absolutely convincing to the rememberer as accurate ones. So I guess...

1:09:40 CT: Oh, people hate that. People hate that.

1:09:42 SC: Right. But it's wrong to have the metaphor in your mind of a photograph that is aging and getting blurred out over time. You could have a very crisp, very vivid, very believable memory that is 100% false.

1:09:56 CT: Exactly.

1:09:58 SC: And so that... The brain is doing something crazy there. That's what I mean: it's not the JPEG image in your brain. You're somehow sharpening it up. And as we said, telling the story over and over again... I think this is going to be part of my chocolate immersion therapy. [laughter] Telling yourself a story you want to hear, just keep repeating it. Tell yourself the version of the conversation you just had in which you come out victorious, and pretty soon you'll forget it wasn't the version that actually happened. I think that's very effective therapy.

1:10:23 CT: I think it's a very effective therapy. The French have an expression, "l'esprit de l'escalier," the wit of the staircase, which is the witty, deft retort that you think of making as you're leaving the party and walking down the stairs. That's exactly it. We're going to recreate that memory to have it be something you said at the time.

1:10:43 SC: I live in fear of this, and I live in fear that I'm going to regret that I didn't ask you a whole bunch more questions. But Carol Tavris, thank you very much for being on this podcast.

1:10:52 CT: Been a pleasure. Thanks, Sean.

