250 | Brendan Nyhan on Navigating The Information Ecosystem

The modern world inundates us with both information and misinformation. What are the forces that conspire to make misinformation so prevalent? Can we combat the flow of misinformation, perhaps by legal restrictions? Would that even be a good idea? How can individuals help distinguish between true and false claims as they come in? What are the biases that we are all subject to? I talk to political scientist Brendan Nyhan about how information and misinformation spread, and what we can do as individuals and as a society to increase the amount of truth we all believe.

Brendan Nyhan

Support Mindscape on Patreon.

Brendan Nyhan received his Ph.D. in political science from Duke University. He is currently the James O. Freedman Professor of Government at Dartmouth College. Among his awards are an Emerging Scholar Award from the American Political Science Association, a Guggenheim Fellowship, and election to the American Academy of Arts and Sciences.

0:00:00.5 Sean Carroll: Hello everyone, welcome to the Mindscape Podcast. I'm your host, Sean Carroll. If you're like me, you know some people, maybe friends or colleagues, who believe untrue things. Their beliefs are false, incorrect for some reason. Don't you hate it when that happens? It may even be true (this is a wilder idea) that your friends think that you have some untrue beliefs; that's even more annoying. Why does this happen? Why do people believe different things, even when they're quite educated about them? You might think that if people just didn't know that much about something, they might be uncertain in their beliefs and be corrected when they get more information, but that's not what we see. Especially in the social sphere, the political sphere, culture-war kinds of questions, people believe things despite the fact that there's a whole bunch of people who believe other things and are trying hard to convince them. And we have a special problem these days with the media landscape, with technology: we are flooded with information, with opinions, with attempts to change our minds, much more than ever before. So there are actually two questions going on here. One is, what is the information, or the set of claims, to which we are being subjected?

0:01:16.5 SC: Are we in filter bubbles? Is the news media trying to be accurate, or are we just hearing things that are tribal or politically slanted in some way? So how do we control the relationship between our attention and the information we get? And as a society, how should we try to make sure that the information being given out is relatively accurate or safe or whatever you want it to be? The other question is then, what do we do with that information that we get? What kinds of information change our minds? We like to imagine a kind of Bayesian utopia where we have propositions to which we assign credences, and new evidence comes in and we update our credences, but people very rarely work that way. We've known that here on Mindscape since episode one, our very first episode with social psychologist Carol Tavris, who talked about how people in groups that have some false belief together can absolutely maintain that false belief in the face of enormously strong contrary evidence. So today's conversation is with Brendan Nyhan, who is a political scientist who's done a lot of work on both sides of this question: what are the kinds of information and influences that we get, and then what do we individually do with them?
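For anyone who wants the formal version of the Bayesian ideal Sean is gesturing at, here is a rough sketch of the standard formula (an editorial aside, not something stated in the episode): a credence in a proposition $H$, updated on new evidence $E$, would follow Bayes' rule,

$$
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)},
$$

where $P(H)$ is the prior credence and $P(H \mid E)$ is the updated one. The recurring theme of the episode is that real people rarely update this cleanly.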

0:02:39.4 SC: Some of his early work was on the backfire effect, the idea that under certain circumstances, when you hear evidence that contradicts a belief that you have, you can come away holding on to that belief even more strongly. Now, it turns out, to nobody's surprise if you know anything about psychology, it's more nuanced than that. That's not always true. It's true in some cases, it's not true in others; it's an ongoing thing. But it's fascinating how, whatever we are as human beings, we are not entirely 100% rational. We have our biases, we have our desires to fit things together. When I wrote The Big Picture, I talked about planets of belief, the idea that we have a particular belief in a particular proposition independent from everything else is just nonsense. Our beliefs fit together, sometimes coherently and consistently, sometimes less so, but that other set of beliefs that we have, everything else that we're holding on to, has an enormous influence on how we judge each individual thing that we learn. So with Brendan, we're gonna talk about the information sphere that we have out there in the media right now. How well the mainstream media does in trying to be objective.

0:03:54.5 SC: Should it try to be objective? When should it try to call out lies versus just saying what both sides believe? And then a little bit about how we individually process that information and try to personally come up with true beliefs. How do people end up denying climate change or vaccines or something like that? These are questions that matter right now to the world we live in, and I don't think we got the once-and-for-all answers here, but I think it's something we really need to be thinking about very carefully. So let's go.

[music]

0:04:42.9 SC: Brendan Nyhan, welcome to the Mindscape Podcast.

0:04:45.4 Brendan Nyhan: Great to be here.

0:04:46.5 SC: So I wanted to start thinking about disinformation, misinformation, fake news, all these things, and put us in historical context. 'Cause I always worry that we think we're unique right now, but I know that there was fake news back in the day, there was yellow journalism, there were snake oil salesmen, etcetera. Are things really different right now?

0:05:06.5 BN: Well, I share that concern that we're too quick to jump to historical differences that we think exist. That conspiracy theories are worse, that misinformation is worse, and we don't have any strong scientific basis to draw those sorts of conclusions. I think it's fair to say that misinformation and conspiracy theories have been with us as long as human beings have existed. And if you spend any time looking at history, you'll see misinformation playing an important role. If you look around the world, you'll see the United States is hardly unique in the role of misinformation and conspiracy theories here. So I would reserve judgment. There are aspects of how misinformation and conspiracy theories work now that may be different, and that may be a particular reason for concern. But the idea that misinformation and conspiracy theories are more pervasive now or more widely believed, we just don't know. One challenge is, of course, how you would measure these quantities.

0:06:08.8 SC: That was my next question. [laughter]

0:06:10.8 BN: We only have modern survey research in the post-World War II period for the most part, and the polling on these questions is heavily concentrated in the last few decades, and really only the last 10-15 years for the most part. So we simply don't have a great deal of empirical data on belief. Similarly, the way that information spreads, of course, differs over time in ways that are hard to systematically measure and track. Word of mouth, of course, still plays an important role, but played a much more important role in prior eras, and that's not a form of transmission that's legible to us as scholars. What gets written down, of course, is a subset of what's most important in any time. We now live in this period when digital data are available to us, and that allows for really exciting opportunities to measure what's being spread that we might otherwise not have captured, but we should be careful not to... What's the expression? Not to mistake the map for the terrain. What's measurable to us is not necessarily ground truth, especially as you move through different historical eras.

0:07:25.4 SC: There's certainly the technology that has enabled us to misinform each other at a remarkable rate. We have... I guess back in the day, we had more newspapers, pamphlets and things like that, but my guess is it can't compare to our social media feeds, the number of channels on TV and so forth.

0:07:45.5 BN: Yeah, I think it's important to remind people of all the different kinds of media we've had. Newspapers, for instance, we think of as a kind of boring, standardized format that rarely takes risks, retreats to a kind of both-sides journalism and so forth. But prior to the 20th century, newspapers were absolutely wild.

0:08:10.0 SC: Yeah. [laughter]

0:08:10.3 BN: Frequently printing scurrilous claims about political opponents, heavily partisan and so forth. They were often a channel for the worst sorts of information that was being spread at the time. And what we think of now as journalism essentially didn't exist. The profession sort of codifies itself informally and develops professional norms and so forth in the early 20th century.

0:08:37.4 SC: Okay.

0:08:37.8 BN: And prior to that, you could print what you wanted to print, and there were not necessarily professional standards governing the evidence you had to marshal to make those sorts of claims. So that's just one example. The list goes on. And in each case, this is something important: when these new technologies became available, people became very concerned about them and about their harmful consequences for society and the ways they could be used to spread misinformation or to propagandize people, etcetera. So we've historically had panics over the written word, literally the printing press, on to radio, television, and now of course, the internet and social media.

0:09:22.4 BN: Again, that doesn't mean there's not reason for concern about our current technological configuration, but I think it should make us be a little bit more circumspect about thinking that what we're seeing now is going to be the downfall of democracy. Because similarly-situated people have come to similar conclusions about prior new media of the time, and we now think of those in a very different way.

0:09:47.7 SC: So it's an interesting...

0:09:49.8 BN: Future Americans, future human beings may look similarly at us now.

0:09:54.5 SC: But it's an interesting point, because if we even allow ourselves to be open to the idea that different historical eras were different, we tend to think of a monotonic kind of change. And maybe from what you're saying, there's a sense in which it used to be just as sort of crazy and polarized as it is now, and then there was a brief interregnum where the power of the media was sufficiently concentrated and regularized and professionalized that that impression went away, and now we're just returning to that previous state.

0:10:28.0 BN: Yeah. No, I think it's important to make clear to people in the way you're suggesting just how abnormal historically the period around the middle of the 20th century was, in the United States in particular. So I'm an American politics specialist, so I'll speak to the area that I know best. Historically, our politics have been heavily polarized. That was true prior to this period in the mid-20th century, and it's been true afterwards as we've had this increase in polarization. What's abnormal was this period that we think of as the way things were and should be.

0:11:07.0 SC: Yeah.

0:11:07.9 BN: Which, in the mid-20th century, for reasons related to our country's awful history on race, the parties were not clearly ideologically divided. And we had... Due to accidents of the way technology, media technology and communication, worked, we had a kind of establishment consolidation around a limited set of communication tools. We had newspapers; there were economies of scale that rewarded very large newspapers. And to achieve the kind of readership they needed to capture those economies of scale, they needed very large audiences. And so that meant that the partisan model of the prior period was supplanted by this kind of neutral journalistic style that allowed people who supported either party to read the newspaper. So we have the consolidation of these very large newspapers, and then we have radio and television, where there's a limited set of spectrum and regulation by the government and some informal pressure to maintain higher news standards than the market might otherwise support.

0:12:21.4 BN: So you have this unusual media configuration, where we go from pamphlets and newspapers, a chaotic information landscape, to this more consolidated establishment media. You go from a wildly partisan national politics to this less polarized one as the parties become quite divided internally on the issue of race, and we get this abnormal period in our politics.

0:12:50.3 SC: Yeah.

0:12:50.6 BN: And many of the norms and institutions that we think of as foundational to American democracy are really constructions of this period. And what we're struggling with now is how we create a multi-racial, multi-ethnic democracy when those conditions no longer apply. And I just wanna be very clear about the trade-offs here, that there's no lost utopia here.

0:13:17.5 SC: Sure.

0:13:17.7 BN: It was a very narrow range of voices that were allowed through the media at that time. It was a very narrow set of people who were represented. And the reasons that the country... The country was less polarized because the parties were essentially conspiring to keep race off the national agenda and preserve a system of racial apartheid in the South, to maintain the political and civic peace, so to speak, at the cost of the freedom of millions of our fellow citizens. So none of this was acceptable, none of this was sustainable. We're not going... The conditions that generated those circumstances are not coming back, nor should they, and we have to figure out how to move forward. And I just wanna give that brief digression because I think it's really important sometimes to avoid the sense of this kind of golden-age nostalgia that creeps in when we think about our politics. There are very significant challenges associated with polarization and changes in communication technology, but we're not going to be able to wind back the clock, nor should we. And I think that's a really important starting point for any conversation about how we move forward.

0:14:31.5 SC: Yeah, no, very much. And I think it's okay to say, "Look, there was something good about that era and there were also very bad things about that era, and we're big enough to accept both things at once." And now our era has changed a little bit, whether it's cable TV or the internet and so forth. Those economies of scale that led to consolidation are less dramatic; I suppose it's easy for me to just start a website, start a newsletter, start a YouTube video feed or whatever. And we seem to be flooded with misinformation, disinformation, fake news. Is it worth distinguishing between misinformation and disinformation [chuckle] and things like that?

0:15:13.4 BN: Well, a lot of ink has been spilled on these questions in my world of scholars who study misinformation. I will say I don't find the distinction especially helpful most of the time.

0:15:24.2 SC: Okay.

0:15:24.7 BN: The reason is, when people construct these typologies, they often define disinformation as information that is spread by people who know it to be false and spread in a malicious manner. We very rarely have access to people's interior motives or so-called true beliefs, to the extent those things even exist. And so it's very difficult to pin down when someone is making a claim they know to be false. Donald Trump says literally tens of thousands of false things; how many of them does he believe to be true? I don't know.

0:16:04.8 SC: Yeah.

0:16:05.9 BN: So occasionally we can say something like a Russian disinformation operation or something, where there's a very well-identified actor, and we have a kind of ground truth on the construction of the false information itself, so we know people know that the information is false because they themselves constructed it. But in the absence of those unusual circumstances, I don't find the distinction especially helpful. It often leads to debates about motives that aren't very useful to me. To some extent, whether Donald Trump knows or not that any given statement he makes is false is immaterial; we should hold him accountable for being responsible as a public figure in making accurate statements. Everyone misspeaks sometimes. I'm sure I will make errors of fact in this podcast. [laughter] But the pattern of repeatedly making false statements, doing so after being corrected, and making these kinds of claims that are reckless and inflammatory, I think we can hold them accountable. Whether or not he means to do it, at some point a line must be drawn. And I find it more important to focus on those kinds of questions than whether someone knows a claim to be true or not.

0:17:27.4 SC: And I guess there's a utopian vision in which if we have enough communication channels, then you just can't get away with misinforming people because some other communication channel is gonna point out that you are not telling the truth. And that mechanism seems to be of less strength or less effectiveness than maybe we would hope. Is that impression off base?

0:17:51.9 BN: No, I think we have to worry a lot about the incentives that people face to make accurate statements, especially, but not exclusively, political elites. When you think about what affects public opinion, when misinformation is especially harmful, it's often when it comes from people who have a wide audience and an audience that's responsive to them. And political elites are often among the most influential figures and institutions in spreading misinformation. Their incentives, unfortunately, are quite warped. The sanctions for making false statements are quite weak. Donald Trump, of course, is the canonical example now of how little it seems to matter, in terms of the political support you amass, if you make false statements, at least under certain circumstances. And so with those incentives in mind, of course, the upside to misinformation may be relatively more attractive. It may be a way to decrease support for your opponent, to activate your political base, to make a policy debate tilt in a direction that's favorable to you, etcetera. And if the only cost is people say you're making false statements in a medium that your supporters don't trust very much, that's a pretty weak sanction. Now, I wanna be very clear about this point, and we can talk about it more because it's a subtle thing: I am not saying we should roll back the First Amendment.

0:19:31.4 SC: Sure.

0:19:32.5 BN: And in fact, I'm very uncomfortable with legal remedies in general. I think people have become...

0:19:43.0 BN: Quite reckless in how quickly they jump to speech suppression as a solution to this problem. And I think they should reflect carefully on how those kinds of steps could be misused by folks they don't like if they have control over the relevant institutions, whether it's political, legal, or, say, control of tech companies and platforms, right? In all of those cases, think of who you don't want controlling speech. And now imagine giving greater control over speech to that entity, right? We can all, I think, imagine the potential for misuse, and so I worry about people who say, "Well, misinformation is a problem and someone needs to make it go away. And the way to do that are legal restrictions or the platforms taking care of it, or so forth." In some cases, the cure may be worse than the disease.

0:20:46.6 SC: There is a philosophy question here about the nature of truth [laughter] and how possible it is to get there. I forget who it was, but I read just very recently a Republican politician commenting on Donald Trump's recent legal worries, saying that this assumes that the government can judge what is the truth. And I'm like, "Well, [laughter] the legal system is certainly presumed to try to judge what is the truth." But I think what you're raising is that if we try to make a law saying you can't lie, you can't tell intentionally untrue statements in the media or whatever, that would be, as a practical matter, very, very hard to make fair and shielded from misuse.

0:21:36.7 BN: I think that's right. I tell my students when I teach about this topic that at any moment a philosophy seminar about the nature of truth could break out [laughter], and there are intense debates in my world about exactly where we should draw these lines. So for instance, for the fact-checkers, these predominantly online journalists who evaluate the accuracy of statements made by politicians, there's frequently debate about whether their ratings are fair, if they accurately reflect the evidence, if the matter at hand is even a factual question about truth at all. In some cases it may be subjective or a matter of opinion in some important way, and these are very difficult questions that don't necessarily have objective answers. And that's why I appreciate the way in this country the bar for defamation and libel cases is very high, right?

0:22:36.6 BN: It's very high, and that prevents cases from being successfully litigated unless the claims are extreme, and it's also very difficult to bring cases against public figures for making false statements. I guess the bottom line to me is that false statements are part of living in a democratic society. And I'm concerned about the way we went from saying misinformation is a problem to saying our goal should be to eliminate misinformation, right? The price of eliminating misinformation is no longer living in a free society. We have to accept that there will always be misinformation with us, and we have to think about the competing values at play here. That doesn't mean we shouldn't try to address misinformation effectively, and we can talk about ways to do that, but setting up the goal of zero misinformation is taking us down a road towards an illiberal society.

0:23:52.0 SC: And this kind of problem pops up not only with politicians or bad actors who want to misinform for their own reasons, but even with, as we hinted at before, journalistic outlets that are trying to do their best to be objective, right? There's certainly a well-known worry that that degenerates into both-sidesism: "opinions on the shape of the earth differ." So what words of wisdom do you have for a responsible news outlet that is trying to be objective and stick to the truth, but is so used to just saying, "Well, Party A said this, and Party B said that," that they are reluctant to say, "And Party B is absolutely lying"?

[laughter]

0:24:37.3 BN: Yeah, no, this has been a real debate in journalistic circles now for a couple of decades. The fact-checking movement in some ways was inspired by the failures of mainstream journalism in this respect. Originally the seeds of this are the ad-watch movement, where journalists were frustrated with the way that the so-called Willie Horton ad was covered in the news, and it seemed to almost amplify the claims in question rather than providing adequate scrutiny. And that ad-watch format is a progenitor to the fact-check format, which sought to reorient the focus of journalism, at least within this framework, towards evaluating the accuracy of the statements made by politicians instead of reporting what is "news," which is a subjective thing that often leads to that he-said, she-said style of journalism you described.

0:25:44.1 BN: And in general, I think we have seen a kind of transformation. I would point to two issues that have really driven this change. The first is coverage of the climate movement, where, after many years of shaming, it became professionally damaging to both-sides climate change as the evidence became overwhelming. And then second, the sheer volume and audacity of Trump's false statements, during his campaign and especially during his time as president, when it was simply impossible to report on Trump and not indicate clearly that he was making false statements. And both of those, I think, have increased the scope of journalists' willingness to describe evidence in non-50/50 terms even when contested by the parties. You still will see examples where journalists retreat to that.

0:26:50.4 BN: I think it's still an ongoing challenge, but I do think progress has been made. And that's important because empirical research suggests that when you're exposed to these news stories that say views on the shape of the earth differ, people may infer from that presentation that experts are divided, that the evidence is mixed.

0:27:10.5 SC: Yeah.

0:27:12.1 BN: That there's a lot of uncertainty. So it's important to communicate precisely the relative weight of the evidence when such an expert consensus exists. Of course there are dangers of overcorrecting in the other direction too, and stating with too much certainty claims that turn out to be poorly supported or wrong. We lived through a lot of instances of that during Covid, for instance.

0:27:37.1 SC: The pandemic. Yeah.

0:27:39.1 BN: Where the media rushed to state "experts know X," you know, and then that consensus would of course be quickly overturned.

0:27:46.7 SC: Well, that's the problem with science: we try to be open-minded to new results coming in, and it's hard to convey that we're pretty confident that this is true, but we will change our minds if we learn new things. But I had Ezra Klein on the podcast a while ago talking about polarization and how we've become more polarized over recent decades, very consistent with things that you've said. But I asked the question, and he was sympathetic to the idea, that one thing is just politicians learning to be better game theorists. And if you take as your incentive structure not "I want to make the country a better place" but "I want to win the next election," then you just act differently, right? And you know, maybe your standards are different and whatever. Do you think that the elite political class has become better at gaming the system, including the fact, the feature of the system, that journalists are trying to be objective?

0:28:44.0 BN: It's an interesting question. I mean, I'm a political scientist, so I always think politicians are being strategic. And some of the most influential works in my field point out how much of political behavior you can explain simply by the reelection motive.

0:29:02.7 SC: Yeah.

0:29:05.9 BN: At the same time, I will say I thought there were more lines that politicians wouldn't cross than turned out to be the case during the Trump years. And so if anything, I've updated my views to think that the reelection motive is overwhelmingly shaping politicians' behavior. I would've thought actually some of the policy heterodoxy of Trump also would've constituted a red line for people who are in politics for ideological or policy reasons, but we've seen remarkable flexibility on some of those issues as well. So whether politicians are behaving more strategically, it's not clear to me.

0:29:51.7 BN: I guess the way I would describe it is, I think they've always behaved strategically. What we've seen that I'm most concerned about is a breaching of a set of norms that constrained politicians at the national level from moving into the explicitly anti-democratic realm, or normalizing misinformation in the way that Trump did. Those two trends, I think, are genuinely worrisome, because when that kind of rhetoric becomes normalized, the incentives for politicians change. It was thought it was career-ending to do the kinds of things that Trump did. And so for sincere or strategic reasons, politicians didn't do them. They may have believed in the norm and thought it was bad to violate it. They may have just strategically avoided breaching the norm because they thought it would be bad for their career.

0:30:49.5 BN: But, you know, those explanations both generate the same expectation: people won't engage in anti-democratic rhetoric or anti-democratic actions. They won't spread misinformation as brazenly and at the volume of Trump. Now we've seen those norms be breached, and norms depend on this shared understanding of the limits of behavior: if you cross some line, you'll be sanctioned for it. And so a strategic politician may respond to the breaching of those norms by changing their behavior. And that's really what I'm most worried about. We have an ongoing debate about whether Trump is a kind of one-off figure or a harbinger of things to come. And it's clearly proven difficult for other politicians to capture whatever it is he's doing, as we're seeing during the Republican primary right now. So there are elements of his appeal that clearly are non-transferable, but I think it's fair to worry that the breaching of those norms will change the way politicians approach politics strategically.

0:31:58.3 BN: So let me give you, I think, a simple example of this.

0:32:02.4 SC: Good.

0:32:04.4 BN: Ron DeSantis is running for president right now. It has not gone well, [laughter], we're recording this on August 23rd.

0:32:08.8 SC: It has not, right.

0:32:09.5 BN: He has struggled. He was seen as a very strong challenger to Donald Trump based on his political and policy success in Florida. There was this idea he could pitch himself as both more electable and more conservative in a conventional sense, and that would be a combination that could launch him to the presidency. He has struggled. He's not attracting nearly as much support nationally or in key primary states as expected. One reason is he has not found ways to present the case for his candidacy that are resonating with primary voters and the Republican Party.

0:32:55.5 BN: And one thing we've seen in recent weeks, as he's been shaking up his campaign, changing staff, changing strategy, changing rhetoric, is an increasing pattern of resorting to violent metaphors or indeed the endorsement of violence itself. So first he talked about slitting the throats, metaphorically [laughter], of the parts of the federal government workforce that he said he would cut, which is a kind of troubling metaphor for talking about your plans for the government civilian workforce.

0:33:33.5 SC: Yeah.

0:33:35.0 BN: He has now started talking about how, under his administration, the United States will shoot alleged drug smugglers at the border and just kill them as they approach or try to cross the border. So just a kind of explicit endorsement of extrajudicial violence, right? That's troubling per se. And Ron DeSantis is nothing if not strategic.

0:33:57.7 BN: He's a very cerebral person. He is responding to the incentives he faces. His kind of wonky style, "I will stand up to the woke people for you," et cetera, was not resonating. And now he starts talking about violence, and boy, he's getting a response. This is becoming one of the lines that's generating the strongest responses at his rallies. He may deploy it at the Republican presidential debate that's going to take place tonight, and that's exactly the kind of thing I'm worried about: that strategic politicians responding to these incentives may take us towards increasingly authoritarian and illiberal places if we're not careful. Because even if you just want to win and hold office, that may be the path to do so, unfortunately, in the current Republican Party especially.

0:34:47.1 SC: Well, and there's a feedback loop with the specifics of our political system, right? You know, you're a political scientist, you know about the median voter theorem. There's this happy idea that politicians will move to the center to grab the largest number of votes. But when we are in such a polarized atmosphere, especially with such strong geographic polarization, as long as you think that, you know, your party has a good chance of winning most of the electoral votes or most of the state houses or et cetera, you're just appealing to the party, and you're just appealing to get people out to vote by kind of poking at their emotions and getting them very fervent and so forth. So I don't know if this is temporary or just an insight into how things work, where politicians seem to be moving towards extreme measures to game, again, the system. But in this case not the media system, but the particular electoral system we have here in the US.
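For readers who want the median-voter logic Sean mentions spelled out, here is a minimal, hypothetical sketch (the electorate and candidate positions below are invented for illustration, not taken from the conversation): with one-dimensional, single-peaked preferences, each voter backs whichever of two candidates sits closer to their own ideal point, so the candidate nearer the median voter wins a head-to-head vote, and the pull toward the center weakens once the decisive contest is a party primary.

```python
# A toy illustration of the median voter theorem (hypothetical numbers, not from the episode).
import statistics

def vote_share(candidate_a, candidate_b, voters):
    """Fraction of voters whose ideal point is strictly closer to candidate_a than to candidate_b."""
    a_votes = sum(1 for v in voters if abs(v - candidate_a) < abs(v - candidate_b))
    return a_votes / len(voters)

# An evenly spread electorate on a left-right scale from -1 to 1.
voters = [i / 50 for i in range(-50, 51)]
print(statistics.median(voters))                      # 0.0: the median voter sits at the center

centrist, extremist = 0.1, 0.8                        # made-up candidate positions
print(vote_share(centrist, extremist, voters))        # ~0.72: the candidate nearer the median wins

# But if the contest that matters is a primary among right-leaning voters only,
# the incentive to hug the overall median disappears.
primary_voters = [v for v in voters if v > 0.2]
print(vote_share(centrist, extremist, primary_voters))  # 0.3: now the extremist wins the primary
```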

0:35:53.0 BN: Yeah. I think political scientists are increasingly troubled by the intersection of geographic polarization and our two-party system. The incentives to try to appeal to wider audiences are quite limited. Most members of Congress are at greater risk of losing a primary than they are of losing a general election. And under those circumstances, it's not surprising, although extremely disappointing, that so many Republicans would not vote to disqualify Donald Trump from running for office again after the January 6th insurrection. If they had taken that step, we wouldn't be in the position we are in today. But given the electoral incentives they face, of course, that may be an entirely rational strategic response. Similarly, at the national level, we're so closely divided that the downside to taking extreme or anti-democratic actions is relatively limited. For all the things that Donald Trump did, his approval rating moved in a very narrow band.

[laughter]

0:37:05.6 SC: It was really remarkably stable. Yeah.

0:37:07.5 BN: Because so many Americans were locked into how they felt about him, pro or con, and we've seen the same pattern under Joe Biden; that is not specific to Trump. The two-party system, with the levels of polarization we've seen now, has created a circumstance where there's not much downside risk. Your supporters will stick with you almost no matter what, and it's very difficult to win over the other side. And if you look historically, in that unusual mid-20th-century period we were describing earlier, you'll see wild variations in approval ratings that we have not seen now in many years; the last time we saw it was after September 11th. And ever since, presidents' approval ratings have been moving in these very, very narrow bands.

0:38:00.1 BN: One other point I would make is simply that the two-party system itself has a dangerous sort of zero-sum logic to it: anything I do that helps the other side automatically hurts mine. And that's why I think we're seeing more political scientists who are open to changes in the American system to move towards multi-party governance. That would change the zero-sum logic; that would create a way for Republicans who are uncomfortable with the misinformation and anti-democratic tilt of Trump and his acolytes to have a viable path forward electorally, right? A world where there's a center-right party as well as a far-right party would provide a home and a potentially sustainable path towards electoral security for the Mitt Romneys of the world and the George W. Bushes and John McCains. And for that whole swath of the Republican Party right now, our electoral institutions make it very difficult for those people. We're seeing many of them retiring and ceding the field to stronger supporters of Trump. And that's gonna be a challenge for the Republican Party going forward. Again, just due to the strategic behavior; people respond to the incentives they face.

0:39:22.7 SC: Let's move on a little bit from the elite politicians here, because they're not the only issue. We have Facebook and social media algorithms and so forth that have absolutely contributed to the spreading of dis- and misinfo. I guess I can ask an analogous question there. Is that maliciousness, or are the people, the companies, running these platforms just doing what they think is best, and there's sort of an inevitable spiral that we're in?

0:39:54.7 BN: Yeah, that's a tricky question. You know, the platforms are very different, especially in a world where Elon Musk controls Twitter. Speaking about Twitter and Facebook in the pre-Musk era, though, I think it's fair to say that the platforms were trying to do their best, and in general there were at least people within the companies who were working very hard to try to address concerns about the spread of misinformation on the platform. It turns out to be very difficult to identify and effectively counter misinformation on platforms. You really have to think about the kinds of trade-offs we described earlier. There have been misguided interventions in terms of limiting the spread, for instance, of information about Hunter Biden's laptop, or various claims about Covid that turned out to be true, that I think don't look very good in retrospect, and we should bear that in mind when we think about the platforms intervening more aggressively.

0:40:57.0 BN: At the same time, there are many cases where the platforms were negligent in addressing potential vectors of harm on the platforms. You know, we talked at the beginning about why now might not be unique, but it is of course true that the platforms enable false information to spread faster than it ever has before. And that can be quite powerful. You know, when something really spreads in that viral manner, that can enable false information to reach people very rapidly. It can also surface false claims from the kind of digital grassroots that are then popularized and spread by prominent news outlets and political figures.

0:41:54.8 SC: We all hear about the idea that people now have filters and they live in bubbles, 'cause they can pick and choose their own news sources. But I've recently read claims to the effect that it's not true that we live in filters or bubbles; it's just true that we are very good at ignoring the news we want to ignore, and so we sort of effectively filter it out ourselves no matter what the world is giving us.

0:42:25.6 BN: Yeah. I've been a skeptic of claims that echo chambers are predominant. You hear these claims about echo chambers and filter bubbles and so forth, and the worry is that digital media, especially the intersection of digital media, particularly social media, and polarization, have created a circumstance where people are predominantly living in echo chambers or filter bubbles. Right? The empirical data we have, which are unusually rich now because of the kinds of digital behavior data that can now be collected, show that those sorts of claims are not well supported. Most people's information diet is relatively balanced on social media. People are often encountering information they don't like. Relative to, for instance, the political mix in the area you live in or the friends and family you have,

0:43:25.4 BN: you're probably encountering much more discordant information online. To make this more precise, I conducted a study after the 2016 election, when there was this panic over fake news, and we found that the untrustworthy websites, which were a particular form of potentially harmful content people were worried about and people thought were creating these kinds of echo-chamber effects... Consumption of those websites was heavily concentrated among a small portion of the population. The 20% of Americans with the most conservative information diets online were responsible for about 60% of the exposure to those sites in our data. We've seen even more extreme estimates for exposure to Russian misinformation content during the 2016 election and exposure to so-called fake news on Twitter.

0:44:22.2 BN: Those were again concentrated among very small subsets of the public. We recently did a study of YouTube exposure to the potentially most worrisome channels on YouTube. Again, overwhelmingly concentrated in a very small percentage of the public. So in general, echo-chamber claims are overstated; in particular, exposure to these kinds of potentially harmful content seems to be concentrated in these narrow subsets of people who already have quite strong or extreme views. So I think that worry is misplaced. We should think instead about how exposure to this kind of content might generate important harms in the world even when it's being done by a small minority, right? So you might think of things like January 6th, for instance. Right? The ways that digital technology helps enable the mobilization of people.

0:45:16.4 SC: Yeah.

0:45:17.1 BN: Who already have extreme or fringe views.

0:45:20.4 SC: That's a word. Yeah.

0:45:20.5 BN: The way digital technology might inspire acts of violence in the real world, or racial or ethnic hate, things like that, that translate the kind of latent sentiments that are out there in the world in these extreme pockets of the population and help them manifest in harmful ways in the world. I'm more worried about that than I am about the average person being trapped in an echo chamber or filter bubble. The average American has better things to do. [laughter] To be perfectly honest with you, they are not spending hours and hours in so-called rabbit holes or reading tons and tons of politics. They don't follow politics that closely. Again, that doesn't mean there aren't reasons for concern about digital media. There are, but we should be really precise about it. And these kinds of loose claims based on anecdotes, I think, have led us down the wrong road in thinking about the platforms and the kinds of harms that they could generate.

0:46:16.5 SC: And it brings up the complicated issue of how people actually form their beliefs. It's certainly not true that they're just exposed to claims and believe them. It's not even true that they are good Bayesians who have priors and update things, right? I mean, how do we judge the relationship between what people are hearing and what they're choosing to accept?

0:46:40.0 BN: Yeah. This is an area of ongoing research, and I think the emerging story from the research suggests that when you expose people to corrective information directly, people will tend to update their views, at least in part, in the direction of the information that they're exposed to, even if it's counter-attitudinal, and that's a kind of encouraging finding. The perhaps discouraging finding, depending on your point of view, is that those effects may not be durable, and they may not lead to the changes in attitudes or behavior that people expect. Often there's an implicit model people have in their head that as human beings we reason from facts to opinions or attitudes and to behavior, right? We learn things about candidates, we have a factual understanding of the validity of their statements, then we update our opinion about that candidate.

0:47:38.5 BN: Then we decide whether to vote for them. We learn things about Covid, then we have opinions about Covid policy, then we go decide if we're gonna get a vaccine or not. But it turns out the direction of causality is not necessarily clear. In many cases, our factual beliefs may be reflections of the opinions we hold or the behaviors we choose to engage in. It may also be the case that there are a number of reasons we hold those opinions or engage in those behaviors, and changing our factual beliefs may not necessarily have any effect at all. And so we'll often see cases where people say, "Yeah, I understand Trump didn't make that claim... sorry, Trump made a claim that was false. I accept that claim was false. Doesn't change how I feel about it."

0:48:24.4 SC: Don't care. Yeah. [laughter]

0:48:27.4 BN: "This false claim about Covid, okay, I accept that it's false, but I'm not gonna change how I feel about Covid policy, about whether I should get a vaccine." And importantly, this goes back to your point about philosophy earlier. It's not necessarily the case that it's logically entailed that you must change your opinion or behavior because of that particular fact, right? You as an observer may have a particular belief system where you think one should make that update, change their opinion or their behavior in that way. But of course it's not necessarily so; it simply depends on how you weight the relevant considerations, which of course is subjective.

0:49:11.3 SC: No, I think this is a great point. I think that people are a little bit too quick to attribute to irrationality that which is better explained just by trying to make sure everything kind of fits together in some way. When it comes to teaching physics classes or giving public talks about physics, when people ask me how to do it, my first thing is: don't imagine that your audience are empty vessels into which you are pouring your wisdom. And to me it might make perfect sense that someone discounts a certain fact, even though they think the fact is true, because there's a whole bunch of other things that they believe that are gonna still point them in another direction.

0:49:48.6 BN: No, I think that's right. People have reasons for what they do. It may not be the ones that you [laughter] would like them to have, but it's not simply a matter of... right? I think the mistake is thinking if we pour accurate factual information into people, they will come out with the opinions and behaviors you would like.

0:50:12.5 SC: Yeah.

0:50:13.5 BN: And that just rarely turns out to be the case in any of the areas I study.

0:50:18.8 SC: And look, we all do it. There are left-wing conspiracy theories, there are right-wing conspiracy theories, et cetera. Is anything from your research giving you a little bit of insight into how we personally can sweep our own doorsteps and try to do better at separating the true news from the fake news?

0:50:38.8 BN: You know, I wish I had a silver bullet, and [laughter] I don't. If anything, the emphasis that I tend to recommend is on elites and institutions, because human nature is what it is, and it's not going to change except in evolutionary time. So faulting people for being human beings, to me, often leads us down a road that generates a kind of elitism and condescension I don't like in these conversations. I think people are being failed by the elites they trust. They're being failed by the institutions that fail to give them the information they need to form more accurate beliefs. And so we should challenge the media to do better. We should challenge politicians to make accurate statements and so forth. With that said, as to what you can do: trying to rely on trustworthy sources of news and information, of course, will improve the likelihood that you're exposed to accurate information.

0:51:35.8 BN: They won't get everything right, but on average they will do better. It's also the case, and this is a finding that comes out of recent research that I think is useful, that it's important to take accuracy into consideration when you're interacting with news and information online. There's a stream of research that basically finds that just prompting people to keep accuracy in mind before they encounter social media content seems to help them make judgments that better reflect the accuracy of what they're interacting with. Because it turns out there are lots of reasons we decide what to share and what not to share, [laughter] and accuracy is only one of them. So simply slowing down and thinking about whether this is accurate or not in the moment is a small but useful thing you can do that will help you do better.

0:52:34.5 BN: Now, I'm not sure how scalable that is. If every time you went on social media they said "accuracy, accuracy, accuracy," eventually you'd tune them out, the way you do any of those kinds of repeated reminders, but on the margin it's a nice thing to do. There are also some pretty useful tips for discerning accurate information online that are sometimes distributed. I've tested the ones that Facebook rolled out after 2016, and they were surprisingly effective. Just a set of simple rules you might keep in mind, right? When you encounter a news headline: Is this true? Does this seem too good to be true? Are they using emotional language in a way that's potentially designed to inflame me? What's the source of this information? Just a series of simple questions like that you might ask about the information you come across seems to help people do a bit better in discerning what information is valid and what information is invalid.

0:53:30.0 SC: It also seems to me that since we are very polarized these days, and people just perceive the other side as the enemy no matter which side they're on, we're a little too quick to pick up claims and purported news stories that make us feel good or make the other side look bad, right? I mean, it's some kind of, you already mentioned it, too-good-to-be-true thing, but just thinking, "Well, I like this. It makes me feel superior. I want it to be true." That's something that we should be more skeptical about rather than less.

0:54:01.4 BN: Absolutely. Right. To the extent that we're able to reflect on our own biases, which I think is a challenge for every person, if this is something that you realize you might be gravitating towards for those reasons, then taking a second look might be important. And I'll give you an example of a design feature on a platform that I think tries to address this in a useful way. I think this was undone by Musk, but earlier, on Twitter, when you tried to retweet an article with a link and you hadn't clicked on the link, [laughter] Twitter would ask you, "Hey, do you wanna read that story first?" And my suspicion is that in many cases people were hammering that link without even reading the story when it was consistent with their prior beliefs.

0:54:45.5 SC: Yeah.

0:54:47.0 BN: When it was consistent with their understanding of who the good guys and bad guys were and who's right and who's wrong and so forth. For those of you who have a Twitter account, you can look at your own analytics data and see that people almost never click the link.

0:55:01.1 SC: Yeah. [laughter]

0:55:01.9 BN: And so I worry that we're jumping to the kinds of conclusions you're describing, and using those retweet and share buttons as an expression of affiliation with a side or an idea. And that's where I think we can easily go astray.

0:55:18.4 SC: Okay. I know you have a deadline, so the last question will be a quick and easy one: How would you fix democracy? [laughter] I mean, you already mentioned the idea that political scientists are playing with ways to make it easier to be centrist, less extreme, et cetera. I don't know whether you care about voting systems and things like that. I mean, is it time for big structural changes in how we elect our leadership?

0:55:43.3 BN: I'm ready to consider them, and I wasn't there a few years ago. I'll recommend to your listeners Lee Drutman's work in this area. He has a book called "Breaking the Two-Party Doom Loop" that I think is a really nice introduction to why we should reconsider the two-party system and how we can start to do so. He's continued to work in this area. If you're able to find his work at the New America Foundation, he's really at the cutting edge. People are experimenting with lots of different potential changes. The one that's gotten the most interest is ranked-choice voting. I don't think that will be enough, but I just want to go back to the two big ideas we've talked about. When you get out of a two-party framework, it gets us out of that binary zero-sum thinking that I think can contribute both to the endorsement of anti-democratic statements and actions, and also creates conditions that are ripe for the spread of misinformation, right? When you're in a good guy, bad guy framework, your side is right and the other side is wrong. Multi-party systems have their challenges, but there's a reason that we are quite unusual among our peers in our system of government. And I think it's time to restructure how we do things. How we get there is a very difficult challenge given the nature of the Constitution. But that is a topic for another podcast. [laughter] So thank you for having me.

0:57:07.7 SC: Yeah. I do worry that when we start changing the Constitution, things will get worse, so there's a certain conservatism there that makes sense to me. But thanks very much, you've given us a lot to think about. Brendan Nyhan, thanks for being on the Mindscape Podcast.

0:57:19.9 BN: My pleasure.
