143 | Julia Galef on Openness, Bias, and Rationality

Mom, apple pie, and rationality — all things that are unquestionably good, right? But rationality, as much as we might value it, is easier to aspire to than to achieve. And there are more than a few hot takes on the market suggesting that we shouldn’t even want to be rational — that it’s inefficient or maladaptive. Julia Galef is here to both stand up for the value of being rational, and to explain how we can better achieve it. She distinguishes between the “soldier mindset,” where we believe what we’re told about the world and march toward a goal, and the “scout mindset,” where we’re open-minded about what’s out there and always asking questions. She makes a compelling case that all things considered, it’s better to be a scout.

Support Mindscape on Patreon.

Julia Galef received a BA in statistics from Columbia University. She is currently a writer and host of the Rationally Speaking podcast. She was a co-founder and president of the Center for Applied Rationality. Her new book is The Scout Mindset: Why Some People See Things Clearly and Others Don’t.


0:00:00.4 Sean Carroll: Hello, everybody, welcome to the Mindscape podcast. I’m your host, Sean Carroll, and today here on the podcast, we are going to get rational. I know that a lot of you are thinking, Well, it’s about time, but we always try to be rational, right? As I make the point early in the podcast, there are very few people who actively brag about not being rational, but it’s easier said than done. We would like to be rational, but how do you do it, how especially do you admit that you are not always rational and reorient your mindset in a way that helps you be more rational, whether it’s in sort of high level problem-solving for some scientific or technical job you have, or just in your daily life, who to date, what to have for dinner tonight.

0:00:40.4 SC: We would like to be rational about all of these different choices, and so today’s guest, Julia Galef, is a star in the rationality community, someone who has given a lot of thought to these real efforts that an individual can put into becoming more rational themselves. She doesn’t want to just give you a list of ways in which you’re not rational, that’s been done quite a few times, she wants to give you useful actionable advice for becoming more rational yourself.

0:01:07.6 SC: So she has a new book coming out called The Scout Mindset. Julia’s metaphor here is that in a war, there’s a military metaphor going on here, so you have soldiers, the soldier’s job is to say, look, I have a goal, I’m going to achieve that goal, nothing’s going to stop me from achieving that goal. The other mindset you might have is that of a scout. The scout doesn’t have a goal to sort of take over a certain installation or whatever, win a certain battle, the scout’s mindset is to gather all the possible information with an open mind, to be as open as possible to all the different pieces of information, whether or not they reinforce what the scout already believes, and so Julia is hoping that we can all develop more of a scout mindset.

0:01:50.5 SC: So a lot of the podcast episode is going to be exactly that, how do you develop this scout mindset, how do you actually train your brain to be open to all the information. But there’s an interesting sub-conversation we have about justifying what seems to be obvious, namely that being rational is good. So there are people out there, both explicitly and implicitly, who make arguments that, you know rationality is overrated, that in fact, there’s something maybe evolutionarily useful in being irrational in the right way, and so Julia is going to defend the position that we should be rational in the right way all the time. And that’s an extreme position, but I think I’m pretty much in favor of this one, I think that we’re on the same side, so I try my best to play the devil’s advocate here, but overall, I’m more or less sympathetic.

0:02:37.8 SC: So occasional reminder, we have a web page for the podcast, preposterousuniverse.com/podcast, which is a good place to go because we have full transcripts for every episode, and they’re searchable, you can search for any word that has ever been said on Mindscape and find it in the transcripts on the website. And the other thing is, we like it when you leave reviews of the podcast on places like Apple podcasts and things like that, iTunes reviews, they help raise the visibility of the podcast. If you think that the podcast is a special experience that you want for yourself and don’t want to share it with anyone else, then don’t bother, but if you think that Mindscape is something that would be great if other people were listening to, then raising the overall level of exposure is a good thing. So I’m very grateful to everyone who does that. So let’s go.

[music]

0:03:42.6 SC: Julia Galef, welcome to the Mindscape podcast.

0:03:44.4 Julia Galef: Thank you. So great to be here finally.

0:03:46.8 SC: Like many of my guests, you have put yourself in a position, apparently intentionally, so what I mean by that is you’re defending the idea that we should all be rational. I had David Baltimore on the podcast a while ago, he’s a Nobel Prize-winning biologist, and no one knows this yet… Sorry, David, but while we were talking before the podcast, we had trouble getting his microphone to work, and I said, once you win the Nobel Prize, don’t you constantly get crap for not being able to get the microphone to work. Do people say like, oh, I thought a Nobel Prize winner should get to do that. And he’s like, oh, yeah, you have no idea. It’s just like that. So similarly, don’t you open yourself up when you’re being pro-rational as an explicit position, like every single thing you do, people are going to say, well, that wasn’t very rational, was it?

0:04:34.9 JG: That is, yes, that is true. And that is despite the fact that I try to emphasize that I definitely don’t think of myself as perfectly rational, I don’t think anybody is perfectly rational, and I also try to have kind of a light touch in the way in which I’m pushing for rationality. So for example, what I mean by that is some advocates of rationality or intellectual honesty or other things in that space, they will often claim explicitly that everyone… It’s always better to be rational or it’s always better to see the truth, and I don’t think that they can actually support that. I’ve never heard anyone give a defense of that claim that seems sufficient for how strong a claim that is, like how do you know that everyone’s always better off being rational? How do you know that everyone’s happiness is maximized with a rational view of the world in every situation. That’s a strong claim.

0:05:31.7 JG: So I try not to make that strong claim, and instead I will make more nuanced claims that I think I actually can defend. I think we can be pretty confident that people would be better off being at least somewhat more truth-seeking than they are by default, and I can explain why I think that’s the case, that on the margin, we would be better off with more truth and less self-deception, but that’s different from claiming that everyone is always better off all the time being rational.

0:06:00.8 SC: That is true, but as someone who’s studied a little bit of the human psychology required to try to be more rational, I’m sure you understand that the arguments that people are responding to are not only the arguments that you’re actually making.

0:06:13.8 JG: This is true, this is true, and that’s something that… I mean, it’s always a little bit frustrating when people aren’t hearing the thing that you intend to be saying, but I’ve made my peace with it to a large extent in the last few years, just by… Basically, I just realized that communication is always imperfect and always involves some amount of making assumptions about what the other person meant to say. And so if most other people in a position similar to mine, if most of the people who are advocating for rationality or truth or intellectual honesty, if most of them are saying a more extreme thing, then it’s kind of understandable how my listeners, unless they’re really going out of their way to be super careful, which most people aren’t most of the time, it’s understandable why they would hear the extreme thing, even if I’m not saying the extreme thing.

0:07:06.6 JG: So I’ve just… I’ve made my peace with accepting that people are going to hear the thing I’m not saying, and I just kind of preempt it to whatever extent I can and then try to explain what I’m actually saying.

0:07:15.4 SC: That is very admirable. We should always aim to be that generous to our listeners, but anyway, I don’t want to worry too much about the flak you’re getting for this, because in some sense… Let me come from the opposite direction, I mean, is there anyone who self-professes to not be rational or to not even try to be rational or not value rationality? Isn’t rationality one of those ultimately benign values that everyone thinks they are upholding?

0:07:43.5 JG: So I understand why you say that, because there are tons of people who claim to be rational or prize rationality or prize you should always change your mind in response to the evidence, like I always change my mind in response to the evidence, and I think very few of those people come close to upholding the value that they’re professing. And in fact, I learned this the hard way when I was researching my book, because I would find these quotes… I was always looking for good quotes or good examples for the book, and I’d find these amazing quotes about intellectual honesty, saying things like, it’s so important to change your mind when you notice you are wrong, and then I would look at the source of the quote and I’d be like, this is…

0:08:25.0 JG: So I’ll just give you one example, I forget the exact quote, but it was something like, I always… It’s so important to change your mind in response to the evidence, and the source of the quote was Rudy Giuliani, so I was like, I can’t put this in my book. So I get what you’re saying, and I basically agree, except that there actually are a bunch of people who explicitly reject the principles of rational thinking or intellectual honesty, and I think they kind of fall into two categories. One category is the kind of post-modernist or maybe in a sense romantic school of thought where there really isn’t any objective truth and all that exists is different people’s subjective realities, and any attempt to try to ferret out the objective truth or try to figure out why two people disagree is doomed to failure, or it’s an agent of white supremacy or bigotry to even talk about there being in theory a correct answer.

0:09:34.5 JG: So that’s one sense in which people disagree with me. And then another sense is, I think just kind of a classic, like a feeling that it’s more noble to stick to your guns and never change your mind. So there’s a metric, a scale in cognitive science, called active open-mindedness, which was invented and popularized by Jon Baron, and it’s just a questionnaire about what… How do you think people should think. So it’s not really measuring how you do things, but it just, what do you think good thinking is, and so it includes questions like, should people change their minds when they encounter evidence that contradicts their beliefs. So this is the kind of questionnaire where you would think, of course, everyone would say yes to all these questions, but in fact, a lot of people say, no, you shouldn’t change your mind. And I think they’re thinking of, I don’t know, deeply held political or religious beliefs or something, and they see value in just sticking to your guns, no matter what.

0:10:32.5 JG: So that’s another kind of person who I think disagrees with even the theory, like the theoretical idea of trying to adjust your beliefs to the evidence.

0:10:40.5 SC: Well, I think that second category interests me. I suspect that the first one is pretty thin on the ground, like you have to really work hard to find people who actively think that being rational or finding the truth is an instrument of oppression. I think that the existence of such people…

0:10:58.5 JG: I don’t know, they do a good job of finding me, I don’t feel like I have to look that hard for them.

0:11:01.8 SC: Well, exactly, I think that the few examples that there are might stand out, it’s like using Twitter as a proxy for public opinion, right. You’re not hearing the unbiased sample in some sense. But the second one is interesting to me because I can almost see it. I mean, I don’t agree with that attitude, but I get where they’re coming from in some interesting way, and I think we’ll get there down the road. But I do have this feeling that when people say, let’s be rational, you can have two sets of people who are both saying, let’s be rational, let’s stick to the evidence, let’s draw conclusions based on reason, even though they utterly disagree with each other, and they’re using it to bash the other side. So in some sense, do we agree on what it means to be rational for one thing, is there a sense in which all right-thinking people know what they mean when they say let’s be rational?

0:12:00.9 JG: No, I think there’s tons of disagreement about that, about what people mean by that word. Some people mean something like… I don’t… It’s so hard to even try to summarize what different people mean by this. I think in practice, what a lot of people mean is, you should believe what I believe. And so anything that disagrees with me is not rational, that’s like a common colloquial interpretation of the word. There’s kind of technical definitions you could give from philosophy or economics or computer science, even, of what a theoretical rational agent would… How a theoretical rational agent would update their beliefs in response to new evidence, or how a theoretically rational agent would make decisions to maximize their utility, like to pursue their goals effectively.

0:12:54.4 JG: But any attempt to apply those definitions to actual messy human reasoning in the real world is going to involve… It’s going to be messy. So I don’t try that hard to come up with a definition of how to think rationally that is all-comprehensive. But there are heuristics. I think it’s pretty arguable that trying to at least sometimes look for reasons you might be wrong is probably going to make your thinking more rational than if you didn’t do that at all. These seem like pretty weak claims, like well, who would disagree? But I think that the disagreement is just in practice, how often do people actually try to live out these principles. So as you pointed out, there are tons of people who will agree like, yeah, of course, of course, one should do these things, but then in practice people rarely do. So I’m more arguing that we should try to do these things, that you probably agree already in principle are good.

0:13:53.8 SC: Well, you already alluded a couple of times to the idea of being willing to change your mind in the face of evidence, and this is closely related to the constellation of ideas that goes under the umbrella of Bayesian reasoning, right. So let’s assume that there are people in the audience, it’s probably not very likely, but there are people in our audience who don’t even know what Bayesian reasoning is, have never heard the word Bayes. How would you talk about that way of thinking?

0:14:18.7 JG: Sure. So Bayes’ rule is just a simple theorem in probability theory about basically how you should revise your confidence in a particular hypothesis in response to new evidence. And it’s pretty simple, it’s just your confidence should go up in proportion to the ratio of how likely is that evidence, how likely would you guess that evidence is in a world where your hypothesis is true compared to how likely would that evidence be in a world where your hypothesis is false. So, of course, in the real world, we don’t have perfect information, we don’t know exactly how likely is it that this paper in support of wearing masks for coronavirus, how likely would that paper be to exist in a world where wearing masks helps compared to a world where wearing masks doesn’t help. We can only guess at those things.

0:15:16.3 JG: So in practice, Bayesian reasoning just means kind of making an attempt at least to make these guesses and at least get a sense of like the order of magnitude, like is this paper a lot more likely in the world where wearing masks helps than the world where it doesn’t. If so, then it’s kind of strong evidence in favor of wearing masks. Or to take another example, if you read an article about how a woman in a tech company was passed over for a promotion because she was a woman, how strong is that evidence in favor of the hypothesis that the world of tech is really sexist? I would argue it’s not that strong evidence, just because when I imagine the world in which tech companies are not more sexist than average, I think even in such a world, it’s pretty likely that there would be at least one case of a woman who was passed over for a promotion due to her gender, just because in any big industry, that kind of thing’s going to happen at least sometimes. And so I would argue that that article doesn’t provide very strong evidence for that hypothesis about tech being sexist.

0:16:18.8 JG: So that’s one aspect in which I think trying to be Bayesian, trying to at least be mindful of Bayes’ rule when you’re evaluating evidence leads to thinking that, even though it’s not precise and mathematical, is at least, I think, an improvement over default human reasoning.
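For readers who want the rule Julia is sketching written out explicitly, the usual compact form is the odds version of Bayes’ theorem; this is an editorial gloss in standard notation, not something stated in the episode:

\[
\frac{P(H \mid E)}{P(\neg H \mid E)} \;=\; \frac{P(H)}{P(\neg H)} \times \frac{P(E \mid H)}{P(E \mid \neg H)}
\]

Here H is the hypothesis (say, “wearing masks helps”) and E is the evidence (the paper exists). The left side is your posterior odds, the first factor on the right is your prior odds, and the final factor is the likelihood ratio: how likely the evidence would be in a world where the hypothesis is true compared to a world where it is false, which is exactly the comparison Julia describes, and the only factor by which your odds should shift.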

0:16:35.0 SC: Yeah, one of the examples of this kind of reasoning that I find… Like the basic idea that, yes, if the data you’re collecting is more likely under one hypothesis than another, you should increase your credence in the one it’s more likely under. Okay, most people are going to go along with that at an intuitive level, but…

0:16:54.0 JG: In theory. In theory.

0:16:58.7 SC: Well, what I mean is most people will say they’re going to go along with it rather than actually going along with it.

0:17:04.4 JG: Yeah, well, they’ll say they’re going to go along with it, but then in… And maybe this is where you’re meaning to go with this, I don’t mean to preempt you, but I think there are a lot of situations where people explicitly reject that reasoning, if the context is like… Well, I don’t know exactly in what context people reject it, but for example, I’ve talked to people who will agree with that reasoning in theory and in some contexts, but then when we’re talking about… I don’t know. Something political like, is Trump going to win or could the Democrats be right about immigration or something, then they will insist that nothing could change their mind or any evidence that comes up, even if I think it’s way more likely in the world where the Democrats are right, they will insist that it’s zero evidence at all in favor of the Democrats being right. So they will reject the idea of trying to do that kind of estimate in their head, if it leads to conclusions that would force them to downgrade their belief in a particular strongly held hypothesis.

0:18:03.6 SC: Well, the specific kind of mistake that I was going to mention is when there is some evidence, and I would argue that it does go against someone’s belief, and all that they think is necessary is to say no, I can account for that evidence under my belief. And what they don’t do is say, well, if the evidence had been the other way, I would have counted that as huge evidence for my belief. And I think it’s a theorem that if the opposite evidence would have increased your belief, then this evidence should decrease it.

0:18:35.2 JG: Exactly, right. No, that’s a great example of a common sense implication of Bayes’ rule that people very often neglect, and that even if you’re not plugging numbers into a formula, just being aware of the rule and the fact that it implies that if X would have been evidence that you would update on, then the opposite of X has to force you the other way. Even that can, I think, provide a really useful corrective on your intuitive reasoning on its own.
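Sean’s observation that this symmetry is a theorem follows in one line from the law of total probability; again, this is an illustrative aside in standard notation rather than a quote from the conversation:

\[
P(H) \;=\; P(H \mid E)\,P(E) \;+\; P(H \mid \neg E)\,P(\neg E)
\]

Because P(H) is a weighted average of P(H | E) and P(H | ¬E), with weights P(E) and P(¬E), the two conditional credences must sit on opposite sides of P(H) whenever 0 < P(E) < 1. So if observing E would raise your credence, P(H | E) > P(H), then observing the opposite must lower it, P(H | ¬E) < P(H); evidence cannot confirm a hypothesis no matter which way it comes out.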

0:19:04.8 SC: But what you’ve explained… Go ahead.

0:19:07.3 JG: Sorry, I was just going to say an example of this that maybe you’re familiar with, but maybe some of your listeners haven’t heard, is during World War II, there was a lot of concern that Japanese Americans were going to betray the US in support of Japan, and Governor… I think it was Governor Earl Warren in California, was one of the strongest proponents of this Japanese-Americans are treacherous hypothesis, and he was presenting this testimony to Congress that we should intern Japanese-Americans, and someone points out there hasn’t really been any evidence of subterfuge from Japanese-Americans. And he said, ah, well, I take that to be even more confirmation that they are planning subterfuge, because they’re being so sneaky about it, that they’ve managed to hide all the evidence, and you’re just like, okay, so it’s evidence for your view both if there is and isn’t any sign of subterfuge.

0:20:01.0 SC: Well, yeah, I mean, the way that we see that all the time on social media is everyone is criticizing me, therefore, I must be on to something, right?

0:20:08.6 JG: Right. No, that’s a great… Exactly, right. So you’re saying that if everyone was agreeing with you, that would mean you were wrong, like maybe in theory you believe that, but in practice, I’ve never seen anyone actually act that way.

0:20:19.7 SC: But I think what we talked about so far is sort of half of Bayesian reasoning right, like you have some beliefs, you collect more data and you update them, but then the other half is you have some beliefs, right. The people who don’t like Bayesian reasoning always object to the idea that we need to start with a prior and that seems kind of mystical. Where do you get these prior opinions from? Do you have a spiel on that or do you have a take on how it’s necessary to have some prior credences on things?

0:20:50.8 JG: No, not really. I think the claim that trying to be Bayesian or trying to be mindful of Bayes’ rule when you’re encountering new evidence, the claim is not that that will lead to 100% accurate beliefs, it’s just that relative to not using it, it will lead to more accurate beliefs. So everyone, we all start out with prior beliefs and assumptions that just come from, whatever, they come from where we grew up, and the people we talk to, and the whole collection of all the sources we’ve read and the experiences we’ve had, and the entirety of our life experience has led to us having a world view. And then the question is just given that fact that all of us have these prior beliefs that we can’t prove are true, that are probably wrong in lots of ways, given that fact, conditional on that, will we be better off if we then try to use Bayes’ rule in updating on new evidence, like from that starting point of our subjective and probably flawed beliefs.

0:21:53.8 JG: And I’m claiming that I think we probably will be better off if we update in a roughly Bayesian way than if we don’t. But that doesn’t mean we will be 100% right, it just means we’ll probably be more right than if we weren’t using Bayes’ rule.

0:22:06.6 SC: That sounds like an admirable goal. There are a lot of obstacles to that goal, and you point out some of them. I’ve seen you put up this slide that you call the most depressing graph in the world…

0:22:18.0 JG: Oh, yeah, the graph of despair, as I call it.

0:22:20.7 SC: The graph of despair. Why don’t you explain that.

0:22:23.8 JG: So this is a graph made by Dan Kahan, who is a Professor of Cognitive Psychology at Yale Law School, and it was part of a series of studies he did in which he measured… There was a construct that he called, I think he called it science intelligence, but really it was just kind of a collection of measures of people’s scientific knowledge and basic analytical reasoning, like can you do very basic logic puzzles and do you know the answers to basic scientific questions like which is… Is the air in our atmosphere mostly oxygen or mostly hydrogen, things like that.

0:23:05.6 JG: So he collected this measure, and he also collected measures of people’s political affiliation, like liberal or conservative, and their views on various ideologically charged scientific questions like climate change. Do you agree that the Earth is getting warmer, at least in large part due to human activity. And so what he found was, first of all, the unsurprising part was that political views are correlated with your views on climate change, that’s…

0:23:35.8 SC: We all know that, yes.

0:23:37.7 JG: Unsurprising, right. But the surprising part and the part that made me call this the graph of despair was that If you order people on this graph by scientific intelligence, at the low end of scientific intelligence, there’s very little correlation between political views and climate change views, so like low scientific intelligence, conservatives and liberals are both at, I don’t know, something like 30% or 40% agreement with climate change, but then as you go up the scale, as people increase in scientific intelligence, the correlation between political views and climate change views increases, and so the lines of how conservatives feel about climate change versus how liberals feel, they diverge as you go up the scale of scientific intelligence, until in the top tier of the top 1% of scientifically intelligent people, there’s just wild disagreement over climate change, where Democrats are at almost 100% agreement and conservatives have fallen from 30% or 40% down to about 20% agreement with climate change.

0:24:39.7 JG: And so it’s despairing, because there’s this view, this kind of idealistic view, I guess, that if everyone just gets more education in science and more training in logic and critical thinking, then we will all be able to agree on the truth, and actually the graph shows the opposite.

0:25:00.4 SC: Well, the way that you… I don’t remember the numbers from the graph, but the way that you reproduce them right there, it sounds like the conservatives did not decrease by that much in their belief in anthropogenic climate change…

0:25:15.7 JG: That’s true, yeah.

0:25:17.2 SC: Whereas the Democrats would increase by a lot.

0:25:17.3 JG: But the idealistic theory is that they would have, is that everyone would have converged, and so even decreasing a little bit is kind of shocking if you expected them to converge in the other direction.

0:25:25.7 SC: So simply being more knowledgeable or more educated doesn’t make you more rational in that particular way or not as much as we would like.

0:25:35.8 JG: Yeah, so there’s a popular interpretation of this graph that I think is wrong or that I don’t think is supported, and the popular interpretation, which I think even Dan Kahan, the creator of this graph, has probably said on occasion, is that being smarter and being more knowledgeable makes you better at finding defenses of the things you want to believe, so he’s drawing a causal line between your scientific intelligence and reasoning ability on the one hand, and your ability to self-deceive in keeping with your political views, on the other hand.

0:26:10.4 JG: I don’t necessarily think that causal line exists. To me, the most plausible explanation for the graph of despair is just that people with high levels of scientific intelligence tend to be more politically engaged, they tend to be more educated and more educated people tend to care more about politics, and the more you care about politics, the more hyper-aware you are of what your side is supposed to believe and what the other side is not supposed to believe. And so it’s not that you are better able to deceive yourself, it’s just that you… On this particular topic, you have more of a motivation to get the politically correct answer for your side, whereas the people with “low” scientific intelligence just aren’t really that engaged in getting the politically correct answer.

0:26:54.7 SC: They haven’t been told. Yeah, well, I think both of these hypotheses sound actually plausible to me, I’d be interested in collecting more data so you could control for these questions of political engagement and so forth. I do remember, Ezra Klein was on my podcast and we talked about political polarization, and he made the point that it certainly doesn’t go away with increased education, it’s sort of the opposite. In fact, it’s a broader theme where the pithy statement is, nobody knows more about the tensile strength of steel beams than 9/11 truthers. You can really, really, really be an expert in something and have a completely crackpotty conspiracy theory point of view on it, so I think that sounds more like the position you’re pushing against a little bit.

0:27:43.6 JG: I don’t know if that’s in tension with what I’m saying. Basically what I think we can conclude from things like the graph of despair is that at the very least intelligence and knowledge on a particular subject does not protect you from motivated reasoning on that subject. Then the question of whether it actually makes you more inclined to motivated reasoning is kind of a separate question that’s trickier to figure out, but I think it’s even important just to say the weaker claim that it doesn’t protect you, because people tend to feel like it does protect them, that I’m an expert on this topic, therefore, my reasoning is going to be really good and objective and my answers are all going to be correct. Well, if it’s a topic that you feel kind of ideologically passionate about, then that’s probably not true.

0:28:30.4 SC: In some sense, correct me if I’m wrong here, but this is an example of you changing your mind. Didn’t I read maybe, I think in your book, that at first when you first got into this set of questions, you were hopeful, like many people are, that better training in logic and probabilistic thinking will make everyone more rational. And you’ve kind of moved away from that. You have this amazing quote that I want to get a tattoo of: Our judgment isn’t limited by knowledge nearly as much as it’s limited by attitude.

0:29:01.2 JG: Right, yeah. That was kind of a hard lesson I learned over the course of several years, just observing people and observing myself and noticing that knowledge and thinking skill are… They’re like a tool, and you can direct that tool to any end, you can direct it to the end of figuring out what’s really true if you’re motivated to do that, or you can direct it to the end of coming up with air-tight defenses of whatever position you want to believe if you’re motivated to do that. And so just having the tools themselves… I mean, it sounds almost obvious now that I’m saying it in hindsight, and I guess it’s kind of obvious if you look at an internet forum where there’s all these people on the forum who are well-armed with a list of cognitive biases and logical fallacies, but all they do with that list is point out how other people’s reasoning is fallacious, and you never actually see them question anything that they believe or turn those tools on their own reasoning. I guess maybe it’s kind of obvious, but I hadn’t been fully aware of it.

0:30:14.0 SC: I don’t think it’s at all obvious, to be honest, I think you should give yourself more credit there. And in fact, the very first episode of the Mindscape podcast was with Carol Tavris, a social psychologist who’s done wonderful work on cognitive dissonance and how we rationalize all the mistakes that we make. And I asked her a little bit after diagnosing all these issues, what do we do about them, and she wasn’t that programmatic about how to fix all of these problems that we’re in, she seemed to think it was part of the human condition and we could sort of valiantly fight against it, but didn’t have anything specific, really, as a remedy.

0:30:51.3 SC: But here you are, so now we’ve finally, two-and-a-half years later, I think, gotten to what the answers are here, maybe three years later, oh, my goodness, yeah, we’re approaching the three-year anniversary of when I started doing this podcast.

0:31:03.4 JG: Oh, congratulations.

0:31:03.5 SC: We’re not quite there yet, but we’re getting there. And you approach this problem by way of a metaphor, a military metaphor. You want to explain what that is?

0:31:13.3 JG: Yes, so I call it soldier mindset, and that’s my term for, broadly put, the motivation to defend your beliefs or what you want to be true against any evidence or argument that might threaten those beliefs. And the metaphor comes from the fact… It comes from the way we talk about reasoning and beliefs and evidence, so the language is just… It’s strikingly militaristic when you actually look at it. So when we talk about beliefs, we talk about supporting our beliefs or buttressing our position the way you would fortify a fortress, and then we talk about poking holes in the other side’s case or shooting down an argument, as if, you know, yeah, we’re defending our fort against attack.

0:32:05.5 JG: And then when we talk about changing our minds, the language is almost… It’s like we’re admitting defeat. We’ve lost essentially. So just the idea of conceding a point, the word there is to cede, as in ceding territory, like in a battle, and then admitting you are wrong or admitting that someone has a point, to admit is like to let someone into the gates or into your fortress, so it’s all about weakness and defeat. And so I call it soldier mindset, and it’s just a term for what we’ve been talking about, motivated reasoning, also known under other names and in other facets as rationalizing or wishful thinking or denial or, to some extent, confirmation bias.

0:32:52.7 JG: So I wrote a book, not about the existence of soldier mindset, ’cause other people have covered that admirably already, but about an alternative to soldier mindset.

0:33:00.0 SC: Which you’re going to call? So you’re on a roll, so keep going.

0:33:01.4 JG: Which I call the scout mindset, which is the title of my book, The Scout Mindset. And that metaphor is that, unlike the soldier whose role is to attack and to defend, the scout’s role is to go out and explore and see what’s really there, and form as accurate a map of the landscape or of the situation as possible. And the scouts may have preferences, like you may hope to learn that there’s a conveniently placed bridge across the river where you need to cross, but above all, you want to know what’s actually there, you don’t want to draw a bridge on your map where there actually isn’t a bridge in reality. So the scout mindset is, then, my term for the motivation to see things as they are and not as you wish they were.

0:33:48.7 SC: I actually really like this metaphor. Well, one thing that immediately came to mind is, Why do we need a metaphor for just trying to be as rational as possible, couldn’t we just put the case for it as cold-bloodedly rationally as possible. Maybe that’s okay, I’m not going to give you a hard time about that. But the metaphor, even though I’m totally on your side in terms of having the scout mindset, and it’s a good thing, it actually suggests reasons why the soldier mindset, the sort of sticking by your guns mindset, might in some circumstances be the right thing, right? If we take the metaphor too far, I mean, soldiers, if you’re in a battle, they serve a purpose. I probably want more soldiers than I want scouts, right, and part of being a soldier is… And it’s the downside of being a soldier, but it’s the necessary thing is your job is not to think about whether the battle is just or not, right? Your job is not to think about the geopolitical strategy, your job is to be on the ground or in the air or whatever, and try to win your momentary little battle.

0:34:55.2 SC: And if everyone was a political theorist and a just war theorist on the battlefield on one side, I don’t think that side would have a very good chance of winning. They would be hesitant. I always think in terms of rather than battlefield metaphors, I think in basketball metaphors usually, so there are people with a scorer’s mindset, when the ball is in their hands, they’re looking at the basket and they’re going to shoot one way or the other. There are other people who are facilitators and they’re not even trying to score, they’re trying to set people up to score, and both of those mindsets simplify their jobs in some sense. So is there some self-undermining going on in this metaphorical choice, does it suggest that maybe it’s too much work for everyone to have this scout mindset?

0:35:39.0 JG: So a couple of things. First of all, it’s definitely not a perfect metaphor. I don’t know if it would be possible to have a perfect metaphor. I considered other metaphors, like I thought about the judge and the lawyer, for example, which some other people have talked about, but the problem with that was that the judge is very passive, the judge is not going out to learn about the world and get new information, they’re just considering the information that’s put in front of them, so that… That’s kind of not ideal.

0:36:06.2 JG: So yeah, I agree, there’s this kind of, potentially this connotation that in the real world, we actually need soldiers, at least until we’ve somehow achieved world peace, so that… Yeah, you’re right about that. And that’s maybe an imperfection in the metaphor. But then to the substantive point of, well, don’t we need people who are not constantly thinking about things and second-guessing things and questioning, but instead just acting. This is maybe a subtle point, but the metaphor, the terms scout and soldier mindset are supposed to be referring only to how do you decide what to believe. So if you… Like picture an entrepreneur who’s, they’ve got this company they’ve created, and there’s all this uncertainty in starting a company, are we targeting the right market and should our product be improved more before we launch it, etcetera, there’s all these uncertainties that you could be constantly wondering about.

0:37:05.6 JG: And I don’t think that being a scout means constantly questioning everything you’re doing, because that would be untenable. Basically, you have to make the best decision you can in a limited time period, and then you have to act on that decision for some set period of time, either until a good point, when it makes sense to step back and now it’s our quarterly meeting, like what things should we change, or until some new information comes up where you’re like, oh, we’ve… Our sales numbers this quarter were surprisingly low, maybe now is a good point at which we should step back and revise what we’re doing.

0:37:45.0 JG: And so I definitely don’t want to define scout mindset as constantly questioning everything, I mean to define it as having a “map” in your head of reality or of what my business choices are and whether they’re justified, and having your map be uncertain in all the places where it should be uncertain, and then you can pull that map out when you’re making decisions about whether to change what you’re doing, whether to pivot, etcetera, but you don’t need to be constantly staring at the map and never moving. Does that make sense?

0:38:20.3 SC: No, it does make sense. I just want to keep pushing a little bit because… Once again, I’m totally mostly on your side here and I’m just trying to sort of…

0:38:27.6 JG: No, no, pushback is great, I love it.

0:38:31.7 SC: Pushback, right, exactly. Because one of the things that I find hardest about trying to be rational, especially in a social context, is that we do have finite resources, both of our cognition, our attention, our time, a million different ways, and so there are… Especially the one that I’m very familiar with is as a physicist online, there are a lot of crackpot physics things going on out there, and there will be people who just say like, you have a duty to look into this, because if it’s true, it’s going to be amazingly important. The EmDrive.

0:39:05.3 JG: Right. And if you don’t, then you’re close-minded.

0:39:07.5 SC: Yeah, exactly. The EmDrive… Do you know about the EmDrive?

0:39:11.6 JG: That sounds familiar, but I couldn’t explain it to you, do you want to…

0:39:14.6 SC: It’s a propellantless drive. Basically, it’s like they really believe the analog of getting in your car and making it move by pushing the steering wheel forward, so like they’re just shining light on one side of the spaceship from the interior and making it move and somehow conservation of momentum is violated, but they say that in a basement in Texas someone made it work, and I’m like, no, maybe someone did make it work, there is certainly a chance. There’s certainly a non-zero credence that something like that works, but I have all this background knowledge called like the laws of physics and things like that, and so this is beyond the pale to me, this is something I’m not going to pay attention to, and I think that’s a correct attitude. But figuring out exactly where to draw the line between what’s not worth paying attention to and what’s worth being open-minded about seems to be the tricky part.

0:40:06.5 JG: Yeah, no, that’s definitely… That is definitely the tricky part, yeah, and maybe you already understand or agree with this, but I don’t think there’s anything intellectually dishonest or anti-scout mindset in saying that could be true, but it’s like I have limited time and information and that’s below the threshold of worth paying attention to or worth investigating now. And to be honest, I think the people who are urging you, kind of guilt-tripping you into trying to pay attention to their fringe claim by exploiting your desire to be intellectually honest, I think that if they were… I don’t actually think that they would act that way about any random fringe claim, I don’t think anyone can realistically claim that every single fringe claim is worth everyone’s attention, that just doesn’t really make sense, so this feels more like an exploit to me, like an attempt to exploit your desires to be virtuous.

0:41:00.1 SC: Don’t worry, it doesn’t work, my desires to be virtuous are not that strong, I can easily… I can easily ignore it. But if I’m trying to be virtuous, it’s not that I’m going to start paying attention to them, but it’s that I cannot exactly articulate where I should draw the line between crazy ideas worth paying attention to and those that are not, because I can be very honest and say that some of my ideas are seen as crazy by other people, and so how do I make the sales pitch to them that, well, maybe they’re unlikely in your world view, but they’re not so crazy you should ignore them. That seems like a very hard meta rationality problem to me.

0:41:36.1 JG: Yeah, it is hard, and I don’t have a clear-cut answer to it. So sometimes I think in theory, the distinction that we want to be paying attention to, even if we can’t always perceive it perfectly, the distinction is between dismissing something because your prior beliefs indicate that it is very unlikely to be true versus dismissing something because you don’t want to accept it or you don’t want to take it seriously. So one is just like a… It’s just a cognitive… The judgment call is purely cognitive, and on the other hand, it’s motivational. And so sometimes what I’ll do is I will… Sometimes I will pay attention to whether, if there were another theory that I liked that was similarly implausible, would I feel motivated to check that out instead. Like I might, I don’t know, I might like the idea that… I’m trying to come up with something implausible but also desirable…

0:42:53.1 SC: Well, you know, are UFOs really aliens.

0:43:00.0 JG: Yeah…

0:43:00.5 SC: I would like it to be true.

0:43:00.5 JG: I don’t know what my [0:43:02.1] ____ are about that, actually.

0:43:02.2 SC: I would like it to be true. Okay, well, how about the EmDrive, I mean, how about…

0:43:04.1 JG: Okay, maybe like a clearer… Maybe like a better way to express this is just, if you find yourself saying that, like I don’t have… I can’t immediately find any flaws in that argument, but just this seems too low probability to me to look into, then that seems much more intellectually honest to me than what many people do, which is making an argument against the evidence that’s maybe not even very well-supported but they feel like they have to come up with an argument against it to dismiss it.

0:43:35.9 SC: No, that’s a very good point. So let me try to restate it to see if I understood it, that there are times when we’re not going to pay attention, give our resources over to seriously contemplating a certain proposition, and the reason why is just because it is so in conflict with our background beliefs that we’re going to judge it very, very unlikely, and it’s better to offer that as the honest reason why, rather than to sort of come up with some fairly plausible substantive reason that pretends to meet the argument on its own terms.

0:44:11.3 JG: Yes, that’s very well put. I don’t think it fully solves your… It doesn’t fully answer your question, but it’s one of the heuristics that I would use to try to be intellectually honest while still not evaluating every single claim. Maybe another heuristic is just to have at least some criteria for whether to investigate or take seriously a claim that are kind of outside view criteria unrelated to whether you personally want to reject it or accept it. So for example, if some smart and reasonable people whose judgment I trust in other ways think that this claim is at least promising or at least worth considering, then I think that, if I then still am reluctant to even consider the possibility that it might be true or that it might be worth some people investigating, then that seems a little suspicious to me.

0:45:10.2 JG: Yeah, or maybe that suggests another criteria, like if I don’t want anyone to investigate it, and if I get angry when anyone takes it seriously, then that kind of gives the lie to my claim that it’s just not worth my time, because I do want lots of people investigating lots of fringe ideas. And in theory, you should be in favor of that even if you don’t personally have time to look into everything yourself.

0:45:31.4 SC: Well, yeah, no, but this comes exactly into where the rubber hits the road kind of question for people in academia, for example, where you… It’s great to imagine everyone pursuing every crazy idea, but in the real world, you have not just limited resources in terms of your own brain, but you’re hiring people and you’re giving out grant money and things like that, so how do you allocate grant money or jobs to people who are pursuing ideas you think are probably wrong, right? That’s a very difficult thing because, well, you think it’s probably wrong, but it’s a chance that it’s right. I think it’s very hard. It certainly is hard in practice to find people who will say, well, I don’t like this idea, but we should hire them anyway, because who knows what I think, right?

0:46:17.0 JG: Yeah, these are all very hard questions and I would… I guess I’d be shocked if there was an easy clear-cut answer to them, just because in practice, it so often depends on the particular logic of the case in front of you, where sometimes… Like if you can see a very basic way in which the theory contradicts extremely well-established physics, and the person either doesn’t seem to be aware of that fact or their defenses of it seem ridiculous to you, then I think it’s pretty reasonable to conclude that they’re not… That there’s not some hidden brilliance there that you’re missing. But in general, I think we should like… In general, I suspect that the bias is towards being too unsympathetic to weird or fringe ideas, just because I think that’s human nature that we’re… We tend to not like the unpopular maverick who has an idea that tells us we’re wrong or that questions the consensus.

0:47:27.4 JG: I think that our bias is towards us being less open to that than we otherwise should, and so even though I don’t want to say everyone should get all the funding and all the attention, I want to err more on the side of being open as a community, as a scientific community, I want to err more on the side of letting people investigate things that most people think are wrong.

0:47:49.0 SC: Well, I think you said something there that I think is actually very true and very important, I don’t want to let it go by too quickly, which is that one of the techniques we can use when we’re judging these ideas that seem to be unlikely, it seems to me, is not just to think about the idea, but to think about this person who is promoting the idea, right. And to ask ourselves like, when we’re listening to them, do they seem like a fanatical demagogue about this, like they’re just so focused on it, they can’t see the opposite, or do they start by saying, well, I can see why you would think this is crazy, and let me explain to you why it’s not. That’s always much more persuasive to me.

0:48:28.0 JG: I think so too, yeah. And maybe there’s going to be some exceptions. People who are actually right, but just bad at seeing the criticisms, but I think as a general rule, that matches my experience, that people who understand and can articulate the reasons why their theory seems implausible or why other people don’t already agree with them, if they can understand and articulate those things, and still say, I think people are still missing something, that is a good sign to me. I would put that on my list of things that would make me more willing to take seriously an idea that on the face of it seems wrong.

0:49:08.9 SC: Well, and speaking of which, I mean, you have written a book, and I encourage everyone to buy the book, it’s very helpful, I think I blurbed it, in fact, but… And I don’t want to give away all the book, therefore, here on the podcast, but maybe we can give a flavor for it by giving some very explicit helpful hints or strategies or whatever it is for being more scouty.

0:49:31.0 JG: Yes, so tips for being a better scout?

0:49:34.6 SC: I think so. Whatever you think is the best way of doing it. What happens when we put this idea into action.

0:49:43.2 JG: Sure. There are… So I defined scout mindset in this kind of abstract way, as a motivation to see what’s really there, but we can make that more concrete with some examples. So a scout mindset could look like actively seeking out critical feedback about your beliefs or about how you’re doing as a boss or as an employee or as a partner, and actually trying to think about the feedback when you get it instead of immediately jumping to your own defense or the defense of your ideas. Scout mindset could look like, as we were talking about a minute ago, really trying to understand the views of people who disagree with you and understand them well enough that you could explain them and explain them compellingly enough that someone else couldn’t tell that you actually don’t believe that yourself, so you’re not inadvertently straw-manning this idea as you’re explaining it.

0:50:38.8 SC: You do have a hilarious example in the book about a liberal writer who said that conservatives should trust that they completely understood where they were coming from, and then did just a terrible, terrible job of giving the conservative point of view.

0:50:52.4 JG: I was cringing as I read it. I admired this person, this liberal blogger who will go unnamed, I appreciated that they were, at least in theory, trying to understand conservative views, but the attempt was just so bad, it was like a cartoon villain caricature of conservatives where they were like, you know, I get… I understand you, conservatives, you think that the best thing in life is for rich people to become richer, and that’s just like… That’s what’s really valuable. And I understand you think that poor people are all just terrible people and should die. I’m caricaturing a little bit, only a little bit.

0:51:32.0 JG: Right, so that’s like one facet of scout mindset I would say, being able to accurately understand and portray the views of people who disagree with you. Kind of a corollary of that is just being able to point to some critics you have who you think are reasonable, even if you don’t agree with them, and so those could be critics in the sense of people who disagree with your political views, your scientific beliefs, but they could also be critics of your lifestyle choices or your… The industry you work in, like if you work in tech or you work in the military, and I think what people typically do, and what I certainly am tempted to do, is to focus on the unreasonable critics, because everyone has unreasonable critics, there’s always people on the other side who just are terrible.

0:52:18.8 SC: Oh, my goodness, yes.

0:52:21.4 JG: Yeah, so there’s always critics who are either just completely ignorant or arguing in bad faith, and so yes, those exist, but if you want to be a scout, you should also be pushing past that to look for the critics who may not be quite so unreasonable, and even if they’re wrong, maybe they’re wrong in subtle ways where you can understand how an intelligent, thoughtful person could hold that view.

0:52:41.6 SC: Let me… So let me just interrupt you here.

0:52:45.5 JG: I just thought it was like suspicious… I think we should find it suspicious if you can’t find any good critics of your views or your choices, that’s not a good sign.

0:52:51.3 SC: That’s exactly, I like that a lot, and I think that it’s… I want to relate the last two things you said, because they seem slightly related versions of a similar idea. Like one is the idea that you can accurately model your opponent’s point of view, it’s like you can run a little virtual machine in your head that mimics their point of view, and then the other is that you can find a real person who has that point of view that you respect, and I think that both of those are great. All in favor. In the spirit of pushing back, I have people who I respect enormously, very, very smart and I disagree with, and so the example that came to mind when I was thinking about this podcast is actually David Chalmers, because I’m finishing up an article about consciousness.

0:53:40.1 SC: And David Chalmers, a former Mindscape guest, is clearly super-duper rational, super-duper smart, very open-minded, very nice guy, like everyone loves David Chalmers. And I completely disagree with him about consciousness, and I don’t know what he thinks about me in terms of being rational and smart, but he completely disagrees with me about consciousness, and we’ve talked about it and we’ve tried to be rational, and I don’t think that either one of us has changed our minds even a little bit. Is there some worry that the fact that there are such rational people who disagree with us points to the existence of these disagreements being something that goes beyond rationality? Is it a mode of communication failure or just such big differences in priors that no evidence can overcome them? Or how should we think about that?

0:54:30.9 JG: So this is another thing on which my opinion has sort of evolved over the last 10 years. I think it now seems to be much more likely and much more common that two people, even if they’re being perfect scouts, that two people could earnestly try to resolve their disagreement over some factual issue or some, at least partly empirical issue, and fail to reach agreement on what the correct answer is, even if they’re both really genuinely trying and both smart. I think that seemed more implausible to me 10 years ago than it does now, and a lot of the problem is just as you kind of alluded to, our prior beliefs, really do, even if you’re not suffering from motivated reasoning at all, which all of us are to a large extent, but even in the perfect world where we weren’t, your prior beliefs color how you interpret the evidence, and so there are just these tangles of where it’s kind of hard to escape this trap where you and I are both… We can’t resolve our disagreement because whatever evidence we look at, we can’t agree on how to even, what direction that evidence should update us, because the way we interpret the evidence is colored by our prior beliefs, and so that does happen and it’s philosophically interesting and frustrating.

0:56:03.5 JG: I do still think that trying to strip away the motivational component at least improves the situation, and I have made progress on intellectual disagreements that I wouldn’t have made if I hadn’t been trying to remove or account for the motivational aspect. But I still have to accept that stripping away the motivational component of reasoning isn’t going to mean that everyone will be able to agree on everything, because of this issue.
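
To make that point about priors concrete, here is a minimal Bayesian sketch. It is not from the episode or the book; the hypothesis, the evidence, and every number are invented purely for illustration. The two agents even share the same prior, but they assign different likelihoods, which encodes their different background beliefs about what the evidence would look like if the hypothesis were true, and that alone is enough to push them in opposite directions on the same observation.

```python
# Two honest reasoners update on the same evidence with Bayes' rule.
# They differ only in how they interpret the evidence (the likelihoods),
# so the same observation moves them in opposite directions.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior probability of hypothesis H after observing evidence E."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Hypothesis H: "the new study's effect is real."
# Evidence E: "a replication attempt came back weakly positive."

# Agent A thinks weak positives are roughly what real effects look like.
posterior_a = bayes_update(prior=0.5, likelihood_if_true=0.6, likelihood_if_false=0.3)

# Agent B thinks a real effect would have replicated strongly, so a weak
# positive looks more like noise than like a genuine effect.
posterior_b = bayes_update(prior=0.5, likelihood_if_true=0.2, likelihood_if_false=0.4)

print(f"Agent A moves from 0.50 to {posterior_a:.2f}")  # up, toward believing H
print(f"Agent B moves from 0.50 to {posterior_b:.2f}")  # down, away from H
```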

0:56:35.5 SC: No, yeah, absolutely not. And I wonder how much of it is just due to the fact that beliefs don’t appear in isolation, but they’re part of a network of beliefs. In my book The Big Picture, I called them planets of belief, where you have a bunch of beliefs that sort of stick together under some mutual gravitational fields. And if you think that you’re just trying to change someone’s mind about one thing, you’re actually trying to cause this enormous phase transition in a whole set of beliefs, and it’s hard. Maybe it will happen, but maybe it’ll take 10 years to achieve, and maybe it will never happen.

0:57:07.2 JG: So well put, I wish I had thought of that metaphor about phase transitions. Yeah, so often what I’ve realized after the fact is that people’s attempts to change my mind about one node of this graph were doomed, because there were other nodes, kind of in the background, that I wasn’t even aware of, that needed to change first, or needed to change also, in order for that first node to change. So one example that came up as I was doing research for my book was a pastor named Joshua Harris. He’s most famous for writing a book that, if you grew up as a Christian teenager in the ’90s or early 2000s, you probably had on your bookshelf; it’s called I Kissed Dating Goodbye.

0:57:54.2 JG: And he wrote it when he was just, I think, 21 years old, and it’s a book all about how you shouldn’t date, or certainly not have sex before marriage, but shouldn’t even engage in any kind of romantic relationship before marriage, because you have to remain pure for your future spouse. And the book became famous, but it also got a lot of pushback, and a lot of people came forward and said, Joshua Harris, your book completely screwed me up. Either people said, I got into a romantic relationship, or I got married, and it was terrible because I had no past experience to help me make a good choice, or they said, even now that I’m married, even though I remained a virgin before marriage, every time I’m intimate with my wife I feel like I’m doing something terrible, because I’ve adopted this view of sex as something dirty and something that corrupts you.

0:58:51.4 JG: And so anyway, people started sharing all these stories about how they felt this book had been bad for them, and Joshua Harris, who I think is actually an unusually thoughtful and well-meaning person, heard the stories and basically just discounted them for years, because he thought these are exceptions, or these are haters, which is partly true, any very famous book gets haters, and so he was able to dismiss them. And the moment when he first actually took seriously the possibility that his book might have been harming people came after a different experience he had, in the congregation where he was a pastor. Basically, there was a sex abuse scandal, and it came out that several members of the congregation had been sexually abused by other members of the congregation.

0:59:45.2 JG: And Joshua Harris was not involved in the abuse, but he had known about it, and I think he had encouraged the victims to settle it internally and not involve the police. And then he realized after the fact, oh, I was wrong, I should not have made that decision, I should have encouraged them to talk to the police, and he felt terrible about it. And then he had the realization, oh, so you can actually harm people even though you’re really well-intentioned and trying to do good. And maybe that sounds obvious, but I think until you really live through it, you don’t fully appreciate that it can be true. And once he had that realization, then he thought, oh, maybe my book actually harmed people. All the time when people were complaining to him about the book, he had this kind of background belief that hadn’t budged, which is that if you’re well-intentioned then you’re not actually causing harm, and now that that node had shifted, he could actually shift the node that was about the book in particular. Does that make sense? Sorry, that was so long, but I just think it’s so interesting.
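
A toy sketch of the “background node” idea, not anything from the book: a surface belief refuses to flip no matter how much evidence piles up against it, so long as a hidden background belief is still propping it up. The class, the node names, and the numbers below are all invented for illustration.

```python
class BeliefNetwork:
    """Toy two-node belief graph: the surface belief can only flip once the
    background belief that props it up has itself been revised."""

    def __init__(self):
        self.good_intentions_guarantee_no_harm = True   # background node, rarely examined
        self.book_harmed_people = False                 # surface node people argue about
        self.evidence_against_book = 0.0

    def hear_complaint(self, weight):
        self.evidence_against_book += weight
        # The accumulated complaints only land once the background node has shifted.
        if not self.good_intentions_guarantee_no_harm and self.evidence_against_book > 1.0:
            self.book_harmed_people = True

net = BeliefNetwork()
for _ in range(10):                     # years of complaints, all discounted
    net.hear_complaint(0.5)
print(net.book_harmed_people)           # False: the background belief blocks the update

net.good_intentions_guarantee_no_harm = False   # a different experience shifts this node
net.hear_complaint(0.5)
print(net.book_harmed_people)           # True: now the surface node can finally flip
```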

1:00:48.9 SC: It makes perfect sense. I have a favorite metaphor for this also, which I actually got from Jennifer, my wife, who did research on changing people’s minds for a little while, and it’s the metaphor of the plateau. And you’re going to laugh, but it comes from barbecue. It comes from cooking pork shoulder low and slow: you have your barbecue, your grill or whatever, and you keep it at a very low temperature, 225 degrees, and you put the pork shoulder in there for hours, it takes about seven hours to cook. And it’s all very scientific, you put a thermometer in the pork and you watch the temperature go up roughly linearly, and then at around 150 degrees it plateaus, it just stops getting hotter, even though it’s surrounded by a 225-degree ambient temperature. So what’s going on there?

1:01:43.6 SC: Well, scientifically what’s going on is that it is still changing, but it’s giving off steam, so it’s sort of evaporating. It’s still exchanging energy with its environment, but by giving off steam rather than by increasing in temperature. In practice the right thing to do is usually to wrap it in foil so it stops giving off steam, but if you just left it in there, eventually it would start increasing in temperature again.
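
A back-of-the-envelope simulation of that “stall”, purely my own toy model and not from the episode: heat flows in from the 225-degree air, but once the meat is warm enough, evaporating surface moisture carries energy away at roughly the same rate, so the internal temperature plateaus until the moisture budget is spent. All the constants are made up to produce a plateau near 150 degrees; this is an illustration of the mechanism, not a cooking guide.

```python
# Toy model of the barbecue stall: Newtonian heating versus evaporative cooling.
AMBIENT = 225.0      # smoker temperature, degrees F
K_IN = 0.02          # heating rate constant per minute
EVAP_RATE = 1.4      # cooling (degrees F per minute) while moisture remains and meat is hot
EVAP_ONSET = 145.0   # temperature above which evaporation becomes significant
moisture = 250.0     # arbitrary "moisture budget" units

temp = 40.0          # meat starts at fridge temperature
for minute in range(7 * 60 + 1):
    heating = K_IN * (AMBIENT - temp)                               # heat flowing in
    cooling = EVAP_RATE if (temp > EVAP_ONSET and moisture > 0) else 0.0
    if cooling:
        moisture -= 1.0                                             # evaporation spends moisture
    temp += heating - cooling
    if minute % 60 == 0:
        print(f"hour {minute // 60}: {temp:5.1f} F")
# The printout climbs quickly, sits near 150-155 F for several hours
# while moisture evaporates, then resumes climbing once the moisture is gone.
```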

1:02:14.1 SC: And so I like this metaphor because you can try to change the mind of a person or a group of people and not apparently be getting anywhere, nothing seems to be happening, but there is some change going on beneath the surface that will eventually lead to the change that you’re looking for. Or maybe not, you don’t know. But the point is, there’s no direct relationship between the internal state of the person you’re talking to and the external way they’re presenting themselves, right?

1:02:43.4 JG: Right, right. Or even internally. The way it often feels to me when someone is trying to change my mind is that my mind is not changing, my mind’s not changing, my mind’s not changing, and then often there’s kind of a state of confusion, where the evidence that conflicts with my view has accumulated enough that it’s passed the threshold of, okay, well, actually maybe I’m confused about what’s true here. And then either I stay confused, which is often the outcome, or my confusion resolves into a new and changed view of what’s happening. But those are like two phase shifts, I guess, as opposed to a steady linear change in my view.

1:03:30.6 SC: But my hypothesis, which may be completely wrong, is that there are hidden variables, in the quantum-mechanical language; in other words, there are things going on below your own conscious perception that move you closer to or further from that phase transition.

1:03:46.3 JG: Yeah, and do you think those hidden variables are… Are they kind of like the nodes we were talking about, where there’s unconscious beliefs that are kind of being changed, or are they…

1:03:57.0 SC: Yeah, I think… I mean, I think we know that neurons get a little signal, and the signal they get is kind of analog, you get a couple of beeps here and there, but the output is digital: if the frequency of the inputs goes above a certain threshold, then the neuron will fire. So I think what we don’t know is how close all of the different nodes in our network are to flipping over and doing that phase transition. Our external profession is still perfect confidence, and some people really are perfectly confident, while other people are one straw away from the camel’s back breaking.
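
A toy version of that threshold picture; the class and the numbers are my own invention, not anything from the episode. Two units receive exactly the same stream of analog inputs and produce identical all-or-nothing output for a while, yet their hidden internal states, and hence how close each is to flipping, are very different.

```python
class ThresholdUnit:
    """Analog accumulation inside, binary output outside."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.accumulated = 0.0          # hidden internal state, invisible from outside

    def receive(self, signal):
        self.accumulated += signal
        return self.accumulated >= self.threshold   # fires only past the threshold

confident = ThresholdUnit(threshold=5.0)
one_straw_away = ThresholdUnit(threshold=1.0)

for signal in [0.2, 0.3, 0.4]:          # same arguments presented to both
    print(confident.receive(signal), one_straw_away.receive(signal))
    # -> False False each time: outwardly identical, neither appears to budge

print(confident.receive(0.2), one_straw_away.receive(0.2))
# -> False True: one more small nudge flips the second unit,
#    because its hidden state was already near the threshold.
```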

1:04:33.3 JG: Right, right. That’s very well put.

1:04:36.2 SC: I have no idea if it’s true, but this is a theory that I have. Good. One of my favorite chapters, I just want to hit it very quickly, we’re approaching the end here, but I do want to get some more of these helpful hints, if we want to call them that, real-world strategies, on the table. I think my favorite chapter in the book was about wearing one’s identity lightly. Maybe you could say more about that.

1:05:01.9 JG: Yeah, so the last section of the book, the last three chapters, are about identity and how beliefs can become part of our identity, in the same way that people are very familiar with how political beliefs are part of your identity for many people, or religious beliefs. In the sense that when someone disagrees with them, it feels like a personal attack, it feels like someone is stomping on your country’s flag or something like that, and that’s the subjective experience of someone disagreeing with one of those beliefs that are part of your identity. And I think politics and religion are just the tip of the iceberg, there’s so many beliefs that can become part of people’s identities. I lived in San Francisco for many years, and so beliefs about which programming language is best can easily become part of people’s identities and they can argue very passionately and emotionally about that.

1:05:53.7 JG: And so a common piece of advice about this phenomenon comes from Paul Graham, who’s a technology investor and an essayist, he’s written a lot of great essays, and he has this one popular essay called Keep Your Identity Small, where he says, the more things that become part of our identities, the harder it is for us to think clearly, and so all else equal, you should let as few things into your identity as possible. And I think… I love this advice, and I’ve gotten a lot out of it. I just prefer a slight variant on it, which I call holding your identity lightly instead of keeping your identity small, and the variant, the thing I’m changing here, is that I think a lot of people, certainly me after reading his essay, tried to just avoid identifying as anything, like I would say… There’s a movement that I am a big fan of and I’ve worked with called Effective Altruism, which is about basically trying to apply rationality to the project of doing as much good as possible in the world.

1:06:57.8 JG: And so in keeping my identity small, I would refuse to identify as an effective altruist in case it distorted my ability to think clearly. So if anyone asked, are you an EA, I’d be like, no, no, no, I’m not an EA, I’m just a big fan of them, and I work with them and I support their views and agree with them. And it just started to feel a little bit awkward, verbally, to try to explain what I meant. And also, I do think that if you support a cause or a position, then identifying with it publicly can help bring attention and credibility to it; you’re lending your credibility to the movement by identifying as part of the movement. So I don’t want to encourage everyone to never join any movement or political party or cause, because then where would we be?

1:07:51.9 JG: And so really, I think the trick, and I don’t know how different this is from what Paul Graham actually meant, I’m just kind of disagreeing with the way that his advice is often implemented in practice by people. I think what you need to do is to be able to identify with causes and movements and so on that you value while still keeping some distance, emotional ideological distance, from those positions. So that entails, for example, always having a separation in your mind between like, this is what the Democratic Party believes, or this is what liberals believe, or Black Lives Matter, or whatever cause that you’re part of, and here’s what I believe, and maybe there’s a lot of overlap, but they’re still two separate things, and whatever differences there are, I want to notice them. And my support for these positions is contingent, in the sense that I will identify as a feminist up until the point when I think feminism is causing harm or is wrong.

1:08:49.3 JG: And so just always being aware that you’re supporting the belief to the extent that you actually think it’s true and helpful. The support itself isn’t the most important thing; you shouldn’t be supporting it just because it’s your tribe, you should be supporting it to whatever extent you think it’s right.

1:09:11.0 SC: Right, yeah, I mean, you could in principle agree entirely, 100%, with the positions of a certain group of people, but agreeing with their positions might not be the same as identifying as a member of the tribe.

1:09:24.3 JG: Yeah, I mean…

1:09:27.1 SC: It’s a tricky distinction.

1:09:27.1 JG: Yeah, yeah. It’s often internal. Like, one person I interviewed for the book said he used to strongly identify as a feminist, and what that meant in practice was that he would often feel compelled to jump into arguments online where someone was being wrong, and argue with them to defend feminism. And then he made a conscious attempt to hold that identity more lightly. And so now, he says, if someone asks him, are you a feminist, he will probably say yes, because he agrees a lot with people who call themselves feminists, but internally it feels very different. So if someone criticizes feminism, it no longer feels to him like his tribe is being criticized; it’s more like, this group of ideas is being criticized, and maybe I agree or maybe I don’t, but it doesn’t feel like a personal attack on me. And if they’re right, I want to know, and if they’re wrong, then I don’t necessarily have to jump in and angrily argue with them.

1:10:24.8 SC: I think this is a tricky issue for me personally, because I do think that… I mean, I know people who go so far as to just reject all isms, right, like I don’t like labels.

1:10:34.7 JG: That’s part of what I’m talking about.

1:10:36.2 SC: I get it, but it’s just so informationally compact to give me the label on what you are.

1:10:43.1 JG: Yes, exactly, exactly, that’s what I was trying to say.

1:10:49.0 SC: So just at the pragmatic level, I think that labels are always going to be useful in some sense. But that’s sort of a non-emotional response, and there is this much more emotional level where… So Tyler Cowen, our mutual acquaintance who was also a previous podcast guest, once blogged some rules for, I don’t know whether it was being more rational or something like that, but one of his principles is that as soon as people use the words Democrat or Republican, their IQ dips by 20 points or something like that. And I get where that’s coming from, and probably as an empirical statement it’s quite justified, but it seems like a recipe for a kind of quietism: don’t get too invested in real-world attempts to change things for the better, because that counts as a kind of tribalism. Is there some tension between not ruining our own rationality by identifying too strongly, and committing ourselves enough to making the world a better place?

1:11:54.6 JG: So I think what I’m trying to say is that your end goal, the thing you actually care about and are trying to promote in the world, is something like making the world a better place, and the tribes you identify with are hopefully the tribes that you think are going to help with that goal, but you shouldn’t be losing sight of what the primary goal actually is. And so you should be on the lookout for ways in which cheering for the Democrats, or supporting whatever the latest Democratic cause is, doesn’t actually help with the goal that is supposedly your main goal, like helping the world be a better place.

1:12:39.8 JG: And so I’m definitely not saying you shouldn’t identify with a party or a cause, but there are a lot of ways in which doing the thing that feels really satisfying, if you strongly identify as a Democrat or an effective altruist or whatever, is not necessarily the thing that actually helps make the world a better place or furthers whatever your ultimate goal is. The way I think about it is as a graph with two axes: one axis is how strongly does this action validate your identity, and the other axis is how much impact does the action actually have on the world?

1:13:25.6 JG: And so some actions feel really good for your identity but don’t help the world, like arguing with strangers on the internet, or putting a bumper sticker on your car, something that’s not really all that helpful. And some things are actively unhelpful, like attacking groups that are really pretty similar to you but have a 10% disagreement with you about what to do; it can often feel very tempting to attack those groups because they do disagree with you, just not by much, and that leads to a lot of infighting. And I think an important benefit of holding your identity lightly is that it allows you to stay away from those actions that are so tempting for identity purposes, and instead try to find the actions that are actually going to have an impact on the world, which might be things like…

1:14:16.3 JG: One group that I interviewed for the book is an animal welfare group called the Humane League, and they pivoted from their original focus, which was kind of like splashy demonstrations outside the houses of scientists who were experimenting on animals, that was what they were focused on years ago, and when their new director, David Coman-Hidy, came in, he pivoted the organization to focus instead on farm animal treatment, in part because, just running the numbers, there are way more farm animals in the world than lab animals, and so if what you care about is improving the lives of animals, you can have a much bigger impact by focusing on farm animals. And the specific strategy they adopted, which has turned out to be really valuable, is negotiating with large agricultural companies to treat their animals better, so for example, to agree to stop throwing male chicks into grinders after they’re born, which was typical practice in agricultural companies because male chicks can’t produce eggs, but billions of chicks were dying painfully due to this policy.

1:15:19.5 JG: And so anyway, the Humane League has won a bunch of impressive concessions from these large agricultural producers, but that strategy is not very satisfying if what you’re thinking is, I want to fight for my tribe of animal rights activists against the evil companies who are standing in our way, because negotiating with the evil companies is exactly the opposite of what feels satisfying if you’re motivated by fighting for your tribe. And in fact, lots of other animal rights activists resent them for this, for negotiating with the enemy, but what they’re doing is really impactful for their goal.

1:15:55.7 SC: I think that’s super important, actually, and maybe I can just say the same thing over again, because I don’t think I can say it better than what you just did, but… If what you want is to make the world a better place, you’re going to have to work with people you don’t agree with or even like, right?

1:16:09.7 JG: Right, or at least resist the temptation to get stuck in unproductive fights that aren’t actually worth fighting.

1:16:18.1 SC: Yeah, the purity gatekeeping is one example of not necessarily making the world a better place. But at the same time, just ’cause I need to be contrarian here, and maybe this is just because I’m old and patronizing, but with young people who are sort of maximally identifying with some tribe, like they’ve discovered a cause, and it could be one that I agree with or disagree with, but they’re really passionate about it, I have this feeling like I should encourage them to be passionate about their little tribal identity that they’ve discovered, ’cause they’re sincerely trying to make the world a better place in some way. The energy to make the world a better place often comes from these irrational things, is what I guess I’m trying to say, and maybe that’s not right, but it’s George Bernard Shaw, right? All reasonable men know that they can’t change the world, therefore all progress comes from unreasonable men. And maybe that’s not right…

1:17:23.6 JG: I hate that the definition of reasonable [1:17:25.3] ____.

1:17:25.4 SC: Yeah, I understand, but I do think that there are a lot of people who… It goes back to the soldier versus scout thing. I think there’s a lot of young activists on the right and the left who are soldiers, who are committed to a cause and are passionate about it, and even if I don’t think it’s the best way to live your life, I think that maybe it’s a sensible phase to go through, or at least an excusable phase to go through.

1:17:47.5 JG: Yeah, well, again, I think it’s important to separate out being extremely passionate and driven to solve some problem in the world from being extremely passionate and driven to defend your tribe, and to really prioritize the former over the latter. And I’m not going to claim that people who are soldiers, who see the world in black and white and aren’t trying to be objective, never change the world, ’cause clearly many of them have. So it’s not like that strategy never works. And given that that’s kind of the natural state of affairs for humans, I think it makes sense that we see lots of examples of it working.

1:18:30.1 JG: I’m just trying to claim that I think it works better if you can be passionate and driven about impact than if you’re passionate and driven about defending your ideology or your tribe.

1:18:40.8 SC: I think it would be better. I’m on your side, really.

1:18:43.8 JG: It’s hard to prove that, ’cause we’d have to do a study, and I don’t know how we would even begin to do such a study. But at the very least, I wanted to combat the claim that you can’t be an effective activist, that you can’t change the world, while being a scout, because there are some very striking examples of scouts who have changed the world, and I suspect that it’s a more effective strategy, even though I can’t prove that.

1:19:06.6 SC: No, I think it’s a perfectly rational claim that you’re making; whether or not the evidence is for it yet, we’ll gather more evidence. So maybe to wind up, let’s just ask the meta question, which I think you explicitly address in your book: is it rational to be rational, where the word rational is being used in ever so slightly different senses there? If I have some self-interest in being happy or fulfilled or whatever, is being perfectly rational about my view of the world the best strategy to get there?

1:19:35.7 JG: This is a great question, and it’s one of the reasons it took me so long to write this book, ’cause I spent a lot of time thinking about potential challenges to the thesis, about potential downsides to scout mindset. And so to elaborate on the question you asked, there are these two different senses of the word rational. We haven’t really made this distinction so far, even though we’ve been talking about being rational throughout this whole conversation. One of the senses is epistemic rationality, which is basically the sense in which I’ve been using the word throughout this conversation: reasoning as accurately as possible, reasoning in a way that makes your beliefs more accurate over time.

1:20:18.2 JG: Then the other sense of the word is instrumental rationality, which is about whether the decisions you make are effectively achieving your goals or not. And the relationship between epistemic and instrumental rationality is complicated. Many people argue that in fact the instrumentally rational thing to do is often not to be epistemically rational: that if you’ve self-deceived in some way, that will make you happier, or make you better at achieving your goals of, say, changing the world, or succeeding at some hard thing like making your company succeed, and so you should be trying to be a soldier in order to achieve your goals.

1:21:07.0 JG: And I think this is wrong in a number of interesting ways. But the most important point I want to make is just that in all of the specific cases in which I’ve seen people say, oh yeah, you have to be self-deceived, you have to be epistemically irrational in order to, fill in the blank, be happy, or be motivated to do something hard, or change the world, in all of those cases I have seen other ways you can achieve those goals without having to deceive yourself. And so I think a large part of what’s going on here is just that people are neglecting the possibility of finding other strategies to get the things they want; they’re giving up too soon and assuming, well, you’ve got to deceive yourself if you want to be happy or motivated.

1:21:58.1 JG: And so one thing that I spent a lot of time on in the book is pointing out that you don’t have to. Let’s actually take an example from Carol Tavris’ book, ’cause you mentioned her earlier in the conversation. Towards the end of that book, Mistakes Were Made, But Not By Me, which is all about self-justifying behavior, she and her co-author kind of have this fatalistic attitude of, well, we have to self-justify, because if we didn’t convince ourselves that we had made all the right choices in the past, then we would be consumed by regret and guilt and depression. And I read that and I was like, are those our only two options? Do we really have to choose between convincing ourselves, against the evidence, that we’ve made all the right choices in the past, versus being depressed and miserable? Surely there is a third option here, which is to find a way to be okay with the fact that we’ve made mistakes in the past and still be happy despite that.

1:23:02.4 JG: And I think this third way exists in basically all of the cases that people have presented to me, where they say you have this choice between being happy and motivated or being rational. I think there’s always a way to, at the very least, make peace with your situation that doesn’t require you to tell yourself false things.

1:23:22.1 SC: I think a place where this comes up in my own experience with scientists is, I’ve known scientists, really super-duper good scientists, who will explicitly say that the best strategy, for let’s say a theoretical physicist, is to be their own ideas’ biggest advocate, because if everyone puts forward the best possible case for their own perspective, the free market will sort everything out. Whereas I tend to think that good theoretical physicists should be their own ideas’ biggest critics, so they put forward, here’s all the reasons why this could be wrong, but I believe it anyway, as we were discussing before. And I suspect that the ones who say, well, I should be my own theory’s biggest advocate, are sort of cheating in the way that you just pinpointed: they just kind of enjoy being their own theory’s biggest advocate, rather than thinking that’s the most rational thing to do.

1:24:17.0 JG: Well, it can be a good strategy if your goal is not to come up with a true theory and have that theory be accepted. If your goal is just to get a lot of attention or to get… I don’t know, prestige or…

1:24:34.2 SC: But I think that’s not true. I think that… Let’s give them the credit that they’re actually trying to find the truth, ’cause I think that is true. These are people who will change their minds, like just because they’re their own theory’s biggest advocates today, doesn’t mean they’re at all consistent with what they were saying last year.

1:24:47.5 JG: Right, right. Okay, so really this question is just: setting aside one’s own selfish goals for attention or prestige or whatever, should we want other people to be their ideas’ biggest advocates, is that how we want science to work? And in theory, that could work if everyone were playing the role of an ideal jury, hearing the best possible arguments on both sides for the ideas, but in practice that doesn’t actually seem to be the way it works. It seems like very few people are taking the time to delve into the nitty-gritty of the arguments for and against the theories. I mean, I’m speaking mostly about social science here, ’cause I’m more familiar with that than with physics, so maybe it’s a little different in physics, but…

1:25:43.2 SC: It’s not.

1:25:44.1 JG: In practice, ideas get attention, and they get publications, and they get TED Talks and articles, without all that much vetting of whether they’re really sound. And so if you have people who are just advocating for their ideas without worrying too much about whether those ideas are true, I don’t see that process really weeding out the true from the false ideas very well. Actually, maybe this is an interesting distinction between social science and physics: I think in physics it’s maybe clearer, more often, whether an idea is true or false, you can often do pretty conclusive tests, and so if you have someone really pushing hard for an idea and then it fails in experiments, maybe that’s just kind of clear-cut and the matter is settled. Whereas in social science, the experiments are so rarely clear-cut that ideas rise and fall much more based on the advocacy of their proponents than based on the facts from the world. And so the problem is worse in social science.

1:26:40.2 SC: It strikes me that the rough life cycle is very, very similar in both cases, but the time scales are much quicker in physics, because like you say, there can be very definitive evidence, you go, alright, I guess I will move on. Or even a definitive theoretical argument, whereas in social science, not to mention the humanities, you can cling to your increasingly less likely position for a very long time without it going quite to zero.

1:27:04.8 JG: Right, yeah, I guess the time scales, maybe that’s what I’m pointing at. Although, I don’t know, given the quality of the evidence we can get even in the best possible cases in psychology, I feel like you can do a bunch of experiments and get whatever answer you want with a little statistical wizardry on either side, so I’m not even sure that long time scales would help the problem in the social sciences. If everyone is just being a soldier for their ideas, I’m not sure that process would weed out the true ideas from the false, even over long time scales.

1:27:45.0 SC: That’s right, so that speaks to the idea that individuals, not just communities, should be tough on their own ideas, which I think is…

1:27:51.3 JG: Ideally, yeah, I think so.

1:27:53.1 SC: I think it’s a better way to go, and just to sum it up, you do want to make a case for the emotional rewards of being like this, it’s not just an intellectual reward, it’s a satisfying way to live.

1:28:05.9 JG: I do, yeah. I mean, I’m not going to deny that it can be comforting to tell yourself false but convenient things, but I, and many people I know, find it very freeing to be able to follow the evidence where it leads, without feeling like you have to believe such and such in order to be happy, or in order to feel good about yourself, or to be motivated. I think it’s very liberating to not feel like you have to believe something in order to get what you want. And there are lots of rewards, the joy of curiosity and figuring things out, that I think come from scout mindset; regardless of whether the conclusion is something you had wanted to be true, there’s a kind of thrill in seeing the world more clearly than you had before.

1:29:03.8 JG: And so as always, I’m trying to be reasonable and nuanced, and I’m not going to claim that soldier mindset has no benefits, but I think the benefits of scout mindset are under-appreciated, and that if more people appreciated those benefits, then they would find scout mindset much more doable than they think it is.

1:29:18.5 SC: It sounds very good to me, but I promise that if contrary evidence comes along, I will change my mind about that, so…

1:29:23.9 JG: I would ask for nothing less from you, Sean.

1:29:26.6 SC: Julia Galef, thanks so much for being on the Mindscape podcast.

1:29:29.1 JG: Thank you, it was so much fun. Thanks so much.

[music][/accordion-item][/accordion]

7 thoughts on “143 | Julia Galef on Openness, Bias, and Rationality”

  1. Thank you for a wonderful and practical exploration of how we can get in the way of our sense of objectivity. Ms. Galef accepted challenging questions and provided thoughtful responses. I believe she exhibited the Scout mindset in her remarks and did not fall into a soldier’s mindset of defensiveness at all costs.

    However, throughout, I kept struggling with one complication in your discussion: prior beliefs. Ms. Galef addresses this matter briefly: “we all start out with prior beliefs and assumptions that just come from, whatever, they come from where we grew up, and the people we talk to, and the whole collection of all the sources we’ve read and the experiences we’ve had, and the entirety of our life experience has led to us having a world view.” I almost wish the conversation had developed in two parts, with part one exploring more fully from whence prior beliefs come. That’s a nice list, but deploying the word “whatever” tends to trigger further inquiry. As robust as the rest of the discussion was about biases and irrationality, getting at the origins of one’s held beliefs seems to warrant more attention.

    PS: thanks for the transcripts! Great feature I wish all intelligent and probing podcasts offered.

  2. Loved this podcast very much. Coming from computer science, I have my own perspective on rationality, which has a lot to do with immediacy: the capacity to make the best choice in due time… Philosophically, that statement presupposes a strong faith that your immediate experience is honest (or at least valuable); being Bayesian requires that you believe your current experience is telling you the truth, or at least presenting an unbiased sample of reality. It also means that, on average, the people that interact with you in your present are striving for the same goal. My take on soldiers (as opposed to scouts) is that they do not trust their immediate reality; they have already made a choice, and believe that their experience is adversarial, that is to say: the people they interact with have an agenda, and will try to convince them of their point of view by any means necessary. In that context, your best strategy is to be defensive and dismissive of current arguments. I personally do not share this belief but find it understandable.

    I liked Sean Carroll’s candid comments on his dialog with Chalmers a lot, very honest and generous, thank you! My personal take on this argument is that it’s true! I admire you both, and there are irreconcilable differences in your conceptions of consciousness; to me that suggests that both are wrong (or right?), but it’s a clear manifestation that nobody has yet cracked, to everybody’s satisfaction, the (hard, elusive, obvious?) problem of consciousness.

  3. Cleon Teunissen

    The start of the conversation reminded me of a bit of wit. There is this group of people who enjoy wordplay and thoughtplay, and they publish phrases. One of them is this admonition: “Be reasonable: look at it from my point of view.”

  4. I’ve always enjoyed her talks and Youtube channel, and it was a pleasure listening to your conversation. The two of you could have a competition for most depressing comment section. It’s really neck-and-neck.

  5. Just finished your podcast with Julia Galef. It was informative as always. I was struck by what you were saying about the disagreement you and David Chalmers have about consciousness, and why two highly intelligent, highly educated, rational people so totally disagree. My experience has been that it happens most often around topics that actually have little in the way of evidence, so that you are both forced to rely on your prior biases. And face it, there is no clear evidence on either side of that particular debate about the origins of consciousness.
    thanks again for my favorite podcast,

  6. SC- “I know people who go so far as to reject all ‘-isms,’ like ‘I don’t like labels’ … I get it, but it’s just so informationally compact to give me the label on what you are.”

    But that is the problem. Once you slap on a label, it comes with the baggage of that label (that is what comes with being ‘informationally compact’). Once that happens, people are happy to fill in the gaps for what you believe, whether you believe those things or not. They may do that anyway, but with certain labels you are just aiding them in that process. Even those trying to be sincere in a discussion may be distracted by a label and what they think it implies about your beliefs and attitudes.

  7. Just wanted to give props for countering your guest’s casual dismissal of deconstructing historically held ‘truths’ tied to assumptions of white supremacy as non-rational… It was a casual dismissal, but I find this kind of disregard more and more common in conversations about stoicism in general these days (and have heard more of these casual dismissals from some other white guests). It was worth calling out and acknowledging that our conversations around these concepts usually assume a certain kind of life experience.

