Why Apologists Need a Scout Mindset: Lessons to be Learned from Julia Galef

Have you ever wondered why some people are able to think more clearly about the world than others, forming more balanced and nuanced views about controversial topics? Have you ever pondered which thinking patterns are most conducive to good reasoning and well-supported conclusions, and how one might avoid the pitfalls of confirmation bias and self-deception? In her book The Scout Mindset: Why Some People See Things Clearly and Others Don’t, Julia Galef (host of the podcast “Rationally Speaking” and co-founder of the Center for Applied Rationality) attempts to answer these questions. [1] In the first half of this essay, I shall summarize Galef’s insights; in the latter half, I shall discuss what lessons we as Christian scholars and apologists can glean from the book.

A Summary of The Scout Mindset

Galef distinguishes between what she dubs “the soldier mindset” and “the scout mindset.” According to Galef, the soldier mindset, also known as motivated reasoning, leads us to loyally defend the stronghold of our belief commitments against intellectual threats, come what may. This involves actively seeking out data that tends to confirm our beliefs, while rationalizing or ignoring contrary data that tends to disconfirm them. The scout mindset, by contrast, is the attempt to determine honestly how the world really is – as Galef defines it, it is “the motivation to see things as they are, not as you wish they were,” (p. ix).

For the one in soldier mindset, argues Galef, reasoning is like defensive combat – “it’s as if we’re soldiers, defending our beliefs against threatening evidence,” (p. 7). For the soldier, to change one’s mind – to admit that one was wrong – is seen as surrender and failure, a sign of weakness. One’s allegiance is to one’s cherished beliefs rather than to the truth, even if those beliefs conflict with the balance of evidence. For the soldier, determining what to believe is done by asking oneself “Can I believe this?” or “Must I believe this?”, depending on one’s motives. For the one in scout mindset, by contrast, reasoning may be likened to mapmaking, and discovering that you are wrong about one or more of your beliefs simply means revising your map. Thus, scouts are more likely to seek out and carefully consider data that tends to undermine their own beliefs (thereby making their maps a more accurate reflection of reality), deeming it more fruitful to pay close attention to those who disagree with their opinions than to those whose thinking aligns with them.

The prevalence of soldier mindset in society today is aptly demonstrated by a sobering study, cited by Galef, in which participants were tested on their “scientific intelligence” with a set of questions. [2] Questions were divided into four categories – basic facts; methods; quantitative reasoning; and cognitive reflection. Remarkably, when conservative Republican and liberal Democrat participants were also asked whether they affirmed the statement that there is “solid evidence” of recent global warming due “mostly” to “human activity such as burning fossil fuels,” there was a positive correlation between “scientific intelligence” and polarization of opinion. That is to say, the higher one’s scientific intelligence, the more likely a liberal Democrat was to affirm the statement and the more likely a conservative Republican was to disagree with it. This is not the only study to reveal the tendency for more educated people to diverge in opinion on controversial topics. Another study surveyed people’s views on ideologically charged topics, including stem cell research, the Big Bang, human evolution, and climate change. [3] Its authors found that “Individuals with greater education, science education, and science literacy display more polarized beliefs on these issues,” though they found “little evidence of political or religious polarization regarding nanotechnology and genetically modified foods.” Galef summarizes the implications of these studies: “This is a crucially important result, because being smart and being knowledgeable on a particular topic are two more things that give us a false sense of security in our own reasoning. A high IQ and an advanced degree might give you an advantage in ideologically neutral domains like solving math problems or figuring out where to invest your money. But they won’t protect you from bias on ideologically charged questions,” (p. 48).

Though there is an element of scout and soldier in all of us, Galef argues, “some people, in some contexts, are better scouts than most,” being “more genuinely desirous of the truth, even if it’s not what they were hoping for, and less willing to accept bad arguments that happen to be convenient. They’re more motivated to go out, test their theories, and discover their mistakes. They’re more conscious of the possibility that their map of reality could be wrong, and more open to changing their mind,” (pp. 14-15). On the flip side of the coin, often “[w]e use motivated reasoning not because we don’t know any better, but because we’re trying to protect things that are vitally important to us – our ability to feel good about our lives and ourselves, our motivation to try hard things and stick with them, our ability to look good and persuade, and our acceptance in our communities,” (p. 26). For example, if we are being honest, how often do we, when considering a claim, “implicitly ask ourselves, ‘What kind of person would believe a claim like this, and is that how I want other people to see me?’” (p. 23). Such thinking fuels soldier mindset. In practice, we cannot eliminate soldier mindset from our reasoning processes entirely. After all, it is our default mentality. By nature, we like having our beliefs confirmed. But we can take intentional steps towards cultivating more of a scout mindset.

What are some of the key characteristics that distinguish scout from soldier mindset? In chapter four, Galef gives five features that define a scout. The first is the ability to tell other people when you realize that they were right. Galef caveats this quality by noting that “Technically, scout mindset only requires you to be able to acknowledge to yourself that you were wrong, not to other people. Still a willingness to say ‘I was wrong’ to someone else is a strong sign of a person who prizes the truth over their own ego.” The second quality is reacting well to criticism. Galef explains, “To gauge your comfort with criticism, it’s not enough just to ask yourself, ‘Am I open to criticism?’ Instead, examine your track record. Are there examples of criticism you’ve acted upon? Have you rewarded a critic (for example, by promoting him)? Do you go out of your way to make it easier for other people to criticize you?” (p. 52). The third quality that marks out a scout is the ability to prove oneself wrong. Galef asks, “Can you think of any examples in which you voluntarily proved yourself wrong? Perhaps you were about to voice an opinion online, but decided to search for counterarguments first, and ended up finding them compelling. Or perhaps at work you were advocating for a new strategy, but changed your mind after you ran the numbers more carefully and realized it wouldn’t be feasible,” (p. 54). The fourth feature of scout mindset is to avoid biasing one’s information. “For example,” writes Galef, “when you ask your friend to weigh in on a fight you had with your partner, do you describe the disagreement without revealing which side you were on, so as to avoid influencing your friend’s answer? When you launch a new project at work, do you decide ahead of time what will count as a success and what will count as a failure, so you’re not tempted to move the goalposts later?” (p. 56). The fifth feature that Galef lists is being able to recognize good critics. Galef comments, “It’s tempting to view your critics as mean-spirited, ill-informed, or unreasonable. And it’s likely that some of them are. But it’s unlikely that all of them are. Can you name people who are critical of your beliefs, profession, or even choices who you consider thoughtful, even if you believe they’re wrong? Or can you at least name reasons why someone might disagree with you that you would consider reasonable (even if you don’t happen to know of specific people who hold those views)?” (p. 57). In summary, Galef notes, “Being able to name reasonable critics, being willing to say ‘The other side has a point this time,’ being willing to acknowledge when you were wrong – it’s things like these that distinguish people who actually care about truth from people who only think they do,” (p. 57).

Chapter 5 of the book offers five tests of bias in our reasoning. The first test is the double standard test, which essentially asks whether we apply the same standards to ourselves that we would apply to others. The second test is the outsider test, which attempts to determine how you would assess the same situation or data if you had no vested interest in the outcome. The third test is the conformity test, which attempts to discern the extent to which one’s opinion is in fact one’s own. Galef explains, “If I find myself agreeing with someone else’s viewpoint, I do a conformity test: Imagine this person told me that they no longer held this view. Would I still hold it? Would I feel comfortable defending it to them?” (p. 66). The fourth test is the selective skeptic test – “Imagine this evidence supported the other side. How credible would you find it then?” (p. 68). The final test is the status quo bias test – “Imagine your current situation was no longer the status quo. Would you then actively choose it? If not, that’s a sign that your preference for your situation is less about its particular merits and more about a preference for the status quo,” (p. 69).

Another thing that marks out a scout, according to Galef, is one’s attitude towards being wrong. Scouts, explains Galef, “revise their opinions incrementally over time, which makes it easier to be open to evidence against their beliefs,” (p. 144). Further, “they view errors as opportunities to hone their skill at getting things right, which makes the experience of realizing ‘I was wrong’ feel valuable, rather than just painful,” (p. 144). Galef even suggests that we should drop the whole idea of a “wrong confession” altogether and instead talk about “updating”. Galef explains, “An update is routine. Low-key. It’s the opposite of an overwrought confession of sin. An update makes something better or more current without implying that its previous form was a failure,” (p. 147). Galef points out that we should not think about changing our minds as a binary matter – rather, we should think of the world in “shades of grey”, and think about changing our minds in terms of an “incremental shift” (p. 140). Galef notes that thinking about revising one’s beliefs in this way makes “the experience of encountering evidence against one of your beliefs very different” since “each adjustment is comparatively low stakes” (p. 140). For example, “If you’re 80 percent sure that immigration is good for the economy, and a study comes out showing that immigration lowers wages, you can adjust your confidence in your belief down to 70 percent,” (p. 140).
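To make the arithmetic of such an incremental shift concrete, one might gloss Galef’s example in explicitly Bayesian terms. The following calculation is my own illustration rather than anything presented in the book, and the likelihood values are merely assumed for the sake of the example: suppose you judge the wage study (E) somewhat more likely to appear if your belief (H) is false than if it is true. Bayes’ theorem then yields roughly the modest downward revision Galef describes:

\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)} \;=\; \frac{0.4 \times 0.8}{(0.4 \times 0.8) + (0.6 \times 0.2)} \;\approx\; 0.73
\]

Here the prior P(H) = 0.8 corresponds to the initial 80 percent confidence, and the assumed likelihoods P(E | H) = 0.4 and P(E | ¬H) = 0.6 are purely illustrative. The point is simply that a single piece of contrary evidence typically warrants a measured adjustment of one’s confidence, not a wholesale reversal.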

Galef also points out that, when it comes to intentionally exposing ourselves to content representing the ‘other side’ of a debate in which we are interested, people tend to make the mistake of always ending up “listening to people who initiate disagreements with us, as well as the public figures and media outlets who are the most popular representatives of the other side,” (p. 170). However, as Galef explains, “Those are not very promising selection criteria. First of all, what kind of person is most likely to initiate a disagreement? A disagreeable person. (‘This article you shared on Facebook is complete bullshit – let me educate you…’) Second, what kind of people or media are likely to become popular representatives of an ideology? The ones who do things like cheering for their side and mocking or caricaturing the other side – i.e., you,” (pp. 170-171). Instead, Galef suggests, “To give yourself the best chance of learning from disagreement, you should be listening to people who make it easier to be open to their arguments, not harder. People you like or respect, even if you don’t agree with them. People with whom you have some common ground – intellectual premises, or a core value that you share – even though you disagree with them on other issues. People whom you consider reasonable, who acknowledge nuance and areas of uncertainty, and who argue in good faith,” (p. 171).

Lessons We Can Draw from The Scout Mindset

To what extent are we, as Christian scholars and apologists, cultivating a scout mindset? Too often debates between theists and atheists devolve into tribalism, an ‘us vs. them’ mentality, and a smug condescension towards those who disagree with us. But what if we saw those with whom we disagree not as enemies but as colleagues in our quest to attain a better map of reality? Our critics are those who are best placed to discover flaws in our own reasoning, flaws which may be invisible to us. We ignore them at our peril. By listening carefully to our critics, we can construct a more nuanced, more robust worldview. And which critics of our faith are we seeking out to represent the dissenting view? Are we primarily engaging with popular but less-than-nuanced critics of Christianity, or are we actively seeking out the very best, most erudite and well-informed critics of our faith, even if they are less well known? Can we name some of our critics as honest and thoughtful? How are we positioning ourselves to find out that we are wrong, if we are in fact wrong? If we are wrong about one or more of our beliefs, can we honestly say that we value truth enough to want to know? And how do our answers to the foregoing questions bear on that last question?

Perhaps at this juncture it should be clarified what exactly apologetics is, since there is regrettably much confusion surrounding this word, both inside and outside of the Christian community. It is commonly thought that the exercise of apologetics is contrary to open-ended inquiry in which the conclusion is not stipulated a priori. However, this view is quite mistaken. While apologetics is not identical to open-ended inquiry, it is continuous with it: apologetics is what happens after the results of open-ended inquiry are in, and the time has come to publicize our interpretation of the data. Thus, though the term is seldom used in this context, every publication of a scientific paper is an exercise in apologetics. Charles Darwin’s Origin of Species was an exercise in apologetics, since in it Darwin sought to persuade his readers of his interpretation of the observations he had made, not least those on the Galápagos Islands. It is common to think of apologists as playing the role of a criminal defence attorney who is committed to defending his client, come what may. In reality, however, a more apt parallel is to an investigative journalist, reporting for popular consumption the results of a fair and balanced inquiry.

Being an apologist for the gospel is no light responsibility. We are asking people to pledge their allegiance to Jesus Christ and dedicate every aspect of their lives to His service. This may cost them greatly – even their lives. The weight of this responsibility is emphasized by the apostle Paul himself, who stated that, if Jesus was not in fact raised, “We are even found to be misrepresenting God, because we testified about God that he raised Christ, whom he did not raise if it is true that the dead are not raised,” (1 Cor 15:15). We therefore owe it to those to whom we preach to study diligently the facts and arguments on both sides of the debate, so as to satisfy ourselves that the gospel is in fact true. We also owe it to those with whom we share the gospel to inform them as fully as possible concerning the facts of the case. Too often I have seen apologists present popular arguments for Christianity while omitting relevant facts that undermine the force of those arguments. For some examples of this, see my recent conversation with Wesley Huff on arguments Christians should NOT use. Whenever you encounter an argument that supports a position you like, you should always, before publicly repeating it, conduct a thorough search for any relevant data that might reduce its evidential force. At the very least, you should determine whether any academic publications, especially those critical of your beliefs, have already addressed the argument. This is but one of several ways in which you can reduce the negative effects of confirmation bias on your reasoning.

What other steps can we take to mitigate confirmation bias? I try to make it my habit to expose myself to more material – whether books, articles, podcasts, videos or other media – that argues against my beliefs than material that argues for them. This reduces the likelihood of my fooling myself, forces me to think more deeply and carefully about my beliefs, and pushes me to develop a more nuanced expression of them. It also puts me in a strong position to find out that I am wrong, if I am in fact wrong about any of my beliefs. A first step towards stepping outside of your intellectual echo chamber is recognizing that smart people can argue in good faith and yet disagree with you.

I am sometimes asked how a newcomer to religious debates may discern which apologists to listen to and whom to disregard. Of course, the difficulty here is that, in order to discern which apologists can be trusted to give reliable content, one must already have attained a certain level of knowledge about the subject. But in order to arrive at that threshold of knowledge, one must first determine whom to receive information from. How might we escape this dilemma? One criterion of several that I often give is to be wary of anyone who asserts that all of the evidence supports their own personal view and that none of it tends to disconfirm it. Whenever anyone tells me, concerning any complex topic (whether that be theism, Christianity, evolution or anything else), that all of the evidence is on the side of their own personal view, it leads me to reduce my confidence in their objectivity in handling the data, and I begin to suspect that confirmation bias is particularly prominent in that individual’s reasoning process. It is an intellectual virtue to be able to admit that one or more pieces of evidence tends to disconfirm your own view. Of course, presumably you also maintain that the evidence that tends to confirm your view is stronger, on balance, than that which tends to disconfirm it. Nonetheless, recognizing the existence of difficult or anomalous data is a mark of scout mindset.

And how might we go about determining whether a given datum confirms or disconfirms our Christian beliefs? For each piece of data we encounter, we should ask ourselves whether that datum, considered in isolation, is more probable given Christianity or given its falsehood. If the former, then it is evidence that confirms Christianity; if the latter, then it is evidence against it. Too often I see people reason that, if a set of data can be made compatible with their beliefs, then they have neutralized the objection to those beliefs. However, this approach is quite simplistic. It is nearly always possible to make discordant data compatible with your beliefs. But the data may still be better predicted on the supposition that your beliefs are false than on the supposition that they are true, in which case you should lower your confidence in those beliefs. The appropriate question to ask, when confronted with discordant data, is not “Can I believe I am still right?” Galef rightly points out that “Most of the time, the answer is ‘Yes, easily,’” (p. 141). Rather, we should ask to what extent our confidence in our beliefs needs to be updated in response to the new data.
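The likelihood comparison described above can be stated more formally. What follows is a standard Bayesian formulation, offered here merely as an illustration of the principle rather than as anything drawn from Galef’s book. In the odds form of Bayes’ theorem, a datum E counts as evidence for a hypothesis H just in case E is more probable given H than given its negation:

\[
\frac{P(H \mid E)}{P(\neg H \mid E)} \;=\; \frac{P(E \mid H)}{P(E \mid \neg H)} \times \frac{P(H)}{P(\neg H)}
\]

If the likelihood ratio P(E | H) / P(E | ¬H) exceeds 1, the datum raises the odds on H; if it falls below 1, it lowers them – and this is so even if E remains perfectly compatible with H. Mere compatibility, in other words, is not the relevant test; the relevant test is which hypothesis better predicts the datum.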

Another criterion of a credible apologist is a willingness to offer critiques of arguments presented by others on one’s own side of the debate. Are they even-handed, subjecting arguments for their own view to the same scrutiny as those put forward by the other side? This reveals that they are discerning and have a genuine concern for factual accuracy. How one responds to criticism, both friendly critique and that from dissenting voices, is also a measure of one’s concern for the correct representation of information. An ability to publicly retract false or misleading statements and issue corrections goes a long way toward establishing one’s credibility. When we encounter a new contributor to the debate, with whose work we have not hitherto interacted, we should also fact-check their statements, going back, where possible, to the primary sources – especially when they stray into territory outside our own domain of expertise. If they sustain a track record of reporting information reliably and fully informing their audience about the relevant facts, we ought to be more inclined to trust them as a credible authority. If, on the other hand, they have a habit of getting things factually wrong, we should be very hesitant to take anything they say at their word.

One should also be wary of apologists who exaggerate the strength of their arguments, pushing the data beyond what it can support. It is always better to understate the merits of one’s argument and pleasantly surprise by overdelivering than to overstate them and disappoint by underdelivering. This is why in my writing and public speaking I prefer more cautious statements like “this tends to confirm” or “this suggests” to bolder ones like “this proves” or “this demonstrates.” Similarly, I will speak of being “confident” rather than “certain” of my conclusions.

My enthusiastic advocacy for integrity and nuance in apologetics, together with my insistence on subjecting arguments advanced in support of Christianity to the same scrutiny to which we would subject contrary arguments, has on occasion been misconstrued – by atheists as well as by Christians – as an indication that I am losing confidence in the truth of Christianity. This does not at all follow, and, frankly, it saddens me that Christian apologetics has come to be associated, in the minds of many, with a soldier rather than a scout mindset. Clearly, it is possible to be convinced by the evidence that Christianity is true and yet still be committed to the honest presentation of information. It is also possible to believe that Christianity is well supported while also maintaining that many of the arguments advanced in its support are fundamentally flawed or dramatically overstated. I believe it is a virtue rather than a vice to recognize one’s own confirmation bias and thus take steps towards reducing its negative effects on one’s reasoning. The principles that I have advocated in this essay are germane to apologists of any position, regardless of how convinced of that position they are. Without them, it is too easy to deceive ourselves, apply double standards, cherry-pick data, and inoculate ourselves against finding out that we are mistaken about one or more of our beliefs.

One may of course object that, if unsound data or overstated arguments lead people to embrace the gospel, then the end justifies the means. I recall complaining, on more than one occasion, about the presentation of factually erroneous information in defence of Christianity at a university-affiliated Christian society in the United Kingdom. The response with which I was met, lamentably, was that it was very unlikely that any of the other attendees would know enough about the subject to pick up on the errors in the presentation, and that we should rejoice that they heard the gospel. This thinking, however, is flawed for at least two reasons. First, we claim to represent the one who identified Himself as truth itself (Jn 14:6). Plenty of Biblical texts condemn the employment of deceptive methods (e.g. Exod 20:16; Ps 24:3-5; 101:7; Prov 10:9; 11:3; 12:22; 24:28; Col 3:9; Eph 4:25). It is therefore not honouring to God when we perpetuate misinformation, even in defence of the gospel. Second, if one with whom we have shared the gospel later does his or her own research to determine whether the things we have said are in fact true – much as the Bereans are commended for doing in response to Paul’s preaching (Acts 17:11) – and discovers that they are not, we are responsible for having placed another obstacle between that person and the gospel. This is a grave thing to be responsible for.

In summary, cultivating a scout mindset, and minimizing soldier mindset, can help us to think more clearly and with greater intellectual honesty about our beliefs and our reasons for holding them. I cannot recommend Julia Galef’s book The Scout Mindset highly enough. I would also recommend her TEDx talk, “Why ‘scout mindset’ is crucial to good judgment.”

Footnotes

[1] Julia Galef, The Scout Mindset: Why Some People See Things Clearly and Others Don’t (New York: Portfolio, 2021).

[2] Dan M. Kahan, “‘Ordinary science intelligence’: a science-comprehension measure for study of risk and science communication, with notes on evolution and climate change,” Journal of Risk Research 20, no. 8 (2017): 995–1016.

[3] Caitlin Drummond and Baruch Fischhoff, “Individuals with greater science literacy and education have more polarized beliefs on controversial science topics,” Proceedings of the National Academy of Sciences 114, no. 36 (2017): 9587–9592.
