The Scout Mindset — Not Yet Another Book Review

Rui Vale
7 min read · May 28, 2021

The Scout Mindset: Why Some People See Things Clearly and Others Don’t is a book by Julia Galef, published by Penguin last April.

The truth is out there. Stay put.

This book is about how to stop deceiving yourself and view the world realistically, without having to choose between being happy and being a realist.

I came to know about it in an email exchange a couple of weeks ago with Rhys Lindmark, who’s spearheading a rather curious educational endeavour called Roote.

The book is also his starting point for developing a better approach to writing and sharing book reviews; Not Another Book Club, but a networked Not Another Book Review.

So, I got curious about the whole thing, bought and read the book. Here are some notes about what I found interesting in it.

First, I’d like to state that any text about having a proper mindset, especially when portrayed as a dichotomy, i.e., opposed to an improper one, puts me off instantly, to the point of not giving it even a second look. On account of the exchange mentioned above, I made an exception. Do I regret it? I’m still mulling it over, and this write-up is also meant to clear that up.

The book is about judgement, the kind considered good; more precisely, about keeping and improving it if you already have it, or acquiring some if you don’t.

Now, judgement is a tricky one, since few people ever self-proclaim a lack of it. That goes back to Descartes, the kickstarter of rationalism, who said good old common sense is pretty well distributed globally.

Julia, in the book, gives it the official name of “scout mindset”: a blend of virtues such as intelligence, cleverness, courage, and patience, and particularly the motivation to see things as they are, not as one wishes they were. It’s intellectual honesty, of the kind that allows you to recognise when you’re wrong, correct course, come out having learned from it, and keep improving your never-good-enough judgement.

Then there’s the “soldier mindset”, reasoning motivated to defend the stronghold of our cherished beliefs, the status quo, the fast, biased thinking that keeps things we like as they are and copes with ones we gave up on, fighting off whatever threatening evidence that comes along.

From these contrasting viewpoints, it follows that your mindset can make or break your judgement, and that knowing how to reason in a logically sound manner won’t be enough, since we’re limited nearly as much by our knowledge as by our attitude.

It’s also interesting to note the parallel with Carol Dweck’s distinction between a “growth mindset” and a fixed one in her book Mindset[1]:

“The fixed mindset makes you concerned with how you’ll be judged; the growth mindset makes you concerned with improving.”

Of course, Julia states that the scout and the soldier are archetypes; we never stop jumping from one to another, depending on occasion and context.

The purpose of the book is to uncover how to predominantly lean towards being a scout under the guidance of the following three main ideas:

  1. Truth, particularly the adverse kind, isn’t that awful, as you can always pivot towards other, equally satisfying goals while keeping your picture of the world accurate;
  2. There are tools to be learned that make it easier to see stuff clearly;
  3. It’s possible to appreciate the emotional rewards of adopting a scout mindset.

The soldier mindset is mainly fuelled by what she calls motivated reasoning, serving a self-protecting function over six overlapping categories: comfort, self-esteem, morale, persuasion, image, and belonging. The first three target ourselves: we take comfort in sour grapes and sweet lemons[2], find assurance in self-told flattering narratives about unflattering facts, and boost morale by focusing on the upside while ignoring the downside. The remaining three target other people: people we want to be convinced of our views, impressed by our stance, and willing to welcome us into their tribe.

So, the trade-offs are mainly adopting and defending beliefs vs seeing things clearly, and the emotional and social benefits of belonging vs good judgement calls. The thorny difficulty lies in the fact that most such trade-offs occur behind the scenes; they’re unconscious.

But we’re not rationally irrational. Julia takes a bold stand against Bryan Caplan’s hypothesis[3], which roughly states that you can let yourself go epistemically berserk, up to the point where no one goes ballistic at you, while still getting away with whatever you want to achieve. More formally, the hypothesis says we tend to balance epistemic rationality, holding on to well-justified beliefs, against instrumental rationality, acting effectively towards our goals.

The central stance is that truth is more valuable than we realise and, although challenging to strive for, is worth it, mainly because feeling objective, smart and knowledgeable doesn’t make you so. Learning and practice are required to become a scout.

Start with your biases. For that, there are tools — thought experiments — to help you on that journey, such as the double standard test, the outsider test, the conformity test, the selective sceptic test, and the status quo bias test. Most are self-explanatory, but none is an oracle; they just help you notice that your reasoning tends to change in tandem with your motivations.

Maybe a sign you should regard your judgements as a starting point for exploration, not an endpoint; up to a point, of course.

Then, address your certainties. Take the equivalent bet test and see if you can tell the difference between the feeling of making a claim and the feeling of actually trying to guess what’s true.
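The arithmetic behind the equivalent bet test can be sketched in a few lines of Python (a hypothetical helper, not from the book): if you say you’re 90% sure of a claim but would rather hold a lottery ticket with an 80% chance of winning the same prize, your real confidence is below 80%.

```python
def preferred_bet(claimed_confidence: float, lottery_p: float,
                  prize: float = 1000.0) -> str:
    """Compare a bet that pays `prize` if your claim turns out true
    against a lottery that pays `prize` with probability `lottery_p`.
    Preferring the lottery reveals your real confidence is below
    `lottery_p`, whatever number you claimed."""
    ev_claim = claimed_confidence * prize  # expected value of betting on the claim
    ev_lottery = lottery_p * prize         # expected value of the lottery ticket
    if ev_claim == ev_lottery:
        return "indifferent"
    return "claim" if ev_claim > ev_lottery else "lottery"
```

For instance, `preferred_bet(0.9, 0.8)` says to take the bet on the claim; if, facing real money, you’d hesitate and grab the lottery ticket instead, that hesitation is the data.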

If you’re brave, take a shot at what it’s like to thrive without illusions. Cope with reality by challenging conventional wisdom about the costs and benefits of honest vs self-deceptive ways of coping with the world.

And some of it isn’t just conventional. Daniel Kahneman, the prospect theory guru, in his magnum opus[4], singles out resilience, derived from some amount of self-justification, as an emotional benefit of motivated reasoning.

Julia, however, argues one should instead take extra care to select coping strategies that don’t mess with the accuracy of one’s judgement.

And she goes on to debunk the “self-deception causes happiness” research and to advocate for motivation without self-deception. To the “follow your dream” mantra, she counters that an accurate picture of your odds helps you choose between goals and take bets worth taking, judged not just by the likelihood of success but also by the expected value. One should come to terms with the fact that there’s always some element of chance involved, separate the role it plays from the role of one’s decision-making, and judge oneself on the latter.
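That expected-value framing is plain arithmetic; a toy worked example, with numbers entirely made up for illustration:

```python
def expected_value(outcomes):
    """Expected value of a bet: sum of probability * payoff over all
    mutually exclusive outcomes (probabilities should sum to 1)."""
    return sum(p * v for p, v in outcomes)

# A hypothetical long shot: 10% chance of a 1,000,000 payoff,
# 90% chance of losing 50,000.
long_shot = expected_value([(0.10, 1_000_000), (0.90, -50_000)])
sure_thing = expected_value([(1.0, 40_000)])
# Despite the low likelihood of success, the long shot's expected
# value (about 55,000) beats the sure thing's 40,000.
```

The point isn’t the precise numbers but the habit: a bet with a slim chance of success can still be worth taking, and one you’ll probably win can still be a bad deal.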

You can also influence others without showing off too much. Common wisdom says the more confidence you can muster in your beliefs, the more influential you’ll be. But you’ll have more substantial influence by showing that your uncertainty is justified, giving informed estimates, explaining where they come from, and having a plan, making it unnecessary to overstate your confidence.

You have to learn how to be wrong and change your mind accordingly. It’s not like being wrong equates to having screwed up. Recognising you were wrong makes you better at being right; it’s an opportunity!

Embrace confusion. Don’t explain it away. Resist the urge to dismiss details that don’t fit your theories. Poke into oddness, view anomalies as puzzle pieces to collect as you go through the world; they’ll pile up, eventually leading up to a paradigm shift, Kuhnian style.

Not for Eureka’s sake, but for the kick one gets out of saying: “That’s funny!”

Another of Julia’s peculiar insights is that escaping your echo chamber doesn’t work. I gather listening to the “other side”, it being more like the opposite pole, only excites your soldier mindset; I’d even add that the same goes for the polarised media from “your side”.

It’s more complicated than expected, because we tend to misunderstand each other’s views when at opposite ends of the spectrum. Bad arguments inoculate us against the good ones. And our beliefs are interdependent, turning a single concession into an epistemic earthquake.

So, listen instead to people with whom you share some intellectual common ground, whom you find at least a bit reasonable, and who share your goals.

In the last part of the book, Julia proposes to rethink our identities, more precisely to hold them more lightly.

It starts by realising that some beliefs have become part of one’s identity: assumptions that make one feel either proud or embattled. Look for signs that expose them as such: using the phrase “I believe”, getting annoyed when contradicted, using defiant language or a righteous tone, playing the role of a gatekeeper, taking comfort in the misfortune of the proponents of opposing beliefs, using epithets, and feeling obliged to defend one’s views.

See if you can pass the ideological Turing test: explaining a view as a believer would, convincingly enough that other people can’t tell the difference between you and a true believer.

Finally, flip the script on identity by making the scout mindset part of it. Even so, you can still choose the communities to be part of, the kind of people you’ll attract, and your role models.

Lightly, that is. This last recommendation also carries the risk of carving out so strongly held an identity that it prevents one from persuading others.

Despite being well-referenced, it’s mostly a personal book, with a fair dose of anecdotes drawn from the author’s journey as a scout. Arguably the most fun part.

Besides a cornucopia of examples, in the book you’ll also find a handy questionnaire to help you calibrate your estimates, and an amusing collection of Mr Spock’s predictions and outcomes, drawn from no fewer than 23 Star Trek movies and episodes.
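Calibration itself is easy to measure once you keep a record of your predictions: for each confidence level you state, check how often those claims actually came true. A minimal sketch, with invented data (this is not the book’s questionnaire):

```python
from collections import defaultdict

def calibration(predictions):
    """predictions: iterable of (stated_confidence, came_true) pairs.
    Returns the observed hit rate per stated confidence level;
    a well-calibrated scout's 0.9 claims come true about 90% of the time."""
    buckets = defaultdict(list)
    for confidence, came_true in predictions:
        buckets[confidence].append(came_true)
    return {c: sum(hits) / len(hits) for c, hits in sorted(buckets.items())}

# Invented track record: three 90% claims (two came true),
# two 60% claims (one came true).
record = [(0.9, True), (0.9, True), (0.9, False), (0.6, True), (0.6, False)]
# calibration(record) -> {0.6: 0.5, 0.9: 0.666...}; slightly
# overconfident at the 90% level, well calibrated at 60%.
```

With a larger record (Spock’s, say), the same table shows at a glance where confidence outruns accuracy.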

Since the book was published only recently, there aren’t many reviews yet. Beyond the canonical Goodreads, you can find good ones at the WSJ, Reset Work, and at the Fire.

[1] Dweck, C. S. (2006). Mindset: The new psychology of success. New York: Random House.

[2] A common rationalization device by which people adjust their judgments on the desirability of a future event to make them congruent with its perceived likelihood, like the fox in Aesop’s fable, walking away from the grapes saying they’re sour because it couldn’t reach them, or the opposite, like when there are only lemons around, you’d rather perceive them to be sweet and desirable.

[3] Caplan, B. (2001), Rational Ignorance versus Rational Irrationality. Kyklos, 54: 3–26.

[4] Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
