806 Comments

"(sometimes this is be fine: I don’t like having a boring WASPy name like “Scott”, but I don’t bother changing it. If I had a cool ethnically-appropriate name like “Menachem”, would I change it to “Scott”? No. But “the transaction costs for changing are too high so I’m not going to do it” is a totally reasonable justification for status quo bias)"

Hmm yes, transaction costs of changing a name are high. After all, it's not like you had been writing under a pseudonym for many years that could have been anything you wanted, Menachem Alexander


Scout Mindset is, accidentally, a really great general self-help and CBT book, that doesn't talk down to you


Typo thread: "sometimes this is be fine"


Agreed. I already think "what would Julia Galef do."


I feel compelled to plug my friends' and my fan project, mindsetscouts.com. Everyone likes earning cool merit badges! One of the merit badges even has a link to a calibration spreadsheet template, if that is a thing you've always wanted but never wanted to bother making.


This excellent review convinced me to buy the book. One area of life where it is fine to have a pure soldier mindset is being a sports fan. You start rooting for a team when you are young because of where you live, your family, whatever. You never consider changing your allegiance (although if, like me, you're a Mets fan, you sometimes wish you could). If we all have a certain amount of "soldier" tendency in us, then sports fandom is a good and healthy way to exercise and partially exhaust that tendency.


I think a bronze age mindset reference would really round out that first paragraph.


On the political scandals thing, one noticeable trend is that people who support party X have to weigh the cost of having a politician who has done something scandalous against the risk of getting a politician from party Y, which is (for supporters of party X) intrinsically scandalous.

So it's not unreasonable to have a higher standard for your opponents (whose replacement is costless) than for your own party.

The other factor is about who will replace them, which is why I favour political systems that make it easy to replace a politician with another of the same party (and make that a separate process from the electoral process where voters choose between parties). Note that Al Franken was replaced as Senator easily - but his replacement, Tina Smith, is of the same party. Ralph Northam was also in a scandal of comparable magnitude, but remained as Governor of Virginia because his two same-party successors were involved in the same scandal and the third in line was of the other party. You can see the same process with the recent California recall; Gavin Newsom was able to ensure that the only possible replacement was a Republican and was able to run against that Republican. From the perspective of the Democratic majority in California, however bad Newsom was, a Republican would be worse.

The only case I can think of in recent years when a politician has been replaced by one of the opposite party as a result of a scandal is the Alabama Senate election when Roy Moore was defeated by Doug Jones.


Very interesting and a compelling endorsement. This review is a good prompt to think about my own relationship to the whole rationalist project, and I need to read this book. I am much more sympathetic to rationalism than I was say 5 years ago and think on balance it's a force for good. I also think it's a giant motte and bailey, which is frequently discussed in grand and outsized terms regarding its goals, but when challenged its members tend to say things like "oh nobody's trying to achieve real rationality, we know that's impossible, we're just trying to get people to be a bit more rational." But I think what I will do is read and review this book and use it as a lens to think through the movement and its evolution.


"They founded a group called the Center For Applied Rationality (aka “CFAR”, yes, it’s a pun) to try to figure out how to actually make people more rational in the real world."

As if Scientology wasn't enough...


What would Scott’s review be if he wasn’t personal friends with Luke and Julia? (Probably similar but longer?)


Nobody's complained about p < -10^10 yet, which, depending on where you put your parentheses, is either impossible or certain :^)


I think I might buy the book, then.

I feel like there's a kind of deeper level of reasoning that often goes into people being unwilling to change their mind, or unwilling to adopt a 'scout mindset'. In the real world, military scouts tend to get killed a lot. They go into enemy territory surrounded by soldiers where they're at a disadvantage. Soldiers at least get to fight in groups. The epistemic equivalent of a scout being "killed" would, I think, be being convinced or pressured to change your mind based on non-rationalist tactics. If this happens a few times then your Bayesian prior on "I am going to be misled/pressured/BSd into changing my mind" starts going up and it stops making sense to be a scout. It starts looking smarter to be a soldier.

In the past I've changed my mind about lots of things - I can think of a few examples from both politics and my job right now. But I sort of feel like this has happened to me with everything about COVID. In the beginning I adopted the default position of "the scientists have got this" and believed all their predictions. Then I read an article that gave me lots of new information about the history of epidemiology and the unreliability of Ferguson's modelling, and that caused me to go off and do lots of research of my own that backed it up, so I changed and adopted a position of "this is all wrong and my god scientists don't seem to care which is scandalous". But I tried to keep a scout mindset (not calling it that of course) and would still read articles and talk to people who disagreed, in fact, I ended up debating a scientist I was vaguely acquainted with in the street for about 20 minutes. We met, said hello, not seen you for a while, got talking and ended up in a massive debate about PCR testing, modelling and other things. He was very much still where I started so it was a vigorous debate.

The problem is, a very frequent experience was reading or hearing something that was causing me to update my beliefs somewhat to be closer to the "mainstream" (i.e. media/government) narrative. But then a little demon on my shoulder would whisper, "shouldn't you double check that? after all, these guys were misleading you before" and then I'd go check and sure enough, there'd be some massive problem with whatever I'd just read. In many cases it was just flatly deceptive.

After this happens a bunch of times your prior on "these people who disagree with me are probably bullshitting me" goes high enough that it's really hard to maintain a scout mindset. After all, clever bullshitters do it because sometimes they succeed. If I find myself becoming less certain and updating my beliefs in the direction of "I was wrong", how do I know that this is a correct decision, given all the previous times when it wouldn't have been? This feels like a systematic problem with attempts to do scout-y Bayesian reasoning in a world where people are actively trying to recruit you to their side via foul means as well as fair. I suspect this explains a lot of politics, although it doesn't mean the perception of being misled is true, of course.

Having written all that, I have no solutions. The scout mindset would work best in an environment where people are genuinely playing fair. If they aren't, then maybe it's one of those times the book alludes to when the soldier mindset actually works best.


I find myself in disagreement with this use of the word "probability", but I realize it's because I am a soldier for frequentism


I'm curious about the theory of psychotherapy mentioned toward the end. Is there a name for this theory? I'd like to read more about it.


So what I'm hearing is, "the best way to be a good person is to be like me and my personal friends. The community I, personally, am involved in is the main one to avoid the scourge of confirmation bias -- the bias in which people think they are never wrong". Got it.


Does the book touch on the problem of Going Public? It's easier to change your mind when your opinions are private. At least that's been my experience.

I see increasing polarization as partly an effect of more opinions being public because of social media.


“It’s hard to be rational without also being a good person” looks like more evidence against the orthogonality thesis. If you’re intensely intelligent but don’t see “being mean to other people” as an error, you’re much more likely to dismiss them when they have true knowledge that conflicts with your priors.

And if rationality is just as hard as being a good person, doesn’t this suggest that an unaligned AI is likely to have biases which inhibit its abilities as well?


I've seen a lot of press for this book, but reading this review was the first time I realized that "scout" in the title refers to scouting something out, and not glorifying the boy scouts. I think it was the cover image of the binoculars looking out over a landscape right out of a national park.


Does Galef, in a spirit of fair play, mention Mercier and Sperber's theory of argumentative reason?


I hate it, so it's probably good advice.

Well, to say "hate" is too strong. But the thing is, scouts are just as much part of the army as soldiers. Scouts are still "on a particular side" and are working against the scouts from the opposing army. What if you don't want to fight in any war at all, you just want to find out things? Is there such a thing as The Nature Rambler Mindset?

Second, the Obama anecdote strikes me not as "wow, strong BS detector, how admirable!" but as "what a jerk". (And yes, I did the "imagine it's a guy you like" part to check out my reactions).

I'd hate a boss like that, who was constantly whip-sawing between "I love it/I hate it" in order to 'catch me out'. You could never be sure what his real opinion was, when he had genuinely changed his mind, and when it was "he always loved/hated it, he was just pretending the opposite". Plus, if the people working with him have any brains at all, they will figure out the strategy after he does it a few times, then they will always have two sets of opinions ready to go at all times - if Obama says "I love this thing!", be ready to go with "Eh, I'm not so sure"; if he says "I hate it!", be ready to go with "Oh, there are good parts". That way you can always turn on a sixpence when he goes "Ha, fooled you, I hate/love it!"

It'll trip everybody up when he goes "No, honestly, I really do love this" "Yeah, sure, Mr. President" *wink* "I know how this goes, you don't want yes-men!" "No, seriously, I do believe this" "Ha ha, can't catch *me* out, I know this means you hate it!" but at least Mr. Big Guy can flatter himself on his sterling bullshit detector.

You're correct that this will require a lot of change and a lot of work to improve yourself.


"I talk a good talk about free speech, and “don’t cancel other people for discussing policies you don’t like, they have a right to their opinion and you should debate it instead”."

But I've noticed that you've banned plenty of people from the comments for politics you don't like. Is there really a "scout mindset" within Rationalism?


Interesting book review. Could you have someone read it through and correct the mistakes? Driving me nuts reading it and correcting them in my mind so that I can understand your points, so carefully made!


This is perfect: "the damned could leave Hell for Heaven at any time, but mostly didn’t, because it would require them to admit that they had been wrong."


That last story about Luke really struck a chord with me, because I also sometimes alienate both sides by not changing my mind UNTIL I get a good argument, and insisting that the bad arguments remain bad.

Two examples:

1) most of the arguments I heard against “Intelligent Design” and for “Undirected Evolution” were crappy; the ID people had some good points and were unfairly maligned. But then I figured out what was wrong with the “irreducible complexity” argument in a way that I had never seen properly explained, and with that insight, was able to go back and see that SOME of the IDers were bullshitters (while others still made some valid points, and while I still find many of the evolutionists to be bullshitters, the evidence definitely points to evolution for now although I can think of several ways in which my mind could be changed by new investigation).

2) The opposition to the usefulness of the IQ measure, and to the degree of heritability that it seems to exhibit, always struck me as hysterical bad faith, which refused to engage in any non-sophistical way with the data and arguments put forward by the IQ proponents. But then I saw Nassim Taleb’s arguments against IQ, and THEY were mathematically valid and more penetrating than what the other side was saying, and most of the IQ proponents failed to engage with them or argued badly against them. I think Taleb, for polemical reasons, goes too far in his opposition, and that heritable individual and group differences need to be reckoned with, but the IQ number itself, and the way in which it is applied, are definitely messed up and a lot of the people (but not all!) who support it can now be seen as bad scientists or “motivated reasoners”.


I worry about the opening lines, fearing I have also ended up behind the curve, since I have worked on and off for the last six years on a labor-of-love project on similar issues. (The book should be finished this year; unfortunately I say that every year – a blatant example of Kahneman’s planning fallacy.)

Over the years I have become gradually more sceptical about the Bayesian approach to rational decision-making (which seems to be the underlying approach of the book, as well as Scott's review); not least since it does not correspond with how I form opinions and make decisions, if I introspect & try to be rational about it.

Instead, I have become rather enthusiastic about Paul Thagard’s idea of “inference to the best explanation”, which (again using introspection as a method) is closer to how I actually make & change my opinions, and decide on things.

If someone who reads this blog has opinions on the “inference to the best explanation” approach, including how it fits (or not) with interpretations of Bayesian reasoning, I would be very interested in your thoughts – as well as tips on literature.

For those who want a short version of Thagard, a nutshell version is in the open-access journal Philosophies (2021, 6, 52), titled “Naturalizing logic: How knowledge of mechanisms enhances inductive inference”. The core idea is that we form opinions based on the perceived plausibility of “mechanisms”, and that deciding the plausibility of various “mechanisms” is a question of continuous inference-to-the-best-explanation.

Here are snippets from the article, including some of Thagard’s critique of the Bayesian approach. Again, I would be interested in reactions to his critique & line of reasoning, by people who have opinions on these things:

“Should probabilities be construed as frequencies, degrees of belief, logical relations, or propensities? …

Bayesians assume that probabilities are degrees of belief but face problems about how such subjective beliefs can objectively describe the world and run up against experimental findings that people’s thinking often mangles probabilities. …

I think the most plausible interpretation of probability is the propensity theory, which says that probabilities are tendencies of… situations to generate long-term relative frequencies….

What does it mean to say that glass has a disposition to break when struck? Fragility is not just a matter of logical relations such as “If the glass is struck, it breaks” or counterfactuals such as “If the glass had been struck, it would have broken.” Rather, we can look to the mechanisms by which glass is formed to explain its fragility, including how poorly ordered molecules generate microscopic cracks, scratches, or impurities that become weak points that break when glass is struck or dropped. Similarly, the mechanisms of viral infection, contagion, vaccination, and immunity explain the disposition for people to be protected by vaccines.

Mechanisms flesh out the propensity interpretation of probability and point toward a new mechanistic interpretation of probability. … propensities point to unobservable dispositional properties of the… world.

…The second problem with Bayesian approaches to inductive inference is that the relevant probabilities are often unavailable, no matter whether they are construed as frequencies, degrees of belief, or propensities…. Paying attention to mechanisms helps to constrain identification of the probabilities that matter in a particular inferential context. For example, understanding the mechanisms for infection, contagion, vaccination, and immunity makes it clear that many extraneous factors can be ignored, such as demonic possession.

…To sum up the result of this assessment, we can judge a mechanism to be strong, weak, defective, or harmful. A strong mechanism is one with good evidence that its parts, connections, and interactions really do produce the result to be explained. A weak mechanism is one that is not superficial but is missing important details about the proposed parts, connections, interactions, and their effectiveness in producing the result to be explained. Weak mechanisms are not to be dismissed, because they might be the best that can be done currently, as in the early days of investigations of connections between smoking and cancer and between willow bark and pain relief.

…The social significance of the role of mechanisms to inductive inference comes from the need to differentiate misinformation from information.... Separating information from misinformation requires identifying good patterns of inductive inference that lead to the information and defective patterns that lead to the misinformation. Noting the contribution of mechanisms to justifiable induction is one of the contributions to this separation.


Loved this & I'm fascinated by the connection between not being an intellectually obstinate prick & increased mental wellness. As a recovering intellectually obstinate prick I've noticed that calibrating one's beliefs somewhere around the mid-values of a certainty scale makes me less hateful & contemptuous of people I think are probably mistaken. Consequently I've decided that a policy of radical uncertainty may be quite rewarding.


I appreciate Julia's perspectives.


> Like - a big part of why so many people [...] moved on was because just learning that biases existed didn’t really seem to help much.

I've made this point a few times. I brought it up just a few days back. As much as rationality bills itself as systematized winning, there doesn't seem to be a lot of focus on helping people win. Everyone has a different definition of winning, but I'd argue the broadest, most normal definition is something like: have work you enjoy that pays you a lot of money which you manage into financial independence while improving your health and having lots of good friends and a loving spouse.

I've read precisely zero advice on how to achieve any of that from the rationality community. Or anything like it. Or how to achieve any alternate version of winning. What advice exists seems incidental to the larger project of bias spotting and correct thinking. I enjoy this and I enjoy the debate. But I honestly feel like it's a hobby rather than anything that improves my life.

That's fine as far as it goes. But when people point me to EA or CFAR it just looks like more consumption and hobbyism rather than self-sustaining change or self-improvement. Frankly, this strikes me as a problem for the rationality movement. Yes, it's nice to be rigorous and correct. But at the end of the day most people respect success above all else. Doomed intellectual victors still are doomed. The most successful ideologies whether religious or secular have traditionally promised people real benefits in this life or the next. Rationality just kind of skips this. Which is weird to me: wouldn't it be rational to seek a way to win and propagate those methods?

Freddie posted a bit below. Freddie is an honest, no BS communist (or does he identify as a socialist? some kind of far leftist). He would have no problem pointing out the specific material benefits of his ideology both on the individual and societal level. You may disagree whether his ideology achieves them but you can't disagree that he claims they are the goal or that his movement will try to create them for you. Decommodification, the end of anomie, improved terms on your specific rental agreement, better working conditions, etc. Christianity would have no problem pointing out a similar list of benefits. What's rationalism's equivalent?


The opening paragraph literally made me laugh out loud. A+ introduction.


"Scout Mindset"

The first image that came to my mind was a Boy Scout helping an old woman across the street. Not to dismiss the urge to be helpful, but I see now the subject is broader: the search for truth.

I was asked a few days ago why I write. I responded:

>>My motive in writing is to seek the truth. Writing things down helps me to clarify my thinking. Over time I learn more information and I edit my previous work. Sometimes I get feedback which is helpful and I edit some more.

When I'm talking/writing with someone and it touches on stuff I've already written, I can just give the link rather than trying to express it again.

I do hope my point of view sort of osmoses into the cloud, but I don't expect this.<<

I started following Scott in a casual way because he said some insightful things about basic income (one of my main interests).

https://slatestarcodex.com/2018/05/16/basic-income-not-basic-jobs-against-hijacking-utopia/

Slowly I became aware that Scott is a proponent of something called "rationality" and perhaps a member of a “rationalist community”.

Rationality does sound like a good idea and being a scout a better goal than being a soldier.


Scott, America has been on the metric system for nearly two centuries! The American Customary System, the law of the land, is a metric-based transitional units system that defines legacy English units and encourages, but doesn’t require, people to switch to metric. This law passed in the early 1800s! The people who have switched are people who had a high motivation and low cost to switch (science, with lots of custom tooling) and the people who didn’t had a high cost to switch (highways, heavy industry with all its existing tooling).

On the topic of the post, at one point in the book, Galef advocates presenting points of view in such a way that people don’t know if you’re for or against them. This is good! You can take it even a step further when there are multiple positions - say, in an argument - and you present them such that nobody can tell which one(s) you support. That delivery shows your respect for the person you’re debating, even if you strongly disagree with their ideas.

That sort of clinical explanation also lends itself well to deadpan humor, if you’re in to that sort of thing.

P.S. my calibration numbers (for buckets 55%, 65%, 75%, 85%, and 95%) were 67%, 71%, 80%, 86%, and 100%. Uniformly under-confident. But I also go around saying “most tautologies are true” so maybe I should have expected it.


You can take an automated version of the calibration quiz in the book (the same questions) here https://calibration-practice.neocities.org/


> Someone who updates from 90% to 70% is no more or less wrong or embarrassing than someone who updates from 60% to 40%.

The size of a Bayesian update can be quantified by the difference between the values of the logit function, ln(p/(1-p)), for the prior and posterior probabilities. Going from 90% to 70% is a bigger update than 60% to 40%, and going from 100% to 80% is an infinitely large update. If embarrassment is quantified by "how wrong was I before I made this update?", then 90% to 70% is indeed more embarrassing than 60% to 40%.
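(For anyone who wants to check the arithmetic, here is a minimal sketch of that log-odds comparison in plain Python - not from the book or the review, just the logit formula described above:)

```python
import math

def logit(p):
    # log-odds of a probability p; diverges as p approaches 0 or 1
    return math.log(p / (1 - p))

# Size of an update measured as the change in log-odds
print(logit(0.9) - logit(0.7))  # ~1.35: updating from 90% to 70%
print(logit(0.6) - logit(0.4))  # ~0.81: updating from 60% to 40%
# logit(1.0) is undefined (infinite), so an update from 100% to 80% is infinitely large
```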


A message to all you dudes who tried to employ scout mindset and didn't achieve perfect rationality:

You all failed because you were not true scouts, men.


I worry that too little attention is given to the acknowledged point that it is perfectly appropriate to treat new evidence differently based on your prior assessment of the totality of the evidence in the aggregate. If I learn some new datum, but I come in with a holistic worldview under which my belief in this area is strongly held and commonsensical then it isn’t necessarily confirmation bias to treat the datum fairly dismissively if it contradicts that much larger, integrated understanding of reality. This strikes me as a much more fundamental obstacle to changing minds than is presented in the review, not least because it may be wholly rational. It seems really hard to tease this apart from situations where cognitive biases may be at work.


The change of view in Julia's book, from "teach people methods of belief update" to "teach people to be less defensive about their beliefs", is in line with Stanovich's data about myside bias. If I understand correctly, he indicates that the political divide results mainly from expressive rationality: people want to signal affiliation and group identity, not change beliefs.

Reading the book I found the emotional part more helpful, but I wonder how effective the tests are at countering partisan tendencies.


Calibration score

Mean confidence: 75.92%

Actual percent correct: 69.39%

You want your mean confidence and actual score to be as close as possible.

Mean confidence on correct answers: 79.71%

Mean confidence on incorrect answers: 67.33%

You want your mean confidence to be low for incorrect answers and high for correct answers.

I think I was doing this wrong. When I had little idea I chose 50%. I was thinking "50-50 chance of being right". I think now I should have been choosing something closer to zero.


Can I put in another desperate request for our host to demand an ignore-user function from Substack if he won’t just ban a certain commenter outright? You’ve written a wonderful review of what sounds like a wonderful book and the comment thread was shaping up to be great until a certain individual started with the usual argumentative spam. Possible Chrome plug-ins are of no use given that not everyone does all their reading in Chrome on a desktop.


Great write-up Scott. I liked how you pointed out Julia's coping mechanism section as a way to differentiate the book. The section reminded me of the Second virtue of rationality, the "If the iron is hot, I desire to believe it is hot, and if it is cool, I desire to believe it is cool." Easier to say than to actually feel.

Additionally, I would like to add a new internet law.

Godwin's Second Law: The longer the ACX comment thread, the more likely it is that marxbro is involved and derailing the discussion.


The status quo / conformity bias is a terrible tool: I found myself agreeing with all sorts of majority positions just because they'd be contrarian in the counterfactual world.

I assume if we summed our biases with a regular person we'd be a rational oracle.


How would Julia Galef want us to think about Marxbro's activities in ACX comment threads?


> in case this is starting to sound too touchy-feely, Julia interrupts this section for a while to mercilessly debunk various studies claiming to show that “self-deluded people are happier

Intuitively I'm very convinced that self-deluded people are happier, but I cannot wait to change my mind!! Anything one can read about that on the internet? Is she pointing to scientific papers?


Perhaps the book's analysis could benefit from a thorough examination of its own premise by first answering the question: "What is a 'Mindset,' and why do you need one anyway?"

It sounds like the book's premise is that everyone's ideal "mindset" is to be a relentlessly objective truth-seeker. That's fine and well in a domain that potentially has objectively correct answers to questions (e.g., what is the relationship between increasing CO2 levels and the temperature of the atmosphere?) But what's the correct "mindset" when addressing squishy value-laden controversies that don't have objectively right or wrong answers? (e.g., is it morally wrong for humans to change the Earth's climate and ecosystems when the economic benefits of doing so outweigh the economic costs?)

Is one's "mindset" solely concerned with one's personal beliefs? And is there any reason one's personal "mindset" must always be aligned with the positions the person advocates publicly or attempts to persuade others to believe? For example, is it wrong to confidently argue for a position -- like a lawyer advocating for a client -- that you don't 100% believe yourself? After all, if the argument is persuasive to others, why does it matter that you don't find it persuasive yourself? Everyone is entitled to their opinion.

And what about those situations where one's (apparent) "mindset" can, by itself, alter reality in the manner of a self-fulfilling prophecy? To use a recent example, was it a good mindset of "radical honesty" for U.S. leaders to publicly acknowledge that the Taliban would probably take over in a few months after we left -- which caused the Afghans to conclude that they should just throw in the towel early to get a better deal from the Taliban? Wouldn't it have been a more proper mindset to over-project confidence in order to rally the troops to the cause? And what if the only way to effectively project self-fulfilling optimism is to employ "confirmation bias" to convince yourself?

I guess the main definitional questions about "mindset" are: (a) When (and why) it is important to be intellectually honest with yourself? and (b) When (and why) do you have to fully disclose your personal beliefs to others?


I'd just like to say that I'm proud of my confidence calibration - I was within 1% of my confidence calibrated expected correct rate on the exercise linked in the post, and within the expected range for each confidence interval. But now the meta-confidence question, can I be too confident in my confidence calibration?


I just started Stanovich's book, _The Bias That Divides Us_, and it looks like it deals with the same problems, although maybe in a more formal/rigorous way.

Is it worth also reading Galef's book?

(There is also Pinker's latest that I had pre-ordered and just dropped into my Kindle library; that's a lot of book about rationality in just a few weeks.)


Nice try getting out of voice-in-my-head responsibility Scott, maybe next time.


Reminiscent of Jane Jacobs' Guardian and Commercial "Syndromes" from her Systems of Survival.


Journalists ought to have a Scout mindset, but often, they have a Soldier mindset on behalf of a political ideology.


I have a preliminary scout report for my fellow global warming skeptics (dozens of us?)

This is how much we can't win on basic narrative grounds:

I've heard that in the era of "The Coming Ice Age" cover pages, 90% of published papers in fact predicted future warming from CO2. Media narratives warp reality like a massive particle traveling at 99% of the speed of light - and the narrative is now cannons blazing on the side of those 90%-plus warming predictions.

Here's the extremely good news:

The climate economists are on our side, big time. First redefine our side as "doesn't want a trillion dollars a year to be wasted on useless policies". In principle that includes almost everybody. It's time to go big tent and to do it on policy grounds

How much are the climate economists on our side? The Nobel winning Nordhaus calculated the economically ideal rise in temperature to be 4.5 degrees in terms of warming costs vs costs of an ideal carbon tax. I suspect these numbers are getting skewed lower now with motivated adjustments but suffice to say they are not proposing short term net zero or holding temperature to no more than 1.5 degrees above where it was at the end of the little ice age

So what do we do? I think our bread and butter goal is to commission policy recommendations from climate economists. How do you think alarmists will fare arguing in favour of "don't consult the climate economists"?

They will say mid to long term a global carbon tax (but not very high very soon), and if we ask short term they may recommend direct investment into green energy technology. Subsidies will not be in the mix. Short term hard reductions of emissions will not be in the mix (i.e. the Paris Accord).

A potential additional strategy would be to advocate tests of marine cloud whitening in equatorial countries where warming is already likely a net negative along with a program guaranteeing crop insurance to allay fears of precipitation effects. Remember the prime reason the UN is advocating a 1.5 or 2 degree cap despite the economists projecting massive overcosts is out of a sense of guilt and obligation to the hardest hit. This is admirable, and can be cut off at the pass by a cloud whitening project that helps those people immediately at a price in the range of 35,000 times more cost effective than directly scrubbing carbon from the air

How does this sound global warming skeptics?

Or are there any double agents with comments on how the general strategy and specific ideas sound to you?


I'm 21 and live a relatively healthy lifestyle. What odds would you give me of living 200+ years?


Strange that all the examples in this genre of how to change minds are always climate skeptics becoming climate believers. Maybe there is something minor to be learned here.

I imagine this was just a throwaway example, but climate change is also a bad choice because this isn't an argument over a specific isolated provable fact; it is an entire encyclopedia of claims that range from the validity of the temperature record (the known), to the certainty of catastrophic extinction-level outcomes (the highly speculative and unsupported), to the effectiveness of proposed solutions (risk/cost/benefit analysis). Yet one must either believe or not believe in the entirety of the claims. There is a near instantaneous judgment on what group you are in based on any initial statement. My lead sentence would trigger most environmental activists. I'm not sure what the "believe it all or be excommunicated" behavior is called, but there should be a name for it.


If you're looking for a place to practice the Scout mindset and sharpen your rationalist skills, come to the Guild of the ROSE! (guildoftherose.org)


> Of the fifty-odd biases discovered by Kahneman, Tversky, and their successors, forty-nine are cute quirks, and one is destroying civilization. This last one is confirmation bias...

PutANumOnIt makes a similar point:

https://putanumonit.com/2021/01/23/confirmation-bias-in-action/

Opening paragraph:

> The heart of Rationality is learning how to actually change your mind, and the biggest obstacle to changing your mind is confirmation bias. Half the biases on the laundry list are either special cases of confirmation bias or generally helpful heuristics that become delusional when *combined* with confirmation bias.


CMV: /r/changemyview is bad.

I think it's easy to portray it as good. Look at these noble redditors, humbly submitting that their deeply held beliefs are misguided. It takes a great deal of introspection to try and spot the flaws in your own thinking and a great deal of courage to read numerous detailed arguments from your opponents.

And the idea is definitely good. I just think there are some big problems with the execution.

1. It's unidirectional. Submitting a post requires one to be willing to change their view, but commenting on a post does not have this requirement. Granted, it's possible for commenters themselves to be convinced and award deltas, but this is pretty rare, and the sub is clearly built around the idea of the OP changing their opinion. It comes across as "let's hold a debate, but we're going to make it so one side wins" from the outset. I think requiring that commenters be willing to change their views as well would make it a much better place.

2. It can lead to groupthink. I think the people most likely to submit posts on CMV are those with unpopular opinions. Then a horde of redditors holding the popular reddit opinion respond and argue the OP into submission. This seems more like the OP caving to peer pressure than practicing rationality. In fact, I'd go as far as to say that I think a lot of OPs WANT to have their views changed so that they can hold the "correct" opinion, and CMV allows them to do this gracefully.

3. It incentivizes winning arguments over arriving at the truth. In the Phoenix Wright games, they like to introduce the antagonist prosecutors as having "never lost a case." How intimidating! This doesn't mean that they're good at arriving at the truth, though; it just means that they're clever and good at making logical arguments (because how likely is it that the defendant was truly guilty in every single case they prosecuted?). I feel like having a monthly deltaboard incentivizes that kind of behavior--"I don't need to help the OP defeat their bias, I just need to win the argument." And then, given that a clever person might be able to convince you of either side's correctness in a debate, you end up in an epistemic learned helplessness situation.

4. This is a flimsy argument, but it just feels like people give up too easily on that subreddit. A lot of the time I'll really agree with the original opinion expressed by the OP, but they give in to the first obvious counterargument, and then it's over. I think because the OP is SO willing to change their view, debates end too quickly and a lot of questions are left unanswered.

In the spirit of this post, I'm definitely willing to change my position, but I can't guarantee that I will.


Well I can add that my wife would be in favor of a change to ‘Menachem’ so I don’t have to go through the whole “You remember that Scott Alexander guy I was telling you about…” bit when I want to show her an interesting thread.

Menachem she would remember.


Bronze Age Mindset next


Am I the only person who noticed that she's just lifting Myers Briggs P vs J and then declaring P superior?

"A Soldier’s goal is to win the argument, much as real soldiers want to win the war. ....Scout Mindset is the opposite. Even though a Scout is also at war, they want to figure out what’s true. "

This is extremely similar to the defining test for P vs J ("Js think a meeting is productive if everyone knows what to do, Ps think a meeting is productive if every issue has been discussed and understood.")

It's fricking hilarious, of course, because in the real world Js run roughshod over Ps and that's what would happen in real life as well with "Scouts" and "Soldiers".

The real irony, to me, is that the author is doing exactly the opposite of a real P, in declaring a superior approach. (She says she's not, but that's a big ol lie.) But that's pretty typical of many in the rationalist community--they are overly fond of pretending they are logical (hyper T) but not really as flexible and decision resistant as they like to imagine themselves (that is, they're a lot more J).

I'm deliberately using MB typology to mock her because I mean, Jesus, Scout and Soldier is so fucking high school. But seriously, very little of this is original. Every personality model has this dichotomy in it. Again, they don't declare winners.

Thank god we aren't living in a world run by people like her.


Longtime fan of Scott. This "review", however, reads like an endorsement rather than the very critical reviews that Scott has written in the past.

I wish Scott had considered the counterfactual "would I have been so complimentary towards the book if I was not already a Rationalist?"


> You’ve probably heard the probabilistic (aka Bayesian) side of things before. Instead of thinking “I’m sure global warming is fake!”, try to think in terms of probabilities (“I think there’s a 90% chance global warming is fake

Things don’t work like this though. What’s the probability that the Langlands conjectures are true? Does that probability really mean anything or help you solve it? No; lots of intuitions and heuristics can help you understand your directions and reasons to investigate, but there’s no probability there. What’s the probability that true love conquers all? What’s the probability you should be a programmer vs a doctor? What’s the probability of AI X-risk? The last one is a totally useless number because it pretends to reduce the actual technical difficulties and complexities that we don’t yet know about AI to a probability that we can’t know how to calculate or what it means. Same, frankly, for “probability of global warming being fake”: no, the problem is what global warming is, what its effects will be, who is investigating it and how, and how one can intervene to change it, none of which a probability is relevant to. It’s a distraction and a meme.


> Afghan soldiers

While the Americans ... well ... American soldiers absolutely did question the value of the war ... as did the soldiers in Vietnam, see the origin of “fragging” - blowing up commanding officers with grenades ... uh anyway, while the Americans thought a little bit about the value of the war, the Afghan soldiers thought a lot about it, such that they regularly changed sides depending on local conditions, as they had for many decades. And when America left, this contributed to the immediate collapse - elders simply became Taliban.

I’m not sure “soldier mindset” is a real thing tbh


> So for example, if a Republican politician is stuck in some scandal, a Republican partisan might stand by him because “there’s no indisputable evidence” or “everyone in politics does stuff like that” or “just because someone did one thing wrong doesn’t mean we should fire them”. But before feeling too sure, the partisan should imagine how they would feel if a Democrat committed exactly the same scandal. If they notice they’d feel outraged, then their pro-Republican bias is influencing their decision-making. If they’d let the Democrat off too, then they might be working off consistent principles

But Republican and Democrat are still brothers.

Let’s say your wife steals from a store. Let’s say the guy who assaulted your child steals from a store. Or even better - let’s say France breaks a nuclear treaty. What about Russia? China? North Korea? The expert doctor treating your cancer makes a rude comment to your wife - vs a random nurse - different situations! Even in the case of politicians, a generally honest and faithful Republican making a slip-up in the men’s bathhouse may look different from a Democrat abortionist - or - a deceitful, conniving Republican taking money from an oil company may be different from AOC, champion of progress, doing so. Or just a friend vs non-friend doing the same thing can be very different for good reasons. They legitimately are different circumstances, and demand different responses! And I’m not sure that simply calling to abstract over it with a few simple ideas captures either why people do these things or explains why they may sometimes be worthwhile. Hypocrisy is bad because it means a mistake is being made somewhere, not because it’s hypocrisy - “soldiering” is bad when it’s dumb, not when it’s driven by strong beliefs in a thing.


I don't believe that Intel story is true: they were working on microprocessors simultaneously with the development of an update to their memory chips, which were extremely competitive. They pivoted to microprocessors because they saw immensely greater potential, which had little or nothing to do with the Japanese.


Okay, in principle I’m completely down with rationalism. In endless chart-and-stat-citing, truth-table-displaying, hair-splitting practice, it takes on an aspect of browbeating IMO and makes me a bit weary at times.


Now that we've read the review, do you think the actual book is useful for people who are in say the top 20% or so of effectiveness among rationalists? My friend said it was more useful for normies than the in-group.


It's fascinating how scholastic this all is, though -- the enormous attention paid to the quality of one's reasoning, the use of dialectic, really this could all be straight out of a closely-argued 14th century monastic treatise on philosophy, or on how to divine the Will of God.

Over here in the shaky landfill soil of Empiricist Land, sinking steadily into the Sea of Stuff About Which Nobody Cares by 10cm every day, alas, we would instead hammer away at evidence, evidence, evidence. Learn not to care about the quality of the argument, either yours or your opponents', and instead search obsessively for the objectively measurable fact, the cold hard numbers, photographs, or dead bodies to throw onto the scales, and then take and stick to the difficult inhuman decision that an ounce of ugly measurement outweighs a megaton of beautiful theory.

But it makes sense. Empiricism is deeply unnatural to the human spirit. We are forced to it only when we work repeatedly with real systems, and our lives depend on deducing what natural systems that we did not design will do -- whether the weather will be fine or rainy, whether the steam engine will stutter to life or explode, whether the airplane will rise or crash into pieces, killing us all. Under those circumstances, we turn to empiricism in dread and despair to save us from our ability to self-bullshit.

But in the modern world, few of us work and live like that. Our success or failure derives far more from social forces -- do people like us or not, applaud or hiss, buy our service or not, vote us in or out of office? Under those circumstances, yeah, the quality of the argument and the conformity to enduring social myths is way more important. It's a little like living in the Church circa 1350, where whether you are sold into slavery or become wealthy and powerful has everything to do with the good will of all the other bishops and nothing much to do with whether a stone arch you build stands up or falls down. Strange, that we should be retracing our intellectual evolution this way.


"Galef is [CFAR's] co-founder and former president, and Scout Mindset is an attempt to write down what she learned."

This seems not quite accurate to me. I believe that when Galef left CFAR it was because its techniques didn't seem to work that well and she was disillusioned with the approach. CFAR's curriculum has changed a lot since Galef was there, and avoiding confirmation bias has never been more than a small part of the curriculum.

My understanding is that CFAR still doesn't have very good evidence that their techniques work.

I've only read reviews of The Scout Mindset so take this with a grain of salt, but my impression is that it does a good job establishing that there are indeed two mindsets of the kind she describes and it'd be good if more people were in scout mindset more of the time, but that there's not much evidence yet that the suggestions in the book actually work.


Walk a mile in someone else's shoes...sure this is very old and kindergarten level advice. And?

I also think this is silly and has limited to no utility for most people who don't need to engage in this level of neurotic navel gazing.

If we are taking other people's views and lives into consideration....it is good to process things through their lens too....otherwise we're just forcing our ideas into our made up version of other people in our heads to have endless looping arguments. Did I mention this was neurotic already? And we want to spill this out into the rest of the world for everyone?

People like groupthink, they like not thinking about anything at all, they like knowing what to do.

They don't care about being right in the end, they want to be accepted by the people around them.

Sure, this has led to many atrocities, but it is also the grease to keep day to day life moving along most of the time without everyone sending each other long winded arguments and apologies on twitter in some libertarian fantasy of fully awakened and sovereign individuals...which has failed to gain mass appeal thus far.

People want to be lazy! Not negotiate every little detail in their life, like making sure the fire department comes or their bank isn't shafting them or their food and drugs are safe to eat and take. Why can't we all just be ever-vigilant, expert-in-all-things adjudicators for everything all the time? Why isn't this idea more popular? Hmmm... (I've got my flak jacket on for the blowback on that line!)

This can be seen as taking academic and ivory tower neurotic mentalities and norms of arguing everything to death into the real world where people are busy going to work, hosting BBQs, and just getting by until they can get back to their Netflix series. What possible reason should people bother to engage with this when they are a landless peasant held in debt quasi-slavery?

Did they not bow properly after the herald announced them, or wear the right colours of the right material upon entering the court in the Spring period? These ideas about confirmation bias sound like courtier etiquette nonsense to the average person.

A few higher-ranked TED 'thought leader' barnacles on the side of the yachts of the billionaire neo-aristocrat oligarch class will yammer away making detailed rational arguments (now with emotional appeal! wink wink) which will probably get ignored for political purposes to empower or enrich someone who is already powerful and rich... and regular folks don't need to worry about this stuff.

They'll just get by, people will accept or reject them, and you just suck it up and move on! Your friend was rude... maybe he was hungry when he said a mean thing... but the friendship is good... if he keeps being shitty, maybe I'll hang out with him less or not at all. Problem solved! No need for 50-comment-long, long-winded arguments on twitter!


My results on http://confidence.success-equation.com/ (for the sake of comparison with others):

Calibration score

Mean confidence: 65.60%

Actual percent correct: 62.00%

You want your mean confidence and actual score to be as close as possible.

Mean confidence on correct answers: 69.35%

Mean confidence on incorrect answers: 59.47%

You want your mean confidence to be low for incorrect answers and high for correct answers.

Quiz score

31 correct out of 50 questions answered (62.00%)

15 correct out of 30 questions answered with low (50 or 60%) confidence (50.00%)

10 correct out of 14 questions answered with medium (70% or 80%) confidence (71.43%)

6 correct out of 6 questions answered with high (90 or 100%) confidence (100.00%)
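(Not output from the quiz site, just a minimal sketch of how numbers like these could be computed from per-question records; the (confidence, correct) pairs below are made-up placeholders:)

```python
# Hypothetical per-question records: (stated confidence, answered correctly?)
answers = [(0.5, True), (0.6, False), (0.8, True), (0.9, True), (0.7, False)]

mean_conf = sum(c for c, _ in answers) / len(answers)
accuracy = sum(ok for _, ok in answers) / len(answers)
conf_right = [c for c, ok in answers if ok]
conf_wrong = [c for c, ok in answers if not ok]

print(f"Mean confidence: {mean_conf:.2%}")
print(f"Actual percent correct: {accuracy:.2%}")
print(f"Mean confidence on correct answers: {sum(conf_right) / len(conf_right):.2%}")
print(f"Mean confidence on incorrect answers: {sum(conf_wrong) / len(conf_wrong):.2%}")
```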


Are the grammatical mistakes in the excerpts a transcription error, or is my heuristic "anyone who cannot manage to write mostly-correctly in a professional setting is probably not too smart" itself in error?

I mean, there *are* exceptions, especially in ESL cases. Still, I'm kind of surprised if it turns out these are present in the original. "All of which may be our beliefs are 'undermined'"? *Less* substituted for *lest*?!


“ Like - a big part of why so many people - the kind of people who would have read Predictably Irrational in 2008 or commented on Overcoming Bias in 2010 - moved on was because just learning that biases existed didn’t really seem to help much. ”

Well that’s the sort of generous interpretation we all expect from Scott.

It’s not the interpretation I, much less generous than Scott, put on how this played out.

Remember all those theories of which would win in a conflict between class loyalties and national loyalties before WW1? How did that play out?

Yeah, now run the tape again substituting rationalism for class loyalty, and identitarianism for national loyalty…


Maybe I missed it but I expected to see some acknowledgement that people can be scouts on some topics and soldiers on others. No one is immune to this.


I like the idea of the scout mindset; it helps a lot when trying to seek the truth and not be trapped in your own views just by the shame of being shown wrong. However, it misses one important point (like most discussion about cognitive bias): many times, the real value of "facts" or argument points is not their truth, it's their usefulness in creating an effect on your listeners (making them change their behavior). Even if they are false, it can be beneficial to believe them if that makes you more effective at influencing others.

Let's call that the Machiavellian bias, and I think it trumps confirmation bias by a lot. Scout mindset really helps with confirmation bias, but as soon as facts are used to influence, it's how you like the direction of influence that will make you believe the fact or not. Hence the famous Upton Sinclair quote: “It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

The best way to raise the chance of really considering the truth of a fact (scout mindset) is to remove the link with the desired/feared policy change, either because its relevance is trumped by other facts, or because it becomes compatible with a very different policy change... Both of those reduce the usefulness of the fact for supporting the controversial goal, which relatively improves its truth value. But then you will see how interested most people are in facts that do not help support a policy - very little :-)


I think this is a great approach when paired with properly identifying what information ecosystem you’re pulling data from, and whether that ecosystem is stale, fresh, etc. I’m convinced the latter is the most important practice for critical thinking and rationality. This gets discussed via highlighting “one’s bubble,” but there’s more value and content to info ecosystems than that. It’s more about applying tools to systematically parse one’s information flows, and identify their limits, sources, and so on. This practice as a professional tool shows up in intelligence circles, and I think it provides a solution that most closely mirrors why it’s hard to think and act rationally now - information throughput is too high to intuitively parse, and the primary data source (the Internet) works in loops for most users who try to navigate it - news page to social media to Wikipedia to news page to…

Scouting mindset paired with information source awareness and management seems powerful.


“This is the bias that explains why your political opponents continue to be your political opponents, instead of converting to your obviously superior beliefs. And so on to religion, pseudoscience, and all the other scourges of the intellectual world.”

Is it really fair to say religion is all confirmation bias? Or that it is a scourge to the intellectual world? At bare minimum, religious tradition has carried forward a tremendous number of socially useful norms and distilled wisdom in its scripture.

This is not an unscientific position to hold. Joseph Henrich’s research characterizes religion as very prosocial for example.

Expand full comment

I'm kind of let down that the review was not more critical of the book, just because I usually enjoy the contrarian takes on books most (when it comes to SSC).

But as someone who never bothered to read it, I'm glad to know it's generally speaking a "rational" take on the movement that doesn't get too extreme.

Expand full comment

I think the "there's no such thing as telepathy" study may not be the best way to catch someone being biased.

As someone who does NOT believe in telepathy, my first reaction to that was "How do you even begin to test this?", followed by doubts about whether testing whether there is "such a thing as telepathy" is really what happened, and curiosity about what they tested and why it rules out all possibility of telepathy.

I think the core of the problem for me is that I can't really conceive of a way to test if there is such a thing as telepathy, so I have doubts about that study even though I'm biased against telepathy existing.

Expand full comment

The cheesy story made me smile. Adults needing ten years to get to the point of "engage to marry"? Not very rational in my book (no rational objection to "not engaging and not marrying at all" - but with a life-span of less than 350 years, why decades of delay?!). Who wants to write the book "I kissed good-bye to go for kids till shortly before menopause - and other cruel facts of life"? Sounds like I will actually buy Julia's book, thanks for the review!

Expand full comment

I won't defend most Imperial units, but Celsius is worse than Fahrenheit for US weather. The 0-100 range of Fahrenheit closely matches the range of outdoor temperatures in the United States (http://lethalletham.com/posts/fahrenheit.html), and Celsius compresses that range to about -17 to 37 degrees.
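For anyone who wants to verify that compression claim, here is a minimal sketch of the conversion arithmetic (my own illustration, not from the linked post; the function name is mine):

```python
# Convert the endpoints of the 0-100 °F range to Celsius
# using the standard formula C = (F - 32) * 5/9.
def fahrenheit_to_celsius(f):
    return (f - 32) * 5.0 / 9.0

for f in (0, 100):
    print(f"{f} F = {fahrenheit_to_celsius(f):.1f} C")

# Prints:
# 0 F = -17.8 C
# 100 F = 37.8 C
```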

(Trivia: Did you know that one degree on the Fahrenheit scale corresponds to the temperature change that causes liquid mercury to expand by 1 part in 10,000?)

Expand full comment

There is an idea floating around in my mind about books/writings like this which makes me think that they generally have the wrong idea.

Suppose you are trying to determine if something is true. In order to even reason at all, we need to have some ground rules for what counts as good arguments, which assumptions are valid, and how to reason given valid assumptions.

In something like physics, this doesn't really matter, because there are generally accepted ground rules for what counts as good physics. Nobody is going to start telling you about how things move the way they do because of their telos, and if you show someone a good mathematical argument backed up by a solid experiment, almost anyone will accept it as true.

Social reasoning, which is what this book seems to be mostly pointing at, does not work this way. There is endless debate about what counts as good methodology, and even afterwards we have things like the Sokal affair and the replication crisis, and we don't seem any closer to a universal set of ideas about what constitutes good social reasoning than we were many years ago.

In light of this, your ideas about what counts as good social knowledge are far more influenced by your assumptions than by the specific arguments that you hear. This explains confirmation bias: if you have certain assumptions about the world, and want to extend them to their logical conclusion, it doesn't make sense to read things written by someone with too many different assumptions. In your own framework, their reasoning is wrong! This also explains why we sort into two very distinct ideological groups, where some seemingly unrelated ideas can have highly correlated beliefs. If assumption A implies 1, 2, and 3, and assumption B implies 4, 5, and 6, then 1, 2, and 3 will be very correlated, and so will 4, 5, and 6, even though they may a priori appear unrelated.

In light of this, consider the idea that it is virtuous to want to change your mind. In this context, something like that is not sufficient. If someone with assumption A reads a rock-solid argument for believing 4 instead of 3, and changes their mind accordingly, they may still have very little in common with someone with assumption B. What really matters is whether you are able to change your mind about assumption A.

This is kind of like the post on trapped priors; we have direct sensory data, and use our understanding of the world to interpret it. The closer something is to sensory data, the easier it is to make clear arguments about it, and the more likely it is that something can be demonstrated very precisely, effectively, and convincingly. In this case, changing your mind from the correct perspective is kind of silly. Where there is a lot of disagreement, or in other words where this book is claiming to be useful, is where we have layers of interpretation on top of that sensory data. The higher layers of interpretation are what will inevitably determine your conclusion, even though at the surface level they are not even the thing being debated.

Because of this, changing your mind about an isolated idea isn't really important. If we really want to understand our intellectual opponents, it is more important to figure out what assumptions they are making than it is to understand particular ideas that they profess.

This is not to go postmodernist on you, though. Some assumptions are worse than others, and once you understand assumptions it is still possible to demonstrate things about them. It's just that I worry this book has too narrow an understanding of what it means to change your mind. Whether the Golgi apparatus exists, whether climate change is real, or how you interpret studies about telepathy: these seem to be more like object-level facts than implicit assumptions.

Expand full comment

Thanks for the excellent review. It deepened my appreciation of the book. I am often frustrated by JG on the podcast - I found her print voice (in the sense of authorial voice, not tonally) to be more convincing. On the pod she tends to work things out in her head in real time, which can leave the guest waiting, waiting…. That said, I have a bias against overly wordy interlocutors (that’s a good name for a band).

Also, who cares about Scott’s throwaway line about changing his name. Zoinks!

Expand full comment

One thing about behaving like a soldier is that, for a lot of people, me included, it's really fun. I like trying to destroy my interlocutor and force them to surrender! And this motivation can at least sometimes make you better. It's similar to how it's more fun (and often results in better performance) if you try to smash the other team even during your neighborhood pick-up game of basketball on the weekend, instead of playing with the "let's bask in the social-cohesion-building aspect of sport" attitude.

Of course, you have to be able to switch mindsets, so that you all have a laugh and a sandwich together afterwards instead of plotting to Tonya Harding the other players later that night, but it seems that at least debating like a soldier shouldn't be dispensed with entirely... maybe?

Expand full comment

"You tried Carol Dweck’s Growth Mindset, but the replication crisis crushed your faith. You tried Mike Cernovich’s Gorilla Mindset, but your neighbors all took out restraining orders against you. And yet, without a mindset, what separates you from the beasts? "

I tried Bronze Age Mindset, and I got 30 years to life.

Expand full comment

What's so bad about confirmation bias? Without it there would be no stable identity, no bonding or binding ties - no civilization, no laws, no Science, nothing.

It may be that about ten percent of the population has an excess of the thing - but this isn't a problem provided the rest of us get mechanism design right. Another ten percent may have too little - e.g. me thinking I can twerk like Beyonce and that it would be a good idea to sell up and head for Hollywood.

Expand full comment

Re. "They act as good Soldiers for Team “we’re definitely going to make a billion dollars”, and that certainty rubs off on employees, investors, etc and inspires confidence in the company": My own anecdotes are counter-examples.

- Doug Lenat tried to recruit me for the Cyc project way back in the 1990s, offering, I think, $35,000 / year plus stock options. I was excited about it until I asked how much outstanding stock there was, did some math, and told him, "The company would need a $1 billion IPO just for this to be decent pay." He said something like, "Of COURSE the company's going to have a $1 billion IPO!" He was so confident in the company that people who worked for him ended up screwed financially.

- I went to another startup instead, where the same sort of thing happened. The founder had been psyching us up with talk of getting rich. When we discovered he'd been so confident in the company's future that he'd allocated just 1% of the company's stock to be divided up as options among all the first-year non-management employees, it devastated morale.

Expand full comment

Maybe there would be fewer "soldiers" if there were fewer "wars"?

The more people want to run the lives of others via politics the more things are controlled via politics. Politics is war. When you insist on forcing your way on others, you need soldiers on your side and the other side's soldiers get mobilized. Big government creates soldiers.

Expand full comment

I think Scout vs Soldier (or Conflict vs Mistake) concepts obfuscate the simpler, fundamental description.

What's described as Soldier (or Conflict) is really just "treating instrumental goals / beliefs as core values". Presumably, there's a whole lot more agreement on these core values, regardless of politics. If there's not, the Scout (or Mistake) mindset isn't that helpful in reaching an agreement anyway.

In that reading, the Soldier / Conflict mindset isn't just an alternative to the Scout / Mistake mindset; it's an error. Though there are still some advantages - like decreasing complexity and increasing intra-tribe cohesion (since everyone agrees at a higher level than if they just agreed on core values and each member tried to compute valid beliefs from scratch).

Expand full comment

"For example, I sometimes feel tempted to defend American measurements - the inch, the mile, Fahrenheit, etc. But if America was already metric, and somebody proposed we should go to inches and miles, everyone would think they were crazy. So my attraction to US measurements is probably just because I’m used to them, not because they’re actually better.

(sometimes this is be fine: I don’t like having a boring WASPy name like “Scott”, but I don’t bother changing it. If I had a cool ethnically-appropriate name like “Menachem”, would I change it to “Scott”? No. But “the transaction costs for changing are too high so I’m not going to do it” is a totally reasonable justification for status quo bias)"

Wait - what defense could you have for the Imperial measurement system other than transaction costs? I think that's a good defense, btw, but I can't imagine any other.

Expand full comment

I think one of the valuable things about arguing politics on DSL is seeing the same kind of arguments and opinions you have made with the polarity reversed. It really helps provide a sense of perspective and make you wonder how much what you believe is actually general principles versus based on what would help your side.

Expand full comment

Does anyone own the Kindle edition? Trying to decide which edition to buy. Graphics tend to suck on a Paperwhite.

Expand full comment

That is all well and good but how many reconnaissance units does an army need? Someone has to do the heavy lifting.

Expand full comment