Love the humility and intelligence in this post. There should be more on the internet like it.

I'm not sure where I stand (I value your time and wonder what is the best use of it), but I have had bad experiences with conspiracy theories.


And the friend is the comment on that post. <sigh>

Ivor Cummins just did a 43-minute video on distinguishing Conspiracy Theories from Actual Conspiracies. He has a six-point list of requirements for something to be an actual conspiracy.

The more I read Scott's writing about this, the more I become convinced that there simply is no natural category of "conspiracy theories." There are simply theories, with better or worse epistemic support.

Yes, some of those theories have extremely bad epistemic support, and we should certainly learn to identify and reject them. But that's not primarily by categorization, i.e., dropping them into a category labeled "conspiracy theories." Rather, it's by applying epistemic criteria to the content of the theories.

The link in the sixth paragraph of part three appears to me to redirect incorrectly.

Good response. Aaronson’s example is nothing at all like conspiracy theories. When people email him proofs of hard unsolved problems, he’s not ignoring them because they’re conspiracy theories or because he’s worried he’ll attract attention to the writer. It’s simply because he knows the probability that someone whose name he doesn’t know has solved one is vanishingly small, and since the topic is hard, responding to them all would basically consume his whole existence.

Is your writing workflow configured in such a way to have reasonable checks against short-term emotional bias influencing the quality of your work?

Question: is masking a conspiracy theory? One is routinely denounced by The Science (tm) and the other routinely enforced by The Science (tm), but they have roughly the same amount of evidence (see Vinay Prasad's writings on this).

The "Should I even give them attention, or would that just be signaling that they're worthy of attention?" question has a pretty straightforward solution: do they actually have your attention, or are you too busy thinking about more important things?

If you have to talk about not giving them attention you are already giving them attention, and doing it in a way that shows unwillingness (or inability) to get into the object level arguments, so it's not a good thing for your credibility. If that's what's tempting, then it calls for some honest reflection.

Feb 15·edited Feb 15

I just don’t get any of this. I have no interest in defending the efficacy of Ivermectin. I don’t think it works. But of COURSE there’s nothing wrong with analyzing deeply whether it works and looking hard at the evidence on both sides. It’s a pharmaceutical product that a bunch of studies suggested might work. This isn’t the flat Earth. I get the argument that there are some really extreme views that some might consider so crazy that it doesn’t make sense to engage them but this doesn’t seem anywhere in that galaxy.

Noticed two typos:

> Conspiracy theories are deadly traps lying that lie in wait for you

> And Pierre Kory MD, an specialist

Feb 15·edited Feb 15

While I'm not as smart as all y'all, I'm a little smart, but also lazy and easily distracted. These posts of Scott's are endlessly helpful to me as I steer my way through my own tendency to accept science at face value and not question conclusions. If I'm a member of the target audience, and I think I am, Scott helps me to be a better thinker and a more honest human without asking me to work very hard. I appreciate that a great deal.

ETA: Apparently I'm on my husband's membership for this site. I'm not TenaciousK, but am often DawnCoyote

My personal tactic is to defer to what people around me believe and accept whatever they say, but let myself break from it if I have accumulated enough evidence and done enough due diligence.

How much evidence and how much diligence? That depends on how far away my belief is going to be from the consensus belief. If it's a slight twist on a consensus belief, not much. Something moderately different means there should be a pretty good body of evidence and I should have spent a good chunk of time thinking things through.

But if it's the total opposite of what everyone around me believes? That requires a lot of evidence and a lot of thinking, significant understanding of arguments for the mainstream position, and really being able to answer every possible counter-argument. Given how much I like to talk about my beliefs, it's the only way to avoid looking like a crank or a fool.

Like Scott's recommendation, this definitely biases me towards believing the mainstream consensus version of things for most things. But life's too short and I'm going to be wrong about a lot of things either way, and I figure this is about the best I can do.
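The sliding scale in this tactic (more distance from consensus, more evidence required) can be written down in Bayesian odds form. A minimal illustrative sketch; the priors and likelihood ratios below are made-up numbers for the example, not anything from the comment:

```python
def posterior(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

# A slight twist on consensus (prior 0.40): modest evidence (LR = 3) tips it past even odds.
print(posterior(0.40, 3))   # ~0.67

# The total opposite of consensus (prior 0.02): the same evidence barely moves it.
print(posterior(0.02, 3))   # ~0.058

# Flipping a 2% prior past even odds takes a likelihood ratio above 49.
print(posterior(0.02, 50))  # ~0.505
```

The asymmetry is the commenter's point: the same body of evidence that settles a question near the consensus leaves a strongly contrarian claim almost where it started.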

I thought there was a lot of truth in your original post on fideism as well (and, for what it's worth, I enjoyed the ivermectin posts quite a bit). Where it erred, I thought, was mostly in how unfair it was to the anti-conspiracy side. For example:

> If there’s some argument I know nothing about - pro- vs. anti- skub, perhaps - and all I’ve heard is that the pro-skub people say that you should look at evidence and decide rationally based on your best judgment, and the anti-skub people say you should never look at evidence and have to trust them - I’m already 90-something percent sure pro-skub are the good guys.

The thing is, anti-conspiracy folks don't say this. They say, "You should trust the expert consensus, because resolving technical disputes is hard, and the expert consensus is definitely sometimes wrong, but much less often than it is right." And you know this, which is why I don't think you would actually be 90-something percent pro-skub. There was a fair amount of this strawman-ish stuff in the post.

But, again, I liked the ivermectin posts, which were interesting as just a sort of epistemological whodunit, and in general I tend to agree with you on the value of engaging with weird arguments.

A good conspiracy theory has both pattern and content. By the time people are adults, most of us have seen or read about an authority figure who was incompetent, self-dealing, or corrupt. So for a theory of the pattern “x authority is actually incompetent,” many people are already open to believing it at least a little, because they’ve seen it before.

Also, for something like the East Palestine disaster, it’s already yielding a fresh trove of arguments against believing experts. “They told me we were safe but then the pets died” feeds directly into “x authority is incompetent.” In fact it’s possible someone has never carefully looked at a scene of competent authority, so trusting the pattern feels normal to them. Making the populace not panic is (imho) usually a more important criterion than the welfare of any one person, for the experts, who are, I think, burdened with various panic scenarios. If only we could learn disaster response as a culture. So the experts could stop overweighting the consequences of panic. Hmmm.

Anyway, then whatever content is superimposed on the pattern becomes more believable to them. An observer can have very little knowledge of y, but if the x incompetent experts say y is good, they must be wrong. For example. A part can metaphorically represent the whole, so if a person has some examples of locally incompetent authority, therefore all authority is incompetent and ivermectin has a strong chance of being a miracle cure.

Some of the election fraud theories work this way. “Look, the election authority is incompetent (example), there must have been voting fraud!”

So a key reflex for resisting a conspiracy theory is the ability to believe an authority might be a little bit competent & non-corrupt. If one can’t consider even a 1% possibility that the authority is just doing their best under difficult conditions, resistance to the theory is unlikely.

This comment got way too long.

From what I read, a lot of sophisticated people in the US who despise conspiracy theorists and conspiracy theories (which are generally coded right these days) believed wholeheartedly in Russiagate.

For anyone wondering about the chess set picture:


Most of this discussion doesn't resonate with me on a personal level. It seems like it comes from the premise of "Believing false stuff is really really bad," when it seems like "ignorance or misunderstanding" is really the "default position".

If we interpret "conspiracy theories" broadly enough, then many "conspiracy theories" have been true. There were periods where many scientific theories, including kooky-sounding ones, were suppressed. There doesn't seem to be any "common sense" way to resolve these kinds of general issues. Many political, economic, philosophical, and scientific positions that are ubiquitous today began as fringe positions, and I don't think this trend will end particularly soon. The lab-leak hypothesis is an example of something that I'm uncertain about but which could be called a "conspiracy theory." In time, it's possible it will be classified as such or, if it's confirmed, people will change the story so that it was always the leading possibility. This is an issue I don't see as easy to resolve just by using good epistemic norms (if I did, I would try to do so).

So it seems like if we want to tackle "conspiracy theories" we might set our sights a little lower and treat it as a special case. It seems like 90%+ of conspiracy theories can be resolved using a few basic habits. Just ask, "If this were true, who would gain by suppressing it? Who would gain by supporting it? What tools do the relevant actors have to push their agenda?"

This seems to dispel most of what I would consider "absurd conspiracies", which includes vax microchips, Pizzagate, the moon landing, Paul McCartney's death, but not "difficult to decide" ones like the lab-leak hypothesis, the drug war's racist origins, US support of Israel because of biblical prophecies, etc.

I endorse every word of this!

One problem with this post, and its predecessor, is that they are kind of missing the point on what the real disagreement is here.

There is an interesting question about the real discernment of truth, and tools like rationality for helping improve on this. But nothing about the ivermectin debate (in the popular consciousness) bears on this. The ivermectin issue is not part of the rationality, discernment-of-truth debate; it's part of the what-is-permissible-to-believe-in-public debate.

For issues like this, subtle questions of rationality and study design are irrelevant. The only relevant questions are what powerful factions are in control of the respectable opinion on the matter, and what that respectable opinion is. Speaking a non-respectable opinion, or denying the respectable opinion, are forbidden, regardless of the truth either way. The "we must deny oxygen to the dangerous conspiracy theories!" response is diagnostic of this class of debate: treating the debate as if it's an ordinary matter of truth-discernment, and addressing the evidence on either side, amounts to denying the authority of The Experts to determine the respectable opinion -- a species of lèse-majesté -- and thus is equally condemnable as actually propounding the "conspiracy theory". And in this context, "trust the experts!" does not mean "identify people with expertise on the subject and trust them"; it means "agree with the respectable opinion". ("Expert" here comes to be defined in reverse as "anyone who propounds the respectable opinion"; people who differ with the respectable opinion are by definition not experts.)

(The race-and-IQ question is a good example here; it is an issue where the best available scientific evidence, and the actual experts, mostly point one way, but the respectable opinion you're allowed to believe in public is precisely the opposite. One can see how impotent mere expertise and evidence are in the face of a power faction determined to suppress them.)

There are issues where your heuristic of "almost always trust the experts" is a good one. Most plain scientific or technical questions that don't have a major presence in the public debate fall into this category. However, if you're dealing with an issue where a power faction has dictated a respectable opinion, then the heuristic becomes worthless. Whether the officially-defined Respectable Opinion is actually correct is more or less up to chance; going along with your heuristic here just makes you into an obedient subject of the power factions.

For various mostly-stupid reasons, the question of ivermectin efficacy (and hydroxychloroquine, and vitamin D/sun exposure, and all the other theories of covid treatment) has always been one with a set respectable opinion. Thus, trusting the experts is not a viable heuristic here. And if you try to use this as "practice in forming your own opinion" -- under the presumption that if you come to a conclusion contrary to the respectable opinion then you're in error -- then you'll just warp your epistemology to become a more obedient subject of the controlling power factions.

"The answer is: absolutely, yes, but also this is how conspiracy theories get you." This isn't just how conspiracy theories get you, this is the ideal human experience, to pose the hypothetical in our mind like a grand fiction so that we can demand of ourselves to create an understanding of these people, this world, ourselves. The conspiracy theory is the undemarcated indulgence in being human, to constantly seek to know, and to have that beautiful, torturous need throw us into passionate order and fixation at a cornucopia of childish and wildly wise levels of knowing...the conspiracy theory is as fundamental as theory of mind, as learning mating rituals. To believe is our mother tongue; to strive towards meaning with our will to illusion is the gift of endless puzzlement that disappears the chains of our determinism. We are fated to believe, and we cannot nor ought we dissuade perseverance in the face of ruin--unless we believe that signal blazes before the stormy rocks out to the quiet ghostship-filled sea of our hopeful need for conversation.

What truly is evidence anyhow?

We have seen crypto speedrunning modern finance and now it seems we are seeing science speedrun the development of the efficient markets hypothesis.

Kavanagh is a Boglehead.

I've seen it argued that a lot of the appeal of "conspiracy theories" (or just heterodox views in general if we wish to avoid the categorization issue of what counts as a "conspiracy") is that it makes the holder feel special.

It seems like maybe the best defense against conspiracy theories might just be being confident in one's own value/personality without needing to ground it in, say, having non-standard views on the Kennedy assassination.

Maybe less Grassy Knolls and more Touching Grass, in short.

Arguably, believing even the most transparently false conspiracy theories isn't measurably bad for individuals or society and it doesn't matter what people think. If someone wants to think that COVID isn't real or that Donald Trump wants you to drink bleach, it's not going to meaningfully change anything.

From this perspective, just doing what we can to silence or ignore annoying people is basic hygiene. Only the unusually altruistic (like Scott Alexander) actually care enough to want to help people think better.

Feb 16·edited Feb 16

You mention the flaw with the heuristic of 'trust experts' (namely, that you might be wrong about what experts think, or someone else might be deliberately misleading you about what experts think), but not the equal and opposite problem of the heuristic 'ignore conspiracy theories'.

To state directly what you hinted at in the post: People who don't want you to believe something will describe it as a conspiracy theory, regardless of what qualities it has or does not have.

The lessons you've attempted to impart with these last two posts (be humble; be vigilant) are important, but if I could add one more, it would be to follow the arguments as much as you realistically can, not because every theory deserves the time of day, but as practice for when you have a real question that you can't just throw your hands up and say 'no opinion' to.

Sometimes an argument will make sense, and a counter-argument will be above your head. (E.g. this pyramid looks artificial; it ends in right angles at regular intervals. And the counterargument being a bunch of complicated geology that I won't even pretend to understand). That doesn't mean the argument has to convince you.

But there's a difference between being persuaded by a counterargument that you know exists that you don't fully understand, and being persuaded by a counterargument that you *assume* to exist, because you didn't take the initial argument seriously in the first place.

There are loads of 'conspiracy theories' that I don't give the time of day to, that I'm sure have at least one or two convincing arguments I haven't looked into, but I don't go around saying 'no one should look into this', because someone has to, there is literally no other possible way to know which is correct.

I didn't read any of your ivermectin posts (I didn't care about the question one way or the other, and a lot of it was over my head), but I appreciated that you took the claims seriously and addressed the arguments rather than taking the completely absurd stance of 'no one has to look into this to know it's false'. And I would have appreciated an exhaustive post debunking the moon landing deniers' strongest arguments just as much (and probably read the whole thing with a bucket of popcorn), because those posts are how truth is discovered and transmitted to others.

Kavanagh's sneering is wrong on two counts. First, he's far more certain that ivermectin doesn't work than the evidence justifies (I don't think it does, but the evidence in favor of it is a hell of a lot stronger than the evidence in favor of moon landing denial). Second, he's doing the very same thing the worst conspiracy theory believers do. He's saying 'I believe [x] because a seemingly trustworthy person said [x], and you should also believe [x], and if you don't, or if you take arguments for [not x] seriously, then shame on you.'

You, Kavanagh, and conspiracy theorists all have one thing in common. You're claiming to have found truth, or at least made an honest and intelligent effort at it.

The difference between you, and them, is that Kavanagh and conspiracy theorists are saying 'I've found truth, it's high status to believe what I believe, and low status to believe otherwise', and you're saying 'I think I've found truth, here's a detailed map of how I got here'.

You might be right. You might be wrong. You might have made some substantial errors, or been biased. But no one, at all, even theoretically, can know that without someone having looked at your map.

And no one can know that the pro-ivermectin people are wrong, or biased, or making substantial errors, until someone has looked at *their* maps and said something to the effect of, 'oh, I see what you did here, you took a left at Albuquerque, but you should have taken a right'.

The notion that by taking bad arguments seriously enough to debunk them you're making the problem worse only makes sense if the bad argument is so obscure that most people won't have even heard of it until you debunked it; and it's very convincing; and your counterargument is either unconvincing, hard to follow, or both; such that people will come away convinced of the very thing you were trying to refute.

You definitely haven't done that with ivermectin (although you may have convinced some people that Atlantis is real).

Would this mindset have protected you against error in 1923? In 1823? In 1023? 2123?

Does it seem likely or not that experts in 2123 will look at 2023 experts the way 2023 experts look at 1923 experts? Does it seem like the amount of progress and change there was from 1923 to 2023 is going to be greater or less than what will happen from 2023 to 2123?

> I’ve tried to be pretty clear that I think experts are right remarkably often, by some standards basically 100% of the time

Maybe “right” means a proper evidence based opinion from Bayesian analysis with some confidence interval? What are you supposed to do if you have differing priors, values, or goals? How do you even reconcile “experts are right…” with the existence of this blog?

I have this unproven intuition that the only way to efficiently fight conspiracy theories is to put the facts out there, e.g. write/talk publicly about how geological phenomena can form underwater pyramids, but do not acknowledge that you are doing this in response to a conspiracy theory; do not even acknowledge that there *is* a conspiracy theory. Act as if you're just genuinely mentioning facts for their own sake.

Feb 16·edited Feb 16

I really relate to these last two posts, having been raised Mormon. Part of the reason it's so hard to escape fundamentalist religion is that the vast majority of arguments you hear against it are really poor, just an insult, or a combination of the two. For example, at least half of the arguments Mormons will hear claiming Mormonism is false are from fundamentalists of different denominations, roughly following the pattern: "Of course you're right that the Earth is only 6,000 years old and God spoke to man then. And you're right about what God said 4,000 years ago when he again spoke. And you're right about everything he did 2,000 years ago when he came down and spoke to mankind in the flesh. But it's completely impossible that he spoke to man again 200 years ago, and you're secretly a Satan worshiper for even considering it."

>Get a sense of what the arguments for the conspiracy theory look like

This was part of what really got me thinking on the right path. Going out and seeing what believers in other religions were saying--how they'd find all kinds of evidence obviously proving their religion true. And realizing that, yah, objectively, I'm willing to dismiss all those things about other religions and to be intellectually consistent I can't use similar things as proof of Mormonism (e.g., eye witness accounts of the supernatural, random miracles, spiritual feelings, etc.).

Then I was able to go through and start investigating things like Evolution and then the actual historicity of Mormonism. And yah, by that point it was all pretty easy to discard via standard Bayesian reasoning.

But initially, from the inside view, it did seem like overwhelming evidence, and the fact that smart people just dismissed it with an insult made it so I could dismiss the "experts," because they obviously hadn't examined it closely and were just ignorant of all the truths I'd discovered that they were too prideful to listen to. (I've probably mangled the memory, but I recall an old LessWrong post where someone tried to defend Mormonism and commenters patiently went through each point and showed how it wasn't dissimilar to the evidences other religions use for their mutually exclusive religions... and seeing smart people who did consider the evidence carefully and charitably and still reject it was a major initial source of cognitive dissonance.)

What's important is that you should treat everything every sharp customer tells you as equally plausible. Especially things you read on blogs about topics like machine learning, statistics, human biological diversity, social justice, and gender. Remember: always do your own research!

One thing that people tend to assume, which I’m not at all sure is generally true, is that believers in conspiracy theories believe in conspiracy theories.

I’ve mentioned this on here before, so sorry to repeat myself, but some years ago I attended an actual conspiracy theory convention in San Francisco, called (amazingly) ConspiracyCon.

I’m not saying there were no true believers there. And of course it’s guesswork to ask what they really believe. But many of the attendees did seem notably relaxed about their commitment to particular theories. The same people would be won over by Saturday’s lecture on how JFK was not assassinated, and Sunday’s about how he was assassinated by then-seventeen-year old George W Bush. The guy who seemed very intense about Israel causing the Haitian hurricane for some reason, was able to retreat to much less specific weather-control mutterings when informed by his friend that anti-Israel theories were leftist. It was all just very social: holding beliefs was secondary to seeming cynical and in-the-know, signalling solidarity and basically having fun with like-minded friends.

I think for most of us probably the majority of our beliefs are like this (self-conscious political / religious / philosophical beliefs, not beliefs like "If I walk out the window it will end badly"). How many people who claim to believe that abortion is literally the same thing as child murder show any sign of behaving the way you'd expect someone who believed that to behave? Same with the anarchist and communist friends of mine who insist that they want to smash the system. When I was a kid I believed… I believed I believed, or stated to myself that I believed… in God, but I didn't behave at all like someone who thought an omnipotent, highly judgmental and irascible being was watching my every move.

I’m not sure there’s much that’s particular about believers in conspiracy theories (though there may be something particular about the style of argument of their most successful proponents). Many of them have just been born into or wandered into a social milieu in which those are the kind of things it’s cool to “believe”.

IMO everyone has a right to try and untie knots in their belief structure b/c no one else CAN just like no one can truly understand 1+1=2 or infinity without going through the process themselves.

This is important b/c otherwise the feeling of living in a Matrix becomes THE primary problem to resolve vs accumulating stockpiles of calories or wealth. “ur not smart & sophisticated” enough is equivalent to “just accept the matrix”. (Root cause of “wanted to escape matrix” likely explains a lot of issues and is similar to “inability to sit quietly in a room alone”)

On a spectrum of conspiracy theory -> widely accepted truth -> Truth of God, these tools resolve knots based on desire for understanding.

Conspiracy theorists just want all the info to be integrated, the arrogant believe they have achieved the god truth, others dare not approach that infinity and aim to just be less wrong.

The choice of tool depends on how strong the load bearing structures are for dissonance. (the greatest structures require reading a 25k word article every day or similar)

Conspiracy theories act similar to a True Belief in God or a touchstone to True Love, an infinite ocean that can dissolve any knot b/c contrast w such Vastness reduces any knot to an infinitesimal.

However, conspiracy theories have to constantly integrate all the new information, all the time. Like every codebase, a conspiracy theory cannot outrun entropy and eventually collapses into a hyper-dense black hole of stubbornness, spaghettifying anything that crosses its event horizon.

My personal conspiracy theory is that conspiracy theories are a shitty replacement for God that prevent an excess of desperate Matrix escaping acts like mass shootings or other “can’t sit quietly in a room” problems. The purpose isn’t to be “right” but to be useful for things like refusing-to-dehumanize-others-even-enemies and empathy-towards-the-personal-task-of-knot-resolving

Sound advice, well written.

As a minor footnote, I do wish you'd added two more defensive mental positions that have served me well over time:

Measurement is king. Do as much measurement as you can yourself. Be more inclined to believe people who display a devotion to measurement. Measurement trumps any amount of beautiful theory, and it's about the only lasting bulwark we have against the universal human tendency to believe plausible, pleasing bullshit. Give the world 10 years in which no one actually died and we would persuade ourselves that death did not exist.

If measurement is king, consistency is queen. Always ask yourself whether such-and-such hypothesis is logically consistent with all else that you know to be true, or have better reason to believe is true. If the FDA is corrupt in approving/not approving this drug, what *else* would have to be true about what the FDA does? Is this what you see? If ancient astronauts landed on Earth and taught Incans how to smelt aluminum and build helicopters, what *else* would be true about history? Is it? If it was fairly easy to discover effective antivirals -- half a dozen can be found just by rummaging around in the existing pharmacopeia -- then would AIDS have been so frightening for so long? Or hep C? Would people still be dying of influenza the way they do? Et cetera.

There are conspiracy theories that are logically consistent with everything you know -- but they're rare, beautifully constructed works of art. Often a quotidian conspiracy theory starts off small, but as soon as you start wondering about its consistency, it has to grow bigger and bigger. Oswald didn't act alone -- requires one shadowy accomplice. But the cops and FBI didn't figure that out -- now we also require the police and FBI to be incompetent. Multiple blue-ribbon commissions didn't find any reason to credit the theory -- well, they're in on it too, the conspiracy is bigger than we ever thought! And so on. When your conspiracy has to get bigger and deeper every time you poke at even minor inconsistencies, that's a warning sign.

CK's position: Trust the experts except when it's politically inconvenient.

Now THIS is why we love Scott Alexander.

There is something creepy about rationality. It makes me feel like that scene in Akira Kurosawa's Dreams where the ice demon is telling the guy fainting from hypothermia to "go to sleep", that "the snow is hot".

I don't think we need more rationality, collectively speaking. We need better ways of unleashing our passions. It's probably good that experts strive to be rational clear thinkers, but this stuff isn't for everyone. Brahmins, experts, can't actually rule either: it is the Kshatriyas who do that, and they are supposed to listen to the experts.

If there is one great sin of the rationalists, it's one from the Bhagavad Gita:

> So let the enlightened toil, sense-freed, but set

To bring the world deliverance, and its bliss;

Not sowing in those simple, busy hearts

Seed of despair.

The story of rationality is a seed of despair. "Submit to the experts, who are more reasonable than you". "Don't have faith, that's delusion." "Hope the Singularity sorts everything out". This stuff is poison in its present form. It would be much healthier to approach rationality as a tool you deploy in certain contexts (such as the one in this article, to be fair), not as a final answer to every last thing.

So if this is a proper context for rationality, why the triggering? Probably the picture of Dawkins did it. I just don't buy Dawkins or Yudkowsky understand reality more than Jesus did. And I suppose they have arguments for why they are, in fact, better than Jesus, even if they have never actually phrased it that way.

Rationalists do have a god, and it is a god of ice. I would rather follow fire, but I also believe the two can coexist. The ice would need to be subordinate though.

Expand full comment

I just try generally to not have beliefs.

Expand full comment

>I come back to this example less often, because it could get me in trouble, but when people do formal anonymous surveys of IQ scientists, they find that most of them believe different races have different IQs and that a substantial portion of the difference is genetic.

IQ scientists (mostly psychologists) are not exactly a reliable source for developing intuition on this question. But some people are pretty well wedded to their uh, priors here.

Expand full comment

It seems to me that you're using "conspiracy theory" to mean the same as "questionable or crackpot science." Unless the "conspiracy theory" is just about the efforts of "the establishment" to "suppress the truth."

It seems obvious to me that there should be a fair amount of tolerance for discussing "questionable or crackpot science." My go-to example is plate tectonics: until sometime in the 1960s or 1970s, plate tectonics was generally regarded as a crackpot theory. The incidental supporting evidence (the way the shape of South America seemed to "fit" into Africa, corresponding geological strata that seemed to appear on opposite sides of the ocean) was considered weak, coincidental, and unconvincing. The idea seemed to be crazy - the Earth essentially changing shape constantly. But, scientists kept debating, and eventually most of them bought into the crazy theory. One lesson: the establishment and experts can be wrong, and can unfairly trash people who are right. Another lesson: without continued open discussion, the right answer would never have become accepted.

This example is also notable in that the result had relatively little importance outside of earth science fields. The scientists had their arguments, and the rest of us didn't need to pay much attention. If we did pay attention, and we backed the "wrong" side, it didn't hurt us much.

However, if the issue had become attached to an issue with political salience, the political issues would have consumed the scientific question. If the "pro" side had been championed by people who urged the end of ocean shipping because ship turbulence could cause plates to fracture, or if the "anti" side had been championed by people who wanted to end oil drilling because it could cause volcanos, the scientific merits would have been ignored by the activists.

I think we saw some of this effect with respect to Covid and Ivermectin. There were people who were tired of the "experts" telling them what to do, and wanted something that would show the experts were no better than the ignorant rubes they tried to push around. These people probably picked the wrong issue in focusing on Ivermectin. There were plenty of other illustrations of the fecklessness of experts, and of the over-reliance on experts by politicians to mandate questionable policies in the name of threat reduction. In the case of Ivermectin and other Covid topics, I think the prudent medical approach would be to let doctors carefully explore many different clinical approaches, sharing their results and trying to spread the learning. Meanwhile, conduct good scientific tests to get "hard" information and share that too. Eventually, we'll get better information. Unfortunately, more people will die than if everyone used the "right" treatment from the start, but we don't know the "right" treatment, so the unfortunate result is unavoidable.

The policy implications on Covid topics turned out to be pretty large: How much economic lockdown should we implement? What protective measures should be mandated? Which risks matter more: teachers who face potential mortal risks if infected vs. students who face lost education if schools are disrupted? These are not scientific questions, but value questions involving risk tradeoffs under uncertainty. In many areas, politicians took the authority of "expert" opinion to impose measures of questionable value. In the process, personal and institutional incentives got distorted:

- Experts didn't want to confess ignorance and cede the spotlight to others who would express no doubts

- Politicians didn't want to appear helpless, so had a strong incentive to "do something"

- Government bodies had an institutional incentive to show themselves boldly taking charge in a moment of crisis, thus justifying their past funding and giving them arguments for much larger future funding.

I think we see more of this effect with respect to climate change. There is pretty basic science, well established for many years, that carbon dioxide is a greenhouse gas, and increased atmospheric carbon dioxide will lead to higher temperatures, all other things being equal. The activists have seized on this point, and some follow-on speculation by some scientists, to conclude that everyone will die unless we stop using fossil fuels very very soon. Too many of the other side started by questioning the science of greenhouse gases, or the temperature record, when they should have focused their efforts on the weaker parts of the activists' program: how much warming can we expect, and how much harm (balanced against how much benefit) comes from it?

There certainly are conspiracy theories around the climate change issue: Climate change is a hoax perpetrated by scientists who are trolling for funding, vs. Climate change dissenters are anti-science hacks funded by fossil fuel interests. Most of the scientific papers I've read on climate change are rather modest in their claims, but the authors are maligned by those who reject all claims, and over-interpreted by those whose agenda is anti-capitalism and anti-technology.

On this issue, the policy implications are potentially all-encompassing. There are experts all along the spectrum from "there is no significant issue" to "we must transition away from fossil fuels within the next few years or face catastrophe." The policy issue, though, isn't scientific at all, and no amount of expert judgment can solve it. It will ultimately depend on citizens informing themselves as much as possible, confronting the hard choices and tradeoffs, and being willing to live with the consequences.

All of which is to say I agree with what [I think] Scott is saying: we really need to try to understand these issues. We should respect the expertise of the experts, and try to understand their judgments. The experts owe it to us to take resistance to their conclusions seriously and engage them on the merits.

Expand full comment

Another problem with the "idiocy" heuristic is that you'll eventually encounter conspiracy theories that are true, and then your whole framework will be thrown into doubt and you'll be primed to think They Were Lying About Everything. I mean, Nixon really did interfere with the Vietnam peace talks for his own political gain. There were no WMDs in Iraq. Either Russia was secretly backing Trump's campaign, or the whole thing was a rumor from Hillary's campaign (a conspiracy either way!). If you treat conspiracy theories as some special realm of debate too stupid to entertain, and then you find out one of them is valid, you'll suddenly be wondering whether anything you know is true at all.

Expand full comment
Feb 16·edited Feb 16

Your advice to the young internet newbie doesn't have a single axiological ideal written into it. It's as neutral as the laws of physics, leading people to believe whatever is the exact consequence of the advice, no matter their relation to ideals. I believe this is a mistake, and I point to the example of our universe as evidence.

To build intuition of the character of amoral systems:


Expand full comment

Things like archeologists being wrong, mistakes being made about drug effectiveness, or some government or corporation doing a cover-up don't really feel like conspiracy theories.

Expand full comment

This essay reads like it was written by someone who has never even heard of the replication crisis in expert publications. The advice that closes the post is really terrible, as is the claim that "experts" are overwhelmingly correct on things. There are very few fields of expertise that genuinely have the certainty of the limited number of hard physical/engineering sciences. Huge ranges of academic fields -- social sciences, psychology, much of psychiatry and public health -- should not be taken at face value when they conflict with common sense. Even "hard science" research in medicine, such as pharma and biological research, is substantially corrupted by various misaligned incentives, to the point that our priors need to be skeptical on a broad range of medical claims. (I've been saying for years that Scott needs to read the great book "Medical Nihilism" by the distinguished philosopher Jacob Stegenga). Simply put, there is no substitute for learning to think for yourself and think critically.

Most "conspiracy theory" claims are attempts to manipulate social/emotional coalition dynamics and deference to authority to discredit stuff that might be true but for practical or political reasons people don't want to test. That of course doesn't mean that most conspiracy theories are true. It's at least as difficult to evaluate the truth of "conspiracy theories" as anything else. But naked efforts to use raw authority claims to exile perfectly sensible claims from the public sphere are absolutely everywhere these days, and a post like this which claims this is a simple issue is quite misguided.

Expand full comment

Suggested reading:

Michael D. Gordin, "Immanuel Velikovsky and the Birth of the Modern Fringe" (2012)


Suggested viewing:

Dan Olson, "In Search of a Flat Earth" (2020)


Expand full comment

I skipped to the comments after reading your 3 definitions of conspiracy theories, because I had a rare moment of synthesis.

I’m not sure I’ve heard anyone put it this way yet, but it seems like a formulation Scott might make:

Part of the problem is that there are two forces working against each other. One is that the most adaptive ideas are not always the most true, and in the internet age they can spread fast. The other is that there’s no place to stand to know what’s “true” ex ante.

That's where we get experts labeling something misinformation one day and a guideline the next. That's one failure mode. Another is QAnon being a really sticky, sexy worldview. Or Bret Weinstein's anti-vax stances. Or the 1619 Project.

Where this cashes out (maybe?) is that we shouldn’t blindly trust the experts, which is not the same as saying we don’t need trustworthy institutions. There’s simply too much information to keep track of to do our own research on everything.

Expand full comment

Chris Kavanagh (an anthropologist) co-hosts a podcast looking into the Guru-sphere, "Decoding the Gurus"; it is all done with good humour and gentle roasting. I wouldn't say he is an "anti-conspiracist", but he does show how the so-called Gurus tend to sow the conspiracy mindset as a way to promote their brand, boost their egos, and sell mountains of supplements. The alleged conspiracy itself is beside the point; rather, it is a nexus of interaction between the Guru, the followers, and the wider discourse. The "rationalists" (of which Scott is a prominent exemplar) entered the nexus when attempting to unravel the ivermectin thread, which has been an ongoing Guru talking point.

Expand full comment

You write too fast, Scott! I'm still composing my response to your first post. Of the three categories you offer regarding conspiracy theories, I'd fall into the 'Intellect' understanding. I agree that ordinary heuristics can lead to conspiratorial thinking, that actual conspiracies exist, and that it is likely everyone, including very smart people, falls into conspiratorial thinking on certain issues. However, I would disagree that conspiracy theorists and conspiracy communities are just applying standard reasoning methods with some different priors/weightings. They are not. There is good work on this topic, and I think it is pretty well documented that conspiracy communities promote rather special forms of logic/thinking that are unusually bad for reaching accurate answers. See the work of Stephan Lewandowsky on what distinguishes "conspiracist cognition", for example.

Expand full comment

Here's my rule of thumb: physicists like the simple and beautiful answer, but I'm not a physicist. Activists like the empowering, righteous answer, but I'm not an activist. What I am is a limited human being in a very large world, so to me truth usually tastes like a meal not prepared for humans. Actually, I don't find the truth for myself; I pick the experts who prepare the meal with the most apologies and caveats.

Expand full comment

I feel like the general idea of "don't bother looking that up, it's a conspiracy theory" maps to like, 20% or more of the arguments I see online in a way I can't figure out how to directly articulate. It sort of looks like "that's bad because it's word that means bad" and the response is predictably:

"only idiots believe in conspiracy theories, here's 500 well researched sources from lesser known experts about lizard men"

"it's not a scapegoat, the goat is guilty"

"I'm not racist, *I* didn't pick who decided to run the evil secret government trying to kill god."

"extremists are dangerous and that's why we need to immediately purge anyone who seems like they aren't 100% with us"

"I'm not entitled I just think I have a right to that thing you have"

I made those extreme, but the general structure is pretty much always going to be true for the person stating it. And sometimes the goat really *was* behind everything. But the internal experience of the person making the argument is going to be identical either way, and you knew the person believed in __ when they started arguing for it, so making the argument is usually somewhere between posting a warning sign for observers and trying to bring the social shame of the bad word unto them through a ritual chant. The person it's being used on is more likely to go "huh, I wonder what other conspiracies are true" than be convinced of anything.

And it's everywhere. You might be thinking of some culture war issue, but the exact same structure happens basically everywhere; e.g., in gaming discussions, look at when people use negative phrases like "entitled". It makes some sense when the word is a big culturally loaded negative word that you can't be seen associating with and get points for dunking on, but people do nearly exactly the same thing for mildly negative words like "grindy".

Anyway, this was meant to be a question before I wrote all that. I feel like there's a name for this and I can't think of it. Does anyone know? Something around argument by labeling. thought terminating cliche. argument by word association. semantic argument. argument by I don't want to read all that so I'm just going to guess., argument by you kind of remind me of my college roommate so I'm pretty sure I know *exactly* what you mean. argument by valence.

It's really important I have a word for this so I can quickly identify when people are doing it and stop listening to them.

Expand full comment

A great elaboration of the idea that it's mistaken to trust the experts: https://fakenous.substack.com/p/is-critical-thinking-epistemically . I admit there is a real epistemic hazard in the area, but I think the critics fail to appreciate the ability of non-experts (with appropriate epistemic humility) to usefully evaluate the strength of expert arguments, as well as the importance of these marginal changes in our credences. Besides being practically important, it's ultimately how we identify experts deserving deference (the physicist about the reactor) and those who don't (priests, theologians, some philosophers-ish [1] about moral/meaning issues).

I think a good metaphor for our ability to evaluate the strength of an expert argument is an interactive proof system. A verifier (amateur) with highly limited computational resources and a random string can, with very high probability, verify the claims of an untrustworthy, computationally unbounded prover (expert), even though the verifier (assuming P != NP) couldn't hope to determine the answer on their own.

Non-metaphorically, we can examine the claims made about the strength of arguments and then double-check that the details seem to agree. Is the evidence as strong as claimed? Do the arguments seem sound and grapple with the best counters, or do they disappear into a mist of fog and certainty? Yes, you'll sometimes miss things and think the evidence says something it doesn't. Just don't over-interpret any one data point and be open to correction. Can they keep answering "why?" and "justify that" questions in a way that suggests a coherent shared view, or do they obfuscate and insist that you can't evaluate any of it until you've invested so much that you won't doubt its value? Does the actual factual content of claims seem to be a key issue, or do the 'experts' seem happy to approve of claims that are imprecise/different as long as they share the same social/value valence? It might be surprising but, over time, it's surprisingly easy to get a sense of whether a subject has the kind of experts who deserve our deference in credence.
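The interactive-proof metaphor in this comment has a classic concrete instance: Freivalds' algorithm, in which a cheap verifier spot-checks an untrusted prover's claimed matrix product. A minimal sketch (the matrices here are made-up toy data):

```python
import random

def freivalds_check(A, B, C, trials=20):
    """Probabilistically verify the prover's claim that A x B == C.

    Each trial multiplies by a random 0/1 vector: O(n^2) work per trial,
    versus O(n^3) to recompute the full product yourself. A wrong C
    survives any single trial with probability at most 1/2, so 20 trials
    leave an error probability of at most 2**-20.
    """
    n = len(A)
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        Bx = [sum(B[i][j] * x[j] for j in range(n)) for i in range(n)]
        ABx = [sum(A[i][j] * Bx[j] for j in range(n)) for i in range(n)]
        Cx = [sum(C[i][j] * x[j] for j in range(n)) for i in range(n)]
        if ABx != Cx:
            return False  # caught the prover lying on this spot check
    return True  # claim survived every spot check

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
honest = [[19, 22], [43, 50]]  # the true product
lying = [[19, 22], [43, 51]]   # one entry wrong

print(freivalds_check(A, B, honest))  # True
print(freivalds_check(A, B, lying))   # False, except with probability 2**-20
```

The verifier never computes the product itself; it only checks that the prover's answer is consistent under random probing, which is the spirit of the metaphor: you can't derive the expert's conclusion yourself, but you can cheaply test whether it holds up where you poke it.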

Secondly, we are rarely asked just to bet on what's the most likely theory. We have to make calls in the real world with costs and benefits, and even if evaluating the data yourself should only let you adjust your credences by 10% (relative), that might still be pretty important if it's the difference between a surgery and no surgery.

Some claims are so improbable we can treat them as if they were really impossible, but for most claims (including ivermectin), even the experts, if their honest opinion were solicited, would give an answer far enough from 0/1 that cases plausibly come up in the real world where that difference from 0/1 matters. I mean, should you treat the patient with COVID and parasites with ivermectin or another drug?
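As a toy illustration of why a modest credence shift can matter (all numbers here are invented for the example, not from any real treatment): a simple expected-value calculation can flip sign when your credence moves by just ten points.

```python
def expected_benefit(p_works, benefit_if_works, cost):
    # Expected net benefit of treating: upside weighted by credence,
    # minus a fixed cost/side-effect burden paid either way.
    return p_works * benefit_if_works - cost

# Hypothetical numbers: the treatment is worth 10 if it works,
# and carries a cost/side-effect burden of 3 regardless.
expert_credence = 0.25  # "probably doesn't work"
your_credence = 0.35    # after reviewing the evidence yourself (+10 points)

print(expected_benefit(expert_credence, 10, 3))  # negative -> don't treat
print(expected_benefit(your_credence, 10, 3))    # positive -> treat
```

The point is not that amateurs outdo experts, but that decisions live on a continuum: far from 0/1, small honest updates can change the right call.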

[1] I'm a big fan of philosophy (trained in it in grad school and married to a philosopher). But the kind of philosopher who studies ethics or life-meaning-type philosophy isn't an expert at the object level (what's ethical, how to live) the way a physicist is. They are an expert in the range of arguments and the scholarship.

Expand full comment

"If people are trying to confuse you about who the experts are, then to a second approximation trust prestigious people and big institutions, including professors at top colleges, journalists at major newspapers, professional groups with names like the American ______ Association, and the government."

I'd add in the qualification that experts should only be trusted within their field of expertise, even to a first approximation. When the American Medical Association puts out a 54 page guide on how to be woke, they can be safely disregarded, because the AMA is not an expert on political issues: https://www.theatlantic.com/ideas/archive/2021/11/leftist-language-policing-wont-fix-health-disparities/620695/

"Instead of saying, “Low-income people have the highest level of coronary artery disease,” it urges health professionals to substitute this doctrinaire sentence: “People underpaid and forced into poverty as a result of banking policies, real estate developers gentrifying neighborhoods, and corporations weakening the power of labor movements, among others, have the highest level of coronary artery disease.”"

Expand full comment

"The top and bottom chess sets are the same color, and only look black vs. white because of contrast effects."

This is such a dumb, pedantic point, but I'm a professional vision scientist so I feel obligated to make it. The top and bottom chess sets are the same *luminance*, not the same color. Luminance is a physical quantity, color is a perceptual experience.

Our brain assumes that nearby objects are receiving similar levels of illumination, and under that assumption it's likely correct that the top chess set is absorbing less light than the bottom chess set. Color is our brain's attempt to reconstruct object reflectance, and that reconstruction tends to be more accurate using the contrast along object edges rather than absolute luminance levels. Either way, our perceptual attempt at that reconstruction is what defines color. The physical reality giving rise to that percept is something else.
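The luminance/color distinction in this comment can be made concrete in code. Luminance is computable from pixel values alone (below, using the standard sRGB/WCAG relative-luminance formula; the patch values are hypothetical), while perceived color depends on surrounding context that no per-pixel formula can see:

```python
def relative_luminance(r, g, b):
    """Relative luminance of an 8-bit sRGB color (WCAG definition):
    linearize each channel, then take the Rec. 709 weighted sum."""
    def linearize(c):
        c /= 255
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = (linearize(c) for c in (r, g, b))
    return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl

# Two patches with identical pixel values have identical luminance,
# yet can be *perceived* as different colors depending on surround
# (as in the chess-set illusion).
patch_in_shadow = (120, 120, 120)
patch_in_light = (120, 120, 120)
print(relative_luminance(*patch_in_shadow) == relative_luminance(*patch_in_light))  # True
```

This is exactly the vision scientist's point: the physical quantity is identical; only the percept differs.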

Expand full comment
Feb 16·edited Feb 16

In the past, humans died.

Women detected that men that gathered more resources through any means survived.

Men developed the methods to control populations, organizations and nature itself.

Men practiced them. Some lineages kept those practices and passed them on more successfully.

Men's interests in affairs, status, politics, ideas, science are known.

Control over one's destiny is one life-strategy. Letting others control it (wage-slave, submission) is another life-strategy.

There is natural variation everywhere.

Cooperation and competition are natural forces.

Successful cooperators can manipulate any group of entities or forces, just as we do with animals, we do to other humans.

Thus, from first principles, we already deduced there are competing cooperating groups that seek to control the world.

We see empires coalesce, we see states coalesce, we see corporations coalesce, we see cells coalesce, we see organisms coalesce; the main feature is that they develop a greater 'consciousness', an emergent ability to manage and control their environment to their desired parameters.

Making plans against other humans is natural.

Anyone bibby-babblying about the possibility of thought of thought or emotion of thought is just circlejerking idiocy.

Network theory works just as well on human networks. Centrality, nodes, connections, information streams.


Humans at the highest elevation of power already enjoy all the world's excesses. Fame, wealth, women, land, properties, businesses. After all is said and done, the things not in their control:

1. Human evolution, human societies, human capabilities (are limited)

2. Life, death is presently inevitable

Thus (1) and (2) are both logical pursuits to all those who keep the principle directive (continuation of their own informational-genetic legacy).

No need for affirmation of facts.

No need for affirmation of data points.

First principles dictates everything.


Resources are rate-limited (in regeneration) and finite locally. Scarcity is real.

Killing off, or preventing, other competitors from rising to the top is seen everywhere throughout the animal kingdom. More resources for me and my kin, my tribe, my group = less for everyone else. Dynamics rule. Useful subjects = generate more resources, generate more hedonistically desirable things. Cognition, aptitude, willpower and opportunities intersect to produce civilizational products. Thus, a logical policy: subjugate all those who do not fit with my value functions and exterminate the rest.

Once again, no need for hard theories or anything. This is a fact. This is observed in all primates. Stop running from evolutionary theory. There is no reason to believe that we, who have the same substrate as other entities, would behave any differently; just because something is more complex, or we have more intelligence, does not mean our primitive inclinations differ in any aspect whatsoever. It is irrelevant how many groups there are, because in the long run genetic fixation of conserved sequences occurs once a certain threshold of dynamic equilibria has been established between the lower levels of energy and higher levels of energy. This is a natural anti-entropic process of all living species, all living communities and, I dare say, all algorithmically-induced structures composite of the universe.

Expand full comment

On the term "conspiracy theory":

PSA: In order for an assertion to be a "#conspiracytheory," it must allege a SECRET agreement between two or more people to do something nefarious.

Not just something you believe is far-fetched.


Expand full comment

Typo: "Instead, let me try describing exactly what I would advice I would give"

Expand full comment

Having read all your links at the top, I think you're being way too kind to Kavanagh in this article and had the appropriate tone in your first.

Expand full comment

>Nobody needs to be sure whether Kennedy was assassinated by a lone gunman or not.

Are you sure? Because, while I'm not especially familiar with conspiracy theories, most of the ones I've heard about this one involve the CIA being the ones who assassinated him and/or covered up the truth, which seems massively important to me. If it was just the case that Oswald had a hidden partner who was equally deranged but got away with it, then it doesn't matter anymore. But if there was actually a conspiracy, and a powerful U.S. government agency committed treason, got away with it, and still exists, then it would be incredibly likely that it is still just as corrupt, just as powerful, and therefore willing and able to commit similar levels of treason today. You can't just believe that the CIA assassinated our own president and not care whether anyone knows about it. This would be incredibly important if true.

A lot of conspiracy theories are this way, and I think you're just underestimating them because the weak version of the Atlantis one (where they're an ancient society but don't have crazy sci-fi technology) probably wouldn't matter. But for many, even if the object level fact that the conspiracy theory claims didn't matter, the conspiracy by elites to cover it up via deception and malice (not just ignorance) is of great importance because it casts doubt on their honesty and legitimacy as authorities. If they're lying to you about this, what else are they lying to you about?

You don't have to believe that they're true to acknowledge that if true they would change a lot about how appropriate trusting the experts was.

Expand full comment

Regarding the "naive positions" on conspiracy theories...

I think of a conspiracy theory as one that uses certain (bad) steps in its reasoning - usually the "evidence to the contrary is being covered up/generated by powerful people" thing.

The obvious problem is that the opposite of this, "always trust the experts", also has its problems. So conspiratorial thinking is bad, but you can argue its exact opposite is also bad.

My answer is that it's not just a question of distrusting experts, but of having an idea of when to trust them and when not to. I don't blanket-trust experts, but I have certain expectations about the *way* they'll fail, expectations that conspiratorial thinking tends to violate. I could fill a much longer comment about when to trust vs. distrust experts, and I don't know if I could describe it well ... but it has a healthy dose of "I know it when I see it".

Expand full comment

Regarding whether you should have addressed the question at all...

I have some sympathy for the "don't elevate it by treating it seriously" idea. But I also think it's useful for someone to earnestly debunk something like this. Reminds me of how I read somewhere that most people aren't very price sensitive when shopping, but stores have to compete on price because of the small portion of people who are hyper-price-sensitive.

Ultimately the thing I'm against is certain conspiratorial paths of reasoning, not the conspiracy theories themselves; and relatedly in the case of ivermectin, the way it played into anti-vaxx bullshit. That's what bothers me about the ivermectin thing, not anger at the idea that deworming meds might be useful against COVID.

Expand full comment

I don't think an apology was owed. The reason Scott got angry was because the posts which triggered this had a dismissive and slightly mocking tone that was seemingly meant to lower the status of people like Scott. It doesn't sound like this anger clouded any judgement, though I appreciate the extra post. So yeah, no need to apologise for getting angry at a status attack.

Expand full comment

I would add one piece of advice:

"Many actual conspiracy theories share common aspects that make them effective memeplexes. Familiarize yourself with the most common cognitive biases, logical fallacies, and rhetorical strategies, and cultivate an internal alarm that goes off when you see more than one or two of them in one place."

I actually took a university class on Pseudoscience, taught by a physics professor and dyed-in-the-wool skeptic. He handed out a useful list of criteria to distinguish pseudoscience from actual science. Three of the most salient were "mistaking noise for the signal", "arguing from ignorance", and the "false dilemma". I'll check tonight if I can find the list.

That class, and later internet exposure, also informed my view of the skeptic community - namely, that skeptics were often the only ones to do very thorough takedowns of controversial and outlandish ideas. Some of the best skeptics became so experienced that they could smell bullshit from a mile away, and grew tired of doing a deep dive every time; some not-so-diligent skeptics never put in the hard work to acquire that skill, and jumped straight to the jaded, dismissive cynicism that Scott deplored in his first post.

Expand full comment

Dismissing the Covid lab-leak hypothesis out of hand is probably the greatest fillip to conspiracy theorists so far this century.

Expand full comment

Generally the difference between a conspiracy theory and a theory is a conspiracy.

In any given case it may be true that millions of people are mistaken, or that a small, powerful group of people has interests that are different from ours.

But hardly ever does an entire class of people deliberately work together to tell us something obviously false. The likelihood that they did gets lower as the group gets bigger or the lie becomes more blatant.

Likely: "The New York Times/Breitbart has a powerful incentive to believe and print things that are not true, and it's important to remember that when reading their articles, and take into account their track record and their incentives."

Possible but be careful: "A small subset of folks in intelligence agencies worked with the president to sell arms to both sides of a middle east conflict in order to pay drug lords to fight socialists."

Almost certainly wrong: "Every healthcare worker on Earth is working in concert to convince us that people are dying of a pandemic, a fact most of them have personally observed to be false."
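The scaling intuition in this comment can be made quantitative with a toy model (my own illustration, not the commenter's): if each of n participants independently keeps the secret with probability p over some period, the whole conspiracy stays secret with probability p^n, which collapses fast as the group grows.

```python
def p_secret_holds(n_conspirators, p_individual=0.999):
    """Toy model: probability a secret survives if each of n people
    independently keeps it with probability p_individual.
    The 0.999 default is an arbitrary illustrative value."""
    return p_individual ** n_conspirators

# A small intelligence cell vs. every healthcare worker on Earth.
for n in [10, 1_000, 1_000_000]:
    print(n, p_secret_holds(n))
```

Even with each individual 99.9% reliable, a ten-person plot plausibly holds, a thousand-person one is a coin flip at best, and a million-person one is effectively certain to leak, matching the likely/careful/wrong ordering above.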

Expand full comment

Never underestimate how narrow common knowledge really is. Should we try denying oxygen to ivermectin? Look, there are plenty of people who don't know what *oxygen* is, and they aren't idiots or children either. Plenty of people have not heard of ivermectin even now.

For that matter, the idea that Dawkins might want to not debate Christianity so as not to spread their ideas at all actually makes sense, in my experience. I'm from a country where Christianity isn't the dominant religion historically. I know many people whose prior on Creationism and Christianity in general went up because of atheists' arguments including Dawkins'.

Whether the effect of a debunking is to increase or decrease overall public credence in an idea is an empirical question that can only be answered case by case. Sometimes it may really be better to let it lie than to dignify it with a response, but I don't think there are any simple principles we can apply to know when.

Expand full comment

Really good post, thanks. A comment about "trusting experts": I'm a biologist. When a mathematician tells me that a theorem whose meaning I sort of understand is true, it leads me to assign a truth value of >99%, whereas when an economist tells me that a macroeconomic intervention whose meaning I sort of understand will lead to a particular result, I assign a truth value of maybe 60%. I don't consider the mathematician a better expert; instead I think mathematics is a topic where "true" and "false" are exceptionally clearly defined and understandable. (In my own micro-sub discipline I tend to be very skeptical of results contrary to my own opinion, but that's another story).


It's great, but I think it should be expanded beyond just conspiracy theories, so that it becomes a point about false beliefs in general. I think belief in sinister conspiracies is actually right-coded in some way, and as such much of what gets classified as conspiracy stuff is really just false ideas/propaganda oriented towards right-wing people. An idea need not involve a conspiracy in order to be damaging, and a lot of false ideas held by leftists appear to be damaging to themselves and to society at large, even though the ideas themselves sound harmless and obviously good to the people who hold them. I would cite examples, but it wouldn't do any good, because it would just start looking like mean-spirited left-bashing, and any leftist reading this might feel defensive for no good reason - I like leftists in general!

I can supply an example from my own experience: as a youngster I came into contact with the ideas of marxism and communism and read up on it a lot. I have speculated a lot about just what about my own psychological profile made me fall for it, but in the end I just had to admit that the marxist literature is very impressive and describes and diagnoses some real problems really well. It's also true that most of the counter-arguments I encountered were pretty dumb and seemed to lack good faith or even seriousness. I had not yet learned the lessons of the cowpox of doubt, or encountered rationalist forums.

Speaking of, it used to be that I sometimes linked my friends to one certain SSC post to bring them out of their overconfidence in some pet theory or issue; I think it was this one: https://slatestarcodex.com/2015/08/20/on-overconfidence/

I have tried to replace that one with the trapped priors post but it was a bit too heavy unfortunately!


Too many words.

In the middle you point out that it still sounds too rambling [rambly?]. Well, it was and still is! An essay is not a conversation at a party, where you can't really go back and unsay things, so that there is unavoidable accretion. In an essay, you can (and should, out of respect for your readers and yourself) edit out stuff, rearrange, simplify, and distill before you release it. Nor is an essay/blog post like a comment on a blog, which is more of a quip.

Paraphrasing the poet John Ciardi: that which is written without hard work is usually read without pleasure.

Rewrite and resubmit.


Just a datapoint here. Some of my in-laws, who are more conservative than I am, shared something about ivermectin. I replied "I was curious about some of the results people were seeing with ivermectin, and looked into it. It turns out that most of the successful studies of ivermectin were done in places where parasitic worms are common. So ivermectin did help Covid patients survive, but mainly by reducing the burden of parasitic worms the body had to deal with at the same time." It satisfied them-- they weren't committed to the idea, they'd just heard some interesting data and once they had a good explanation for it, they didn't need to press the point.

I guess my point is this sort of thing can help people who aren't so committed to the theory yet.


Umberto Eco, The Force of Falsity


Since Scott thinks theology is a helpful lens through which to consider what we often call conspiracy theories, I'd like to jump in with another example--which backs up both his position and many of the comments. In his commentary on the story of Ahab, St. Gregory has to account for the fact that God seems to a) live in physical space, since he has angels on the right and left of his throne, and b) deceive Ahab. One argument takes care of both problems: the angels described as being on the right are those loyal to God, while those on the left are the rebel angels--both do his will. That is, even in their efforts to manipulate Ahab, the fallen angels can't help delivering God's truth. (Ahab's problem isn't rational; he just needs to figure out which prophet to listen to. Of course, that's never easy. Sometimes the consensus is right, but the ancient Hebrew consensus was often wrong, which is why prophets were so necessary, even if they were so often ignored. As many of the comments have pointed out, identifying authority worth trusting is at least as hard as thinking correctly. Pierre Kory sure _seems_ like someone you can trust on the question of acute respiratory illness.)

But I'm not introducing St. Gregory just because the consensus here seems to agree with him. I am also struck by the way the dialectic of good and bad angels seems to be the only stable approach to truth. Toby Ord and Nick Bostrom make a pair of points which argue that reliance on either angelic cadre would be suicidally dangerous. Bostrom argues that in a world where a "black ball" technology is possible, human civilization requires global surveillance and strict information control to protect ourselves from extinction. Ord points out a subtler danger: that any attempt to rigorously encode what we now call the consensus can only lead to a dystopia of crippling opportunity costs.
We would be fools not to fight the zombie armies of misleading narratives and polluted data, but we'd also be naive to think that somehow banning them would leave us any better off.


I feel like this entire post is undermined by you reminding me that you think the question of who should have succeeded the prophet Muhammad is a pointless one. The only question I could see as more important in Islam is whether the Quran was created or not.

Feb 16·edited Feb 16

"including professors at top colleges, journalists at major newspapers, professional groups with names like the American ______ Association, and the government."

How about the Association of American Physicians and Surgeons? Oh, but they are pro-Ivermectin.


Turns out there's nothing stopping people with non-mainstream ideas from giving themselves a fancy, official-sounding name! Oh, and they've existed since 1943, so it's also not as if the organisation was created specifically to give some fake legitimacy to the covid-quack-cures cause.


My view on ivermectin, like my view on most COVID treatments in 2020-2021, was something like "Sounds neat, I have no idea if this works, I'll wait and hear more when information is available." I don't think I was ever "pro" ivermectin, but I definitely became more sympathetic to the people prescribing and taking it when the response went from "we're trying a variety of treatments for COVID" to "HAHA, those idiots are eating horse paste!" Even with no knowledge of the pros or cons of using it, the response was ridiculous. I don't remember a response like that to any other drug or treatment in my life. It was the kind of response you might expect when someone talks about huffing paint - and even then, the news is more sympathetic to the paint-huffers than it was to the people taking ivermectin.


There has to be some connection to power to make any of this make sense. If people with no power believe some conspiracy theory, it does not matter at all (and it might even be good).

For example, lots of people believe that the earth is flat. Given who these people are, it has literally zero effect on society or scientific research.

On the other hand, lots of people believed Russia successfully penetrated US elections enough to change the course of our politics. Those people were sufficiently powerful that it's arguable that hundreds of thousands of people are now dying in a war that wouldn't have happened if no one had believed that conspiracy theory.

I'm aiming for extreme examples here, but take something more mundane. If people start to believe Graham Hancock's theories, it's hard to see anything too bad coming of that. Maybe there will be more funding for archeological work in the Amazon or underwater. I don't really see a significant downside. Hell, it might even be good for archeology if more people started to care - even if that caring was a bit misplaced.


Minor, somewhat pedantic correction.

The chess pieces are the same *pixel value*, not the same color, since color is synthesized by our brains in relation to everything else. While this seems silly to say, there are contexts where this matters a great deal.


On Scott Aaronson's point, obviously some people are very dumb and would be better off deferring to The Proper Authorities on everything. But most people should think for themselves because they know their personal circumstances better than The Proper Authorities do (consider Covid, when The Proper Authorities were shouting "YOU MIGHT DIE!" because septuagenarians might die, a teenager would have done well to consider the fact he was young and healthy).

Ideally, really stupid people would trust the authorities and everyone else would think for themselves. Unfortunately we can't just say "Trust the authorities if you're stupid, otherwise think for yourself", that will obviously not work. We can say "Think for yourself" and let the stupid people form their own stupid ideas, or we can say "Trust the experts" and have everyone's knowledge of their personal situation get bulldozed by experts who have never met them, but Scott's advice on conspiracies is ten paragraphs long, and when it comes to public messaging you get about five words.

Interestingly, because "trust the experts" is stupid advice, some people are going to figure out that it's stupid and disregard it. Stupid people are bad at figuring things out and therefore less likely to do this. You can't make all the stupid people trust experts while all the people with common sense use it, but you can achieve a differential!


Reading Zvi during covid has made me feel like a conspiracy theorist. I've gotten the you-might-be-crazy look that my uncle who believes Hillary is a reptilian gets, for claiming things like:

- "being outdoors massively reduces the risk of transmission" in the summer of 2020

- telling my mother that maskless indoor dining (pre vaccination) is dangerous regardless of whether it's allowed by the law at that time

- asking my family to try fluvoxamine for my unvaccinated 80+ year old grandmother who got covid in early 2021

In a large new field your priors should be lower. Looking into ivermectin was not an epistemic mistake. It came down to either a) ivermectin works, or b) peer-reviewed papers are wrong or even fraudulent (as they turned out to be). Saying that peer review isn't enough evidence to look into something because the wrong people believe it is special pleading.
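The commenter's point about priors can be made concrete with a quick Bayes'-rule sketch. The numbers below are purely illustrative assumptions, not the actual ivermectin evidence: they just show how a low prior on any one repurposed drug interacts with a literature that has a meaningful chance of being spurious.

```python
def posterior(prior, p_pos_given_true, p_pos_given_false):
    """Bayes' rule: P(hypothesis | positive studies observed)."""
    num = prior * p_pos_given_true
    return num / (num + (1 - prior) * p_pos_given_false)

# Illustrative numbers only: a 5% prior that any given repurposed
# drug works, an 80% chance of a positive literature if it does,
# and a 20% chance of a spurious positive literature if it doesn't
# (p-hacking, fraud, confounding by parasitic worms, etc.).
p = posterior(prior=0.05, p_pos_given_true=0.8, p_pos_given_false=0.2)
print(round(p, 3))  # 0.174
```

On these made-up assumptions, the positive studies roughly triple the probability (worth looking into) while still leaving it well below even odds (far from proven) - which is consistent with treating "look into it" and "believe it" as different thresholds.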


I think the basic issue here is that most people, regardless of their actual beliefs, are in fact conspiracy theorists. "Conspiracy theory" is, in a sense, the most natural form for a human being for a theory to take.

Indeed, the very idea of a conspiracy theory is, I think, something like a conspiracy theory; you are supposing that people are conspiring to mislead and delude you, that nobody can be trusted, except these shining individuals on the hill who will lead you out of the darkness and into the light. "Trust the experts", as you note, just leads to people demanding you listen to -their- experts.

(Also, I'm a crackpot; I'll readily admit to being a crackpot. My opinion on how physics works shouldn't harm scientific progress at all; at worst, if I pester physicists a few times, I make them tired of crackpots before some other crackpot does - and shit, maybe something will come of it. I didn't bother anybody for a decade, until I thought one of my crackpot ideas, while not exactly correct, might be a precursor towards building something correct; Julian Barbour's scalar relativity will, I expect, get there eventually, but I think I've found a shortcut. But granted, I'm more open about being a crackpot than most crackpots, and when I do bother physicists, I open with that fact, and if I am entirely off-base, I'm probably a useful training exercise.)


Hey Scott -- the Ivermectin meta-analyses you did were actually really helpful to me. I wasn't an Ivermectin believer per se, but I honestly didn't know what to think. I heard from one side "there were these studies that look really promising - this and this study said that there was a statistically significant effect", and from the other side I heard "These heretics are taking HORSE DEWORMER! Burn them at the stake!" So I was leaning lightly towards the side that wasn't screeching like lunatics, but also still unsure because this is outside of my area of expertise, and resolved to do whatever my doctor told me if I were to get COVID.

You convinced me that, most likely, Ivermectin was a bit of a red herring, outside of cases where I was likely infected by a parasite. And I honestly thank you for it. It put me in the 99.9% sure category. I've pointed your essays to other people who are like-minded, and most of them also found it convincing.

And I do actually like listening to Bret Weinstein's podcast; he's the right kind of contrarian that makes me think harder about a lot of things. But I'm also a devout Catholic, so I disagree with him on a large number of things. I am one of those weirdos that likes listening to contrary opinions to my own, as long as they're cogent, non-screeching, and internally reasonable.


There is one other point to consider, which is that over the last thirty years or so, the notion of conspiracy theories as problematic has lost a lot of its value as the term has been politicized.

Remember Hillary Clinton proclaiming that there was a "vast right-wing conspiracy" trying to smear her honorable husband with rumors that he was having an affair with a White House intern? Turned out to be absolutely true.

Remember all the rumors flying about the NSA abusing Patriot Act powers to spy on Americans, which was dismissed again and again as tinfoil-hattery by people in the know? And then Ed Snowden blew the lid off the whole thing and turned "conspiracy theory" into some very disturbing, very real conspiracy fact.

Remember how experts exhorted us over and over that COVID-19 couldn't possibly have originated in one of those Wuhan research labs where they study bat coronaviruses, and that only a conspiracy nut could possibly believe such a racist idea? And now it's generally acknowledged that, though we'll probably never know for certain, this is the hypothesis that best fits the facts, by far.

Remember the massive, unprecedented effort to suppress the story of Hunter Biden's laptop, with experts everywhere calling it a conspiracy theory and a disinformation campaign that probably originated in Russia, fact-checkers "proving" that it was false, and the organization that broke the story being kicked off of social media entirely? And now that it's no longer politically convenient to hide the truth, it's openly acknowledged that yes, the laptop is real and really did belong to Hunter Biden.

One important reason why so many people are drawn to things that are termed conspiracy theories nowadays is that they've seen so many things that got dismissed as conspiracy theories turn out to be openly acknowledged as true once the passage of time changes the incentives for people to talk or not talk about the subject. It's getting harder and harder as time goes by to tell the real garbage from the stuff that actual conspiracies truly are trying to suppress or discredit!


Dr Mobeen Syed of Beanheads fame has several YouTube videos that can help you understand the pharmacology of ivermectin. I began using Stromectol, which is brand-name ivermectin, way back in the late 80s. It's amazingly safe for children. Adults too.

Feb 16·edited Feb 16

I think Kavanagh's position, and others like it, is rooted in a crisis of confidence in Truth. I don't mean some divinely revealed Truth, but rather the idea that 1) there is an objective underlying reality, 2) human beings have the capacity to discover and describe that underlying reality fairly accurately, and 3) the most accurate description will eventually be widely accepted.

I suspect Kavanagh would agree with 1) and 2) but have doubts about 3). This is understandable -- we live in a time that *seems* to make it hard to believe in 3) (though I'm not sure the current era is really an outlier in this regard).

But if you don't believe in 3) I think there is a strong temptation to form an Oligarchy of Truth Tellers that will go through step 2) and dispense their "knowledge" to the masses. This doesn't have to be an intentional effort and it doesn't have to rely on formal institutions or hierarchies -- it can emerge organically.

But I simply do not believe that an Oligarchy of Truth Tellers would be any less likely to fall victim to untruths. In fact, I think they would be *more* likely, as the natural tendency is for such an oligarchy to become isolated, entrenched, subject to group-think, and protective of its special status at the expense of Truth.

I don't think there is a problem with having little cliques of truth tellers, as long as there are lots of them across domains, and there are strong norms against any one clique becoming too entrenched within a given domain. The best norm that I know of to ensure that remains the case is to never suppress any idea, and to answer all critiques -- even the ones leveled in bad faith. Or as Jonathan Rauch has put it, no one gets the final say, and no one has personal authority.

This means that sometimes you'll provide a "platform" for bad ideas, which is why you have to trust People. I don't mean you have to trust every individual, but you do have to trust that on average, and over the long run, most people want to, are capable of, and will believe the ideas that come closest to Truth.

Trust in People seems to be relatively low right now among those in traditional institutions of authority (The Revolt of the Public has some interesting arguments for why that might be, though I don't find all of them convincing). But I don't see any other alternative that doesn't ultimately lead to the Oligarchy of Truth Tellers and, from there, to stagnation (or even regression) in the pursuit of Truth.

I think the crisis of confidence in Truth leads people in those cliques/oligarchies (some do exist, sadly) to reactionarily lash out at those going against the clique/oligarchy consensus. "Conspiracy Theorist" happens to be the slur of the day. In a different place/time it might be heretic, anti-revolutionary, etc.

I simply am not that worried about conspiracy theorists. I'm actually more worried about the over-reaction. In that sense it's a lot like terrorism (note: I am *not* making a moral equivalence!). Terrorism is undeniably dangerous, but it doesn't, in-and-of-itself, present an existential risk to society at large. The existential risk comes from fear, and the reactionary bad policies that emerge from fear. "Don't publicly debate people with 'bad' ideas" sounds a heck of a lot to me like the kind of reactionary bad policy that, if carried far enough and for long enough, could start to pose a threat to liberalism and the pursuit of Truth.

It can be absolutely exhausting to answer all critiques, which is all the more reason to applaud Scott's efforts.

Feb 16·edited Feb 16

Can we go back to first principles?

Scott sort of gets to this about halfway through his response, but do we agree that (a) the idea that ivermectin might be effective against Covid is a crackpot theory along the lines of Atlantis and perpetual motion, such that even debating it gives it oxygen, or that (b) the effectiveness of ivermectin against Covid is still an open scientific question, even if the weight of evidence suggests that any benefit is likely to be small?

I was under the impression that Scott and Alexandros, at least, agreed that it was the latter, but the discourse appears to keep assuming that this is some crazy crackpot theory that Scott resoundingly and convincingly refuted. Did I miss something?


I remember a brief time, around early winter 2019, when the idea that there was some new virus called Covid and that it had the potential to become a pandemic was dismissed as a conspiracy theory, and an anti-Asian racist one to boot. Well, that escalated quickly.

Trusting experts is probably the right thing 99% of the time, but the other 1% of the time you end up stocking up on masks and food and sanitising gel before everyone else.


There is also an *opposite* of conspiracy theory. Don't know if it has a name. Let me describe it:

I have a friend who seems to believe that everything any official institution does is optimal. Like, if you show him an article about how some bureaucrats made a mistake, or some institution abused the people it was supposed to care about, or how some law creates perverse incentives, etc., he would automatically dismiss it as a conspiracy theory.

In his world (it seems to me) every institution works perfectly. There is no such thing as principal-agent problem. (Or maybe there is, in private companies, but definitely never in state-organized institutions.) There is no such thing as government employees optimizing for their personal benefit, rather than for the supposed goals of the organization. There is no such thing as government making a stupid rule.

In his world, only bad people, who all work outside government institutions, do bad things. Which is why we should have more government and regulation, obviously. Bad people also share lies about the government making mistakes, but only gullible people (such as me) listen to them, instead of believing the experts (in this context this refers to employees of the relevant institution).

I am probably strawmanning his position a little, but this is how it seems to me. It is extremely frustrating. I am unable to discuss things happening in real world with him, because he dismisses all news of something being wrong, without even hearing the evidence. Just the claim that something is wrong, is obviously a conspiracy theory, no need to waste time debating the details.

Do you also know people like this? Does this officially have a name?


From a quick skim of comments I don't seen anyone taking this position, so let me do it. To make it clear, Mr. Kavanagh is criticizing you for a good faith deep dive into the facts of a controversial theory. For me, that's a red flag right there.

To put it in rationalist terms: there is this classic fallacy (forgot the formal name) where one applies higher standards to others than to oneself. This seems at play here. You are being criticized for engaging too subtly and closely with the facts of a theory, and thereby possibly causing negative second-order effects. But the criticism itself fails the same rule! It's also making an exceedingly subtle ethical argument about your article, without giving any consideration to the possibly negative second-order effects of making such a judgment. Thus the argument disqualifies itself.

But let me take the occasion to make a bigger point. There is a major intellectual trend these days to over-ethicize everything, which is another way of saying that culture is growing judgmental. I understand there are multiple moral reckonings going on, from the environment to inequality to respect for different cultural and ethnic groups, and I can see how that brings a heightened awareness to ethical issues into everyday life. But a culture of ethical maximizers is just as narrow minded and spiritless as a culture of anything-else-maximizers, so at some point one has to put a limit to that.

There is such a thing as a basic personal sphere of freedom, where one gets to follow their interests and whims, and express oneself as gracefully as one can muster. It's a basic thing of any kind of functioning society - I believe at some point you called it "slack". My take is that a good-faith open-ended deep dive into *any kind of facts* falls well within it, and anyone saying the contrary is making an extreme claim of the kind that requires extreme evidence.

I've been reading your blog for many years already, and what has kept me here so long is a kind of love for truth wherever it may take you. And that sometimes means truth over convenience, truth over niceness, truth over established "facts", truth over committed contrarianism, and yes, truth over (often pessimistically imagined) ethical consequences.

If the EA movement has taught us one lesson (besides doing plenty of good work in the world, yes), it's that it only takes a brainy human being with some time on their hands and the argumentative abilities of a modern chatbot to conjure up moral hazards with possible millions of victims, as long as they are far enough in the future, and they can put the right number of "ifs" in front. It's like obsessing about sin is back in fashion, except that it's not controlled by the Christians anymore. But you know what? Fuck that. We'd be better off killed by a cartoon paperclip maximizer than having our spirit sapped by an army of maximizers of someone else's presumed utility.


I think your advice to the internet-connected kid also needs to include the fact that sometimes high-prestige people and institutions are spouting nonsense. Sometimes the high-prestige institutions are telling you that there is a rash of satanic ritual sexual abuse in daycares, or that the low rate of black kids in magnet schools is the result of racial discrimination in schools.

Your goal is to avoid walking off a cliff. One way you can walk off a cliff is to wander into some bubble and convince yourself to build an evidence-proof shell around some false belief. Sometimes, the bubble is found in a weird cul-de-sac of internet weirdos. Sometimes, it's found in the newsroom of the NYT or the psychology department at Harvard. But I think the critical thing is to try to avoid building that evidence-proof shell. Conspiracies are sometimes true (though usually they're more sordid and sad than the stuff of movie plots), but the conspiracy-theory mode of thinking is broken.


"At what point in this process - which second of which day - did it switch from plausible-but-false scientific theory to conspiracy theory? "

When Bad People started believing it. Although there is a countertheory that it happened when Good People stopped believing in it.

"figure out whether you really need to have an opinion."


Life became vastly better when I realized that on the overwhelming number of questions in the universe I had neither the ability, the knowledge, nor the moral authority to take action, and since opinions without actions exist only in my skull, there wasn't actually much point in having most opinions.

Related, the problem with "trusting the experts" is that one almost never interacts with an expert, nor gets their opinion. Fauci (PBUH) is not actually a scientist, nor is he a doctor (his license to practice expired when? A half century ago?). He's a bureaucrat. You do not get to be the head of a government agency by being good at Science(tm); you get there by being good at office politics. To the extent that there are actual experts involved, by the time their decision gets to the lips of a newsreader, it's gone through multiple levels of edits and adjustments by people who are much more concerned with the policy implications of public health than with factual accuracy.


Never mind about believing or not believing conspiracy theories, what I have a problem with believing is Scott's premise in the previous post, which is that anti-conspiracy writers mostly just say "conspiracy theories are stupid" and "trust the experts" and don't provide any evidence for their beliefs.

I haven't read much on ivermectin - in fact I've never heard of it outside of this blog - or on Scott's initial conspiracy theory example, Atlantis. But the conspiracy theories I have read about - Shakespeare authorship, JFK assassination, moon landing, 9/11 - have loads of counter-argument material showing why the conspiracy theory is nonsense. (For Shakespeare, the best source is a superbly written and argued book called "Contested Will" by James Shapiro, which also offers a theory as to why proposals for alternative authors arose in the first place.)

True enough that any given outbreak of the conspiracy may be met the way that Scott Aaronson describes, by sighing or ignoring it. But that's because the counter-arguments are already out there. Just refer to those.


One useful lesson for everyone is how to evaluate experts and areas of expertise. It's easy to look at experts as a big group and see them as all the same (and then when you see some experts make predictions that turn out entirely wrong, maybe you assume that all experts are full of shit).

a. Not all areas of expertise are equal

Some areas of expertise have a lot of contact with reality in ways that give them opportunities to find out when they're wrong, and institutions and incentives that support learning when they're wrong. Chemists know quite a bit about how to synthesize certain compounds, both because they know a lot about how chemical reactions work and because they do it and make sure it works. Structural engineers know quite a bit about how to build bridges that don't fall down, both from having an underlying model of strength of materials and how to calculate forces in a structure, and also from having built a lot of bridges and carefully examined the cases where one fell down.

By contrast, other areas of expertise have little contact with reality and few opportunities to find out they're wrong, and/or institutions and incentives that do not support learning when they're wrong. For example, macroeconomists are very smart people who mostly don't have great ways to test their theories most of the time. Philosophers are even smarter people who have approximately no ways to test their theories.

I think of this as a kind of a spectrum, where on one side you have areas of science where the consensus is mostly driven by the outcomes of experiments or observations, and on the other side, you have areas of science where the consensus is mostly driven by who makes the most convincing arguments.

Along with that, some areas of science and practice are just messier than others. Medicine is notoriously messy, which is why determining whether ivermectin or paxlovid is a better choice for treating covid requires careful experimental design and statistics, rather than just asking a group of doctors what seems to work best. Social psychology and economics are even messier, and it's genuinely hard to untangle what's going on a lot of the time even with the best efforts of the scientists involved.

This distinction matters a lot when you're trying to understand how much weight to give to expert consensus. How much weight should we give to the expert consensus of physicists on the feasibility of a new power generation technology? How much weight should we give to the expert consensus among sociologists on the feasibility of a new proposed welfare reform scheme? How about the expert consensus among black studies professors on the cause of the black/white difference in SAT scores?

I'm pretty sure that the average academic philosopher is smarter than the average structural engineer. But I have *way* more confidence in the consensus position among structural engineers than the consensus position among moral philosophers.

b. Expertise is generally pretty narrow.

Experts are expert in their area. You don't have to go too far outside their area for their knowledge to become not all that helpful. Virologists are experts on virology, and tend to know a lot about adjacent fields like molecular biology, genetics and immunology, but probably won't know so much about (say) the practicalities of closing borders to stop viral spread.

Expert speculation outside of what's known is useful--often it is the best guidance we can get--but it's important to recognize that it's still just speculation. All kinds of stuff sounds plausible at first, but turns out wrong when you investigate it in depth.

Drug development offers constant examples of this--really smart people working for companies that are betting tens of millions of dollars on being right will pretty routinely find some compound that seems like it should help some disease, find an animal model of the disease and show that it treats the animals successfully, find some markers for the disease in humans and show that it improves the markers in humans, and then eventually run a full trial and discover that the treatment does no good at all on the actual disease.

c. Science is good at determining factual questions (what virus causes covid, how does it spread, etc.), but doesn't give any particular help in determining questions of values or morality.

Asking an epidemiologist how to slow the spread of a new respiratory virus is sensible, but this doesn't tell you what policies you should enact, since that question rests on moral or political questions as much as practical ones. The infamous public letter about how the BLM protests during covid were okay because racism is a public health crisis too is a good example of subject matter experts crossing from speaking about their expertise to expressing their values in an area in which they are probably not really more expert than anyone else. "What measures might decrease the spread of a respiratory virus" is a good question to ask an epidemiologist. "What public protests ought to be allowed, and what protests ought to be suppressed" is not a question on which an epidemiologist has any special expertise.

d. Some fields are heavily politicized, and activists will often try to capture the mechanisms of determining consensus.

There are in-public examples of academics in some fields having their careers wrecked by working on politically-unacceptable questions, or coming to unacceptable conclusions. That's a pretty strong signal that those fields' mechanisms for determining truth by consensus are likely to be broken.

There are also public pronouncements by some scientific bodies and journals on political or philosophical matters, sometimes including policies that certain findings or research areas should not be published or funded. These make it very hard to have confidence that what comes out from research in those areas is the best available picture of reality, rather than the best available picture of what won't get you fired for saying.

e. What experts are heard in public is largely a function of which experts will be interviewed by media outlets, or given a platform by particular organizations.

For example, in the runup to the Iraq war in 2002-2003, it was pretty hard to find any kind of expert on the Middle East or WMDs or anything else who was opposing the war. Such experts existed, but they didn't get a lot of airtime in prestigious media outlets. This is an old problem that journalists have talked about for a long time--when a friend of mine was getting a journalism degree many years ago, I remember him talking about the "golden rolodex" that determined which experts got called, and how this could distort coverage. Ideological commitments by a lot of prestige media outlets make this worse.

All these make it a bad general strategy to just follow the experts and trust them. And this is a problem, because expertise is real, and your occasional Googling and reading stuff online and listening to podcasts is actually not a substitute for doing a PhD in virology or spending years working as an environmental regulator or whatever.

There's an art to engaging with experts and reasoning sensibly about what they tell you while keeping in mind both their limitations and your own. It's a very rare talent, but one worth cultivating. Less because you might need it to determine some internet controversy than because if you work in technology or industry or science, you'll need to spend a lot of time doing exactly this. The usability expert on your team knows a lot more than you do about human factors and how to set up usability testing and what the results mean, the software developers know more than you do about what they can reasonably build in a given amount of time, the security guy knows way more than you about what attacks are known that might apply to your product, and yet you, as the manager of the team, have to integrate all that expertise into making a decision, without either ceding the floor to whichever expert is the loudest or ignoring an expert in their area of expertise.

Expand full comment

OT. Assuming that this is real, the conversation with Bing's AI is astonishing, and sad.


Did *anyone* predict the emergent properties of these Large Language Models?

Expand full comment

Scott writes:

"when people do formal anonymous surveys of IQ scientists, they find that most of them believe different races have different IQs and that a substantial portion of the difference is genetic."

I followed the link, and it doesn't say that. It says:

"97% of expert psychologists and 85% of applied psychologists agree that IQ tests measure cognitive ability “reasonably well”. And 77% of expert psychologists and 63% of applied psychologists agree IQ tests are culture-fair"

That's a pretty big difference!

Expand full comment

My personal opinion is that when people start saying (screaming?) Trust The Science in an emotionally entangled way, I need to check it out more closely.

Additionally I think that the question is not whether the Trust The Science crowd is "wrong", but whether they actually have sufficient data to know their answer is correct. The more typical response is not a conspiracy theory but a conclusion that the answer is still unknown at this time.

Further I would say there has been a large increase lately in people placing themselves on the mantle of science and making dubious proclamations with little pushback from whatever one would call science authority. Sorting this out personally takes time and effort.

Expand full comment

There is a third way. Believing a conspiracy theory vs not falling for it -- that's a false dichotomy.

It is safer and often more useful to not know nor care about a theory:

- Did Covid originate in a lab or an animal? Don't know, but knowing wouldn't affect my life at all.

- Does Atlantis exist? Maybe, maybe not, anyway I won't ever move to live there :-)

- Does Ivermectin work? Hopefully I'll never need it.

Granted, I had fun reading this post, and I am thankful for people like Scott who stay awake at night because "someone is wrong on the Internet". But I can't shake off the feeling that my life would be better if I had instead spent the time with someone or something that really matters for my life.

Expand full comment

I can't empathize with the Kavanagh position. People like him base their arguments on a belief that conspiracy theories are regularly causing massive harm and the theories of experts and institutions rarely do. Sure, in general, the prestige media and respected institutions are more accurate than conspiracy theorists, but I am hard pressed to think of any major conspiracy theory that has caused any significant amount of harm, while there are significant expert-consensus-led programs that *have* caused massive harm and the conspiracy theorists were right on from day 1 (like the War on Terror).

Yes, conspiracy theorists are natural contrarians and so they create a counter narrative for nearly every expert-held position. But you need both a mistaken belief AND power to cause harm in the world. Conspiracy theories sometimes cause small scale harm. A man in California recently killed his family over the belief that they had been replaced by reptilians. But expert theories cause massive, world-wide harm. The War on Terror caused millions of pointless, preventable, human-caused deaths (killings, murders, whatever the best term might be) including torture, immolations, live crucifixions, etc.

On top of all this, keep in mind:

A. Snide dismissal of reptilian shapeshifter theories by Slate, Vox, etc. will convert zero people away from a belief in reptilians.

B. Refutations of reptilian theories that take them seriously (maybe critical analysis of videos that supposedly show the Clintons blinking in a weird way) will convert some small number of people away from belief in these theories.

If you ruthlessly criticize every mainstream, expert-led proclamation, and the prestige media takes these things seriously, and even arrives at the "correct" position, like Scott did with Ivermectin, you might prevent another War on Terror. I can't even suggest a downside here, because if ACX actually ran an article "Contra Alex Jones on Huma Abedin blinking sideways" it wouldn't make more psychopaths snap and go on a killing spree, instead it would probably do a small part towards preventing them.

The danger posed by someone accidentally giving credence to a false expert-led theory is much greater than the danger of doing the same to a false conspiracy theory, this has been borne out repeatedly.

Expand full comment

Issues with ivermectin, and with Scott, and a substantial issue with- I won't say conspiracy theories but alternate disfavored theories- in general are

1. You only find evidence for those things that you look for evidence for. Only theories that you initially determine to be worth your time and energy will EVER have evidence in your mind. Any hypothesis that you dismiss out of hand is not dismissed because of lack of evidence. The dismissal is the cause of the lack of evidence.

2. The ability to detect fraud/bias and the ability to synthesize evidence and weigh it are not connected skills at all. Lots of people can tell when they are being lied to. Example: An MD friend who has been practicing for 35 years and hears Tony Fauci describing chloroquine as dangerous and untested knows that this is not a medical opinion. It is an opinion offered for some other reason, Dr Fauci's qualifications notwithstanding. He will not forget that he was lied to and that politics was passed off as medicine no matter what studies say about chloroquine and covid. In this category is refusing plausible, medically safe treatments in summer '20 because of a lack of RCT. There were no randomized controlled trials of anything in covid patients in May 2020, but patients needed treatment. The protocol was not based on RCT so lack of RCT was a false reason to reject treatments. Knowing medicine and knowing when you are being lied to and being able to prove it are not the same thing. Conflating them is a mistake. Epistemology is downstream of BS detection.

3. Pretending that studies are conducted in a vacuum without consideration of what benefits the career of those conducting the study or the sponsor is ridiculously naive. Studies with fraudulent results or reports that don't match the data are found repeatedly, often years or decades later, and the error was very frequently in a direction suggesting nefarious intent.

Expand full comment

The research record on school vouchers is terrible

Expand full comment

While not being a fan of any fringe theories, I think that historically, "trust the experts" has not led to especially good outcomes.

All expertise is some mixture of domain knowledge and a high status echo chamber. It is very difficult for an outsider (or even an insider) to judge the mixture for any particular field of expertise. Take modern physics, which gives us semiconductors and thus ultimately smartphones and the like. Seems like excellent evidence that the field is not a pure echo chamber (and indeed I am of the opinion that STEM is mostly domain knowledge and not echo chamber).

Now consider a peasant in ancient Egypt. While the miracle of life-sustaining iPhones caused by quantum mechanics might seem impressive to us, the miracle of the flooding of the Nile will be much more central to sustaining life in Egypt. I assume that amongst ancient Egyptians, the consensus that the Pharaoh (with the help of some other God, perhaps) causes the flooding of the Nile is much stronger than our consensus that QM causes iPhones.

Yet the Pharaoh and his priesthood were -- not to put too fine a point on it -- full of shit, basically 100% echo chamber. Forget causation, I would trust the observations and gut feelings (wrong as they may be) of a Nile farmer over the priesthood's predictions regarding Nile flooding any day of the week.

So we would advise the peasant to perhaps pay lip service to the priesthood but ignore them as a source of cosmology.

Same thing for the medieval serf. "Your priest has no more knowledge about the existence of God than any fool, and if he tells you that it is thus just that you are a serf and not a free man, he is wrong. Think for yourself, you can hardly do worse than the experts."

Same thing for a peasant during the Great Leap Forward. "Mao knows less about agriculture than any farmer, disbelieve anything he says."

Suppose that you know you will teleport to some random time period. What epistemological rules could you follow to decide if a group of experts is right?

* "Anyone providing exponential growth is probably for real". (Excepting hyperinflation.) A good heuristic, but too narrow. You might conclude that the design of the light bulb (supposedly brought forth by a mythological figure called Edison) is mostly ceremonial as it was not substantially improved in many a decade.

* "Anything you are not allowed to question is probably bullshit" is another heuristic. But you might conclude that as comparative theology is much more welcoming to heterodox opinions than particle physics, the former is much more sound than the latter.

* "A field of endeavor where anyone may practice the craft is probably more solid than one where the law restricts who may practice it" is another worthwhile heuristic. Exceptions include modern medicine (restricted) versus faith healers and the like (mostly unrestricted).

From what I remember, when Kant wrote "Was ist Aufklaerung" at the beginning of the enlightenment period, telling people to think for themselves, he was not thinking very much about possible failure modes.

By contrast, EY -- whose sequences I am (re-)reading at the moment -- seems very concerned with possible failure modes of people thinking for themselves.

Personally, I would rather live in a society where people try to think for themselves even if a substantial fraction of them ends up believing in QAnon, blood libel, flat earth, young earth creationism, cold fusion, covid denial, climate change denial and so on than in one where almost everyone just trusts the experts. Still sucks for them and any people they kill.

Expand full comment

> Probably something like “make a principled precommitment never to disagree with prestigious institutions until you are at least 30 and have a graduate degree in at least one subject” would be good advice

Buahahaha, no way this would have been good advice for my past self. Here are some disagreements I've historically had with institutions around me:

- Frequentist statistics is worse than Bayesian probability theory.

- Various disagreements with religious institutions

- Prestigious institutions in small countries (Spain, Austria) being kind of mediocre

- University modes of teaching are a really terrible fit for me personally. For me reading textbooks is more efficient than attending talks

- The aims of some academic institutions make little sense, and pursuing an academic career is not a good move.

- Keynesian Beauty Contests have predictable flaws, and are less preferable than methods which are ultimately scored against reality, even if they have intermediary steps

Idk, some of these are a matter of taste, but overall I think that precommitment would have been a terrible idea for me.

Expand full comment
Feb 17·edited Feb 17

Eh, this feels wrong for the exact reasons the earlier post didn't.

I mean, I assume we don't really disagree about what the correct heuristics are, I just think the formulation in section IV is actually pretty bad advice, for a few reasons, mostly boiling down to omission of important concepts, and filling the resulting holes with exactly the kinds of superficial appeals to authority that the previous post correctly railed against.

To name two of those concepts (while not claiming the list is exhaustive):

1. Agnosticism. That's one thing the advice does right, though it's probably not going far enough. You don't need to have an opinion, period. I realize that it may be unintuitive for most, maybe even goes contrary to basic human instinct, but once you grasp the concept, it really solves most of your epistemic problems. We know nothing, all we do is just heuristics, something something bayesian reasoning. This doesn't mean treating all claims equally, like the common caricature would paint it (a caricature traditionally pushed by people insisting you really, really need to have an opinion about god in particular). It means separating knowledge from common sense judgement, and defaulting to the latter in lieu of the former. (And of course the knowledge that's actually available to us is not fundamental truths about reality, but a careful experimental measure and examination of it. I guess understanding how limited and difficult acquiring even this sort of knowledge is is what the "30 and graduate degree" ballpark is gesturing towards, but agnosticism lets you get there without using meaningless, alienating, exclusionary markers like age and credentials.) Defaulting, instead of committing. Assume the scientific consensus is built on the hard work of people who did said hard work while you didn't, but be ready to ditch it without regret whenever reality starts contradicting it. (Also, within the wide concept of science itself, put more trust in things with practical applications you can witness everyday over purely academic theorizing.)

2. Motivated reasoning. The naive "intellect" position may treat conspiracy theories the same way as scientific theories. Fine. Let me propose a slightly more sophisticated version that separates the reasoning about factual claims from reasoning about the process of establishing their validity. Whether ivermectin treats COVID is a valid factual question. (As is the existence of Atlantis, etc.) We're only getting into conspiracy theory territory once we start arguing whether scientific institutions are unfairly rejecting it. But of course whether scientific institutions are unfairly rejecting things is also a valid factual question, right? Yes, in isolation. Hence, motivated reasoning is what one should be on the lookout for. "Scientific institutions are unfairly rejecting things, scientific institutions reject ivermectin, hence, you should update in favor of ivermectin" is the moment to NOPE out of the conversation, and this alone should guard you from conspiracy theorizing without having to throw the correct intellectual position towards the purely factual questions out with the bathwater. {Edit: I realized belatedly that the preceding example is less convincing than it could be, so this needs an addendum that the reasoning applies equally well to "Scientific institutions use flawed heuristics, here's my argument for why small studies are more trustworthy than big studies, therefore you should update in favor of ivermectin". Still NOPE, and still can be rejected without committing on factual claims - both whether ivermectin works against COVID and whether scientific heuristics are flawed.} "Doctors don't want you to know about [miracle treatment]" IS a separate theory from "[treatment] has [benefits]", and nothing should prevent you from being open-minded (though skeptical, because agnosticism) about the latter while forcibly rejecting the former. (As a bonus, you can, and, empirically, probably should, reject all arguments from authority of science. This is not the same as rejecting science: if you don't know what science says, you still don't know after hearing someone make overly confident claims about it. If it matters, learn what it actually says. If it doesn't, well, it really is safe to just ignore what you've heard altogether.)

Expand full comment

Talking about "conspiracy theories" makes it sound like this is just relevant to Pizzagate and Illuminati, but your arguments sound quite relevant for normal questions like "should I look into fasting/sauna/weightlifting/Peter Attia even though it's not strongly recommended by my country's health agency?"

Expand full comment
Feb 17·edited Feb 17

>At what point in this process - which second of which day - did it switch from plausible-but-false scientific theory to conspiracy theory?

Part of the issue is that it was probably both at the same time, in a sense.

I.e., at the same time, there were both doctors who had read the relevant studies and thought it was mildly likely to have a moderate beneficial impact in reducing time to cleared infection and overall mortality (or w/e), and political partisans online convinced it's a complete cure that means no one should ever need to take a vaccine and the government is covering it up so they can implant microchips in mandatory vaccines and stretch out the lockdown measures they're using to control the world.

Or whatever.

That's part of the issue here - 'Are these rock formations natural or man-made' is a pretty binary question, but many conspiracy theories are clusters of beliefs and paradigms with many different possible versions.

'Belief that ivermectin works for Covid' is an underdetermined description; there are versions of that belief that were reasonable at the time, and versions that were stupid or insane. And part of the fear (across all aspects of the culture war, really) is that anything supportive you say towards the reasonable version, will be taken as support and confirmation of the insane version.

(Similar for the 'human biodiversity' point - 'At a population average level, Ashkenazi Jews have a small heritable advantage on IQ tests versus the global average' is a very different belief from 'all black people are genetically inferior in terms of intelligence and civility to all white people worldwide, and this fact entirely explains all wealth disparities and crime statistics and is why your social welfare and diversity programs are pointless and destructive and why you should just not let them into your country to begin with'. These are, again, very different beliefs, but there is an entire cottage industry devoted to taking support for one of those ideas and applying it to the political project of the other one.)

Expand full comment

Re: the chess piece illusion: this is always a funny example to me because the chess pieces actually are different, whenever people use it online. Not out of malice, but out of the way image compression algorithms and contrast calibration on monitors and so forth make subtle changes to images.

There *is* a real version of the illusion that you can see on paper, and it does still work, but the effect is much weaker than what you see here.

By the time the real paper version has been scanned into a file, resized and changed image formats a few times, compressed, optimized for web, and displayed on your backlit monitor, the original symmetry between the pieces has been lost and they're actually just different.
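The commenter's claim is easy to check yourself: crop both pieces out of the image and compare them pixel-for-pixel. The sketch below (assuming numpy; the coordinates and the block-wise "processing" step are hypothetical stand-ins for real compression and contrast calibration) shows how a position-dependent processing step can break an exact symmetry between two identical patches:

```python
import numpy as np

def crops_identical(img, box_a, box_b):
    """Compare two same-sized crops (y0, y1, x0, x1) pixel-for-pixel."""
    a = img[box_a[0]:box_a[1], box_a[2]:box_a[3]]
    b = img[box_b[0]:box_b[1], box_b[2]:box_b[3]]
    return np.array_equal(a, b)

def block_normalize(img, bs=8):
    """Crude stand-in for lossy processing: subtract each block's local mean.
    This treats the two patches differently because their surroundings differ."""
    out = img.copy()
    for y in range(0, out.shape[0], bs):
        for x in range(0, out.shape[1], bs):
            blk = out[y:y+bs, x:x+bs]
            blk -= blk.mean()
    return out

# Two genuinely identical 2x2 "pieces" placed on a horizontal gradient.
img = np.tile(np.arange(16, dtype=np.float64), (8, 1))
piece = np.array([[10.0, 20.0], [20.0, 10.0]])
img[1:3, 1:3] = piece
img[5:7, 9:11] = piece

print(crops_identical(img, (1, 3, 1, 3), (5, 7, 9, 11)))                   # True: the source is symmetric
print(crops_identical(block_normalize(img), (1, 3, 1, 3), (5, 7, 9, 11)))  # False: processing broke the symmetry
```

On a real screenshot you would load the file with an image library, crop the two bounding boxes, and diff them; any nonzero difference confirms the pieces are no longer identical by the time they reach your monitor.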

Expand full comment

"Conspiracy theory" is the wrong label. The correct category is "belief contrary to the current scientific orthodoxy." It doesn't have to depend on a conspiracy — the orthodoxy might be wrong for a non-conspiratorial reason, as I think was true of both Ptolemaic astronomy and the orthodox view of ulcers that Barry Marshal refuted. And some conspiracy theories are part of the orthodoxy — there was an international communist conspiracy, arguably at least two of them, Stalinist and Trotsyite — and the French resistance against the Nazis was a conspiracy.

Expand full comment

Re: Scott Aaronson

> As a matter of survival, I *have* to adopt a Kavanagh-like heuristic: “this person seems like an idiot.”

That's perfectly reasonable to do *sometimes*, because you don't have to debunk every conspiracy theorist *yourself*; we should instead cultivate a community whose collective goal is to perform factual analyses on these questions. This seems like the obvious takeaway from Scott's last post on this.

He lamented that he could not find *anyone* discussing what looks like relevant evidence *except* conspiracy theorists. It seems clear that Scott's last post was railing against the collective dismissal of even having to discuss what looks like relevant evidence, not against so-and-so specifically not addressing that evidence.

Expand full comment

Scott quotes a commenter named Alexander:

> When you take conspiracy theorists arguments seriously, it implies a higher prior on conspiracy theories than when you dismiss them out of hand. This can lead to your readers (consciously or not) increasing their priors on conspiracy theories and being more likely to believe future conspiracy theories they come across.

I once read about a proposal to change how high school science classes are taught in the US. The most common way of teaching high school science is to lead with experimental results (i.e. accepted facts) and have students do a few experiments along the way.

The proposal was to reverse this, by leading with students doing experiments, and only consolidating students' knowledge about the results afterward. I think it was meant to encourage students to understand 'science' as methodology, not accumulated experimental results.

Would this alternative way of teaching lead to students decreasing their priors on experimental results being true?
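The worry in the quoted comment can be phrased as a small Bayesian update. As an illustration only (the numbers and likelihoods below are made up), here is what "a respected writer engages seriously with a claim" does to a reader's credence if they treat it as weak evidence for the claim:

```python
def update(prior, p_e_given_true, p_e_given_false):
    """Posterior P(claim | evidence) by Bayes' rule."""
    numerator = prior * p_e_given_true
    return numerator / (numerator + (1 - prior) * p_e_given_false)

# Hypothetical numbers: a reader starts at 1% credence, and treats serious
# engagement as three times likelier if the claim is true than if it is false.
posterior = update(0.01, p_e_given_true=0.9, p_e_given_false=0.3)
print(round(posterior, 3))  # 0.029 -- roughly a threefold increase
```

Whether experiments-first teaching works the same way depends on whether students read the engagement as evidence about the results or about the method; the arithmetic itself is neutral between the two.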

Expand full comment

"how an extremely pointless question - whether Abu Bakr or Ali should have been political leader of the Arabian empire in 632 AD - produced the Sunni/Shia split . . ." I've loved Scott's work and his online persona for a good while now. But this is . . . it's not good. Why not apply some of the humility and humanity that are among the glories of his work to this question--or just leave it alone? The fact that such huge consequences grew from such an apparently small thing--and surely there is room for the possibility that this perception is an illusion of ours, not that of the people there at the time--should inspire genuine and patient curiosity, not scorn. Granted, no one has time for everything, but then perhaps a respectful avoidance is best--unless you want to proclaim the equivalent for this issue of "only fools and racists believe in Atlantis"--"only fools and tedious academics think that the Sunni/Shia split is anything other than a pure waste of time." The rationalist project can impart real enlightenment, but belittling huge ranges of the human experience isn't part of that.

Expand full comment

It’s almost futile to try to clip a quote from this because practically every sentence is quotable gold. Or possibly (outside view) it feels so good to read because it confirms my existing beliefs. Anyway I’m gonna leave that knot unresolved, with a strong prior most of it is probably true.

Expand full comment

> Is it possible that most of the standard arguments against the idea are dumb and flawed, but the idea really is false?

I liked this one a lot. It's so easy (for me) to fall for something when you can refute the standard arguments and feel like a smart and superior contrarian.

Expand full comment

One thing I’ve learned from years of following the skeptic movement (rationalism’s grumpy uncle) is that people make stuff up ALL THE TIME then double down when challenged, then double down again ad infinitum. After a couple decades of reading and witnessing this outrageous claim cycle, the skeptic develops a sort of reflexive, well… skepticism that serves well. It’s not fun to be the party-pooper whenever a new story comes up. Ivermectin doesn’t work because most everything like that doesn’t work. A huge % of new cancer drugs don’t pan out, but it’s painful to rain on hopeful people’s parades. I think rationalists can be a bit wishful (I didn’t say naive) in thinking they can reason with confidence men who just make stuff up. But if Randi taught us anything, it’s that charlatans are real and there’s no depth to which they will not stoop. Uh-oh, that brings me full circle since Randi, like Scott, took the time to carefully and publicly debunk the woo peddlers - I didn’t know that’s where this comment was headed. Amazing.

Expand full comment

Another thing. You said, “ when people do formal anonymous surveys of IQ scientists, they find that most of them believe different races have different IQs and that a substantial portion of the difference is genetic.”. This may have already been addressed in the comments, but when I followed the link to your 2017 post, you talked about the expert consensus around the utility of IQ testing, NOT the heritability of IQ. Did you choose the wrong link?

Expand full comment

The Fideist position argues that Scott is doing more harm than good by giving Ivermectin the time of day. Putting the truth of that assertion aside, the point of this blog is evidently NOT to do the greatest possible good with each post -- and it never has been. If that were the case, based on Scott's own values, every single post should be about existential risk from AI. He even wrote an enormous blogpost giving reactionary philosophy a fair shake. I suspect the people who oppose Ivermectin usage would mostly agree that reactionary philosophy is a much greater danger than the drug (which merely doesn't seem to help in first-world cases).

Of course, you could argue that this blog would probably have less reach if it were a never-ending AI ex-risk tirade. That would put us in the realm of multi-period optimization, where greedy algorithms like "do the most good with each post" are insufficient. You could then pursue a multi-pronged strategy, where you write some posts about controversial topics to drive readership and sprinkle in things you really care about in between -- like AI ex-risk. Of course, your treatment of the controversial topics would need to be very even-handed. You need to build up good will so that people will trust you about the things that matter. You can't simply delegate to the experts, because on the thing you care about most, you may disagree with them -- so you need a nice track record of disagreeing productively with the academy.

Once you see the blog topic as a global optimization, the decision to write about Ivermectin is pretty easy. It's a controversial topic, so it will drive a ton of readership. There's a lot of content on both sides of the issue, so you can flex your analytical muscles in a way you can't with other controversial topics like "the social safety net". It's also timely, making the article extremely urgent to write. The analysis of other controversial issues can wait, because they're not going anywhere. You can use the increased readership and reputation in a number of even more positive ways down the line. The positive effects from using the readership to tackle AI ex-risk will overcome whatever small credence you have lent to Ivermectin.

Also consider the type of reader that you are alienating with this strategy -- the fideists. They are basically a worthless readership. Almost all of legacy media is fideist, so this is going to be a high-churn audience that you have to compete aggressively for. In addition, they're not very useful for whatever other goals you might have, because you can't get them to do anything unless the experts agree.

This might sound a bit cynical, but it need not be. The instrumental usefulness of writing about Ivermectin doesn't diminish any of the other good reasons Scott enumerated for writing about it. The overlap between people who read ACX and people who at least _consider_ Ivermectin is significant and VERY poorly served by existing ways of making decisions. These are people who, for whatever reason, will not simply trust the expert consensus -- and I think Scott's analysis helps them. I wonder if the detractors actually think that Scott's series caused any Ivermectin-related injury. It sounds like they are simply annoyed by the spam, which is a perennial point of tension between media and their audiences.

Zvi has, in my opinion, the absolutely perfect take on this whole thing, Scott: In short, that (while you may have ultimately been too harsh on *Kavanagh specifically*) your original take is just about spot-on, and your revision—basically "trust the authorities unless you have really good reason not to"—goes way too far in the other direction. https://open.substack.com/pub/thezvi/p/on-investigating-conspiracy-theories?r=4bfc3&utm_medium=ios&utm_campaign=post

For a first approximation: check Scott. Check Zvi. Check their blogrolls - though some link to risky stuff. If all else fails: check Matthew Yglesias and Wikipedia. NYT: for entertainment. Substackers have skin in the game: their reputation is on the line.

"Which second of the day"? Not sure what point (humorous or otherwise) you intend to make here. Lakatos view of demarcation would be that with previous positive studies being discredited, there was a point (defined by a particular new and better study in a series of studies) when the theory became degenerating. The protective belt of study methodological issues can no longer protect the core theory that the drug works. I doubt that "second" is all that murky.
