192 Comments

Looking at this from the outside, with its recursions and metaphors-about-metaphors, its digressions and aphorisms, its semantic arguments, it's hard not to get the feeling of a psychiatric patient (a mad, multiple-personality patient, with too many pseudonyms) desperately trying to avoid acknowledging something obvious.

(Criticism of criticism of criticism... could easily appear in that cult-classic book _Knots_ by R.D. Laing.)

Scientific debates don't look like this. Nor do political debates, or philosophical ones. The patterns look like some strange combination of the three. And it's the endless aporias that strike me the most. Nobody seems to feel satisfied at the end. The fact that people can't help but do it over and over, despite the frustration, suggests a form of repetition compulsion. Scott's earlier analogy of criticism to a kink is perhaps the right category, if not the right specific diagnosis.

I want to suggest to the community that maybe, just maybe, they've been participating in a folie à N, a group madness. One from which a part of the whole occasionally tries to escape. Some parody of a therapy session emerges, with people taking on different roles, but no change in the madness happens.


The point about evangelism is well taken. The critic mentioned they didn’t believe in Baha’i themselves - which is even more relevant than whether Baha’i itself is true or not. I also find it refreshing when philosophies I already don’t agree with expend no effort in changing that fact - but I don’t think that choice by itself is particularly noble for the reasons Scott gives.


FWIW, I saw the sense in prediction markets after Eliezer and Linta's current fic portrayed how they'd work in detail (even if fictionalised).

Made it clear how, for instance, they close the incentive feedback loop in scientific funding. The current bureaucratic setup seems even more incredibly broken now, and much more *obviously* so.

Jul 28, 2022·edited Jul 28, 2022

Yeah, to add to the defense of evangelism, Penn Jillette (atheist magician of the Penn and Teller duo) once said something to the effect of "If I were going to be hit by a bus and didn't see it, would you stand there and say 'I wouldn't want to make them uncomfortable'? At some point you're going to tackle me."

I have to imagine the Baha'i faith doesn't really have a concept of hell... But even beyond that (and I do think Christian evangelism can often overemphasize hell), it just makes me think your faith isn't that compelling if you don't have a desire to share it. (The fact that many Baha'i ignore this precept to not evangelize is a good sign though)

Jul 28, 2022·edited Jul 28, 2022

I just want to register my frustration that this post didn't abandon format and call itself "Highlights From The Criticism of Criticism of Criticism of Criticism". Too far is not far enough.

EDIT: Oh, and the quote of Alex's post is broken: the first paragraph is outside of the blockquote-marking bar. And I recognize that this is Carlin's gaffe and not yours, but the religion is called Baha'i.


This reminds me of a bravery debate. Everyone is fighting against a real but different opponent, and gets confused because all the opponents share the same name.

One opponent is "confusing a good cause with the need for expertise and evidence." Nobody wants that.

Another opponent is "scorning innovative work just because attempting something new makes you seem arrogant." Nobody wants that either.

And another opponent is "mistaking superficial work for effective work." Look, another thing everybody doesn't want!

I don't have a solution, alas, except to agree with Scott that specific illustrations help.

Jul 28, 2022·edited Jul 28, 2022

"The EA movement is obsessed with imaginary or hypothetical problems, like the suffering of wild animals or AIs, or existential AI risk, and prioritizes them over real and existing problems"

Two obvious counters to this:

1) I can guarantee you that if you prove to a given EA that a given problem is not real (and won't become real), he/she will stop worrying about it. EAs care about AI risk because they believe it *is* a real problem i.e. may come to pass in reality if not stopped. It's not an *existing* problem in the sense that a genocidal AI does not yet exist, but that proves way too much; "terrorists have never used nukes, therefore we shouldn't invest effort into preventing terrorists from getting nukes" is risible logic.

2) I happen to agree that animal suffering is not important, due to some fairly-involved ethical reasoning. But... it's not "imaginary or hypothetical", any more than "Negro suffering" was imaginary or hypothetical in the antebellum US South. You do actually have to do that ethical reasoning to distinguish the cases; "common sense" is evidently inadequate to the task.


The evangelism/having a discussion boundary does seem to have more to do with “do the people talking to each other have mutually reconcilable value systems?” rather than any intrinsic properties of the words coming out of either of their mouths.

If yes, then the listener can slot the information being spoken into their ears into a corrigible framework, frictionlessly, inception-style—almost as if they had already always believed it to be true.

If no, then some flag will eventually get thrown in a listener’s mind that *this* person is one of *those* people that believes that *wrong* thing, and they’re trying to convince me that *wrong* thing is *true* when it’s *not*.

In this way, literally describing something within the framework of a value system that is incompatible with another can be interpreted as an attack by that other (or, more weakly, evangelism). The crux is foundational.

Having wasted a youth arguing with folks on the internet, I’m fairly pessimistic about truly having conversations with folks that I know to have these “mutual incompatibility” triggers. You basically have to encode your message in a way that completely dodges the memetic immune system they’ve erected/has been erected around their beliefs. Worse, knowing that you’re trying to explicitly package information in a way that dodges their memetic immune systems makes them even more likely to interpret your information as an attack (which, honestly, can you blame them? You’re trying to overturn one of their core values! Flipping some cherished bit! People actually walk around with these bits!! Any wedge issue you can possibly imagine cleaves a memeplex in half around it!)

This will be foundationally problematic for any organization that’s explicitly trying to manufacture controversial change. People don’t want to flip their bits.

Jul 28, 2022·edited Jul 28, 2022

edit: I read too fast and mistook this list for something from a serious blog post rather than just an example for a comment. As a result, this comment is probably overly harsh in its phrasing relative to the target. Feel free to skip it.

I think I was thinking something like "I'd just ignored that list but if Scott is citing it I guess I'd better respond in detail."

Original comment: Yeah let me elaborate why the "paradigmatic criticism" made me scoff:

- "giving in the developing world is bad if it leads to bad outcomes and you can't measure the bad outcomes" so ... don't give in the developing world? Ever? or measure better? measure better how? At least point me at the book to read and give a one-line summary of recommendations to address this, because this is clearly not a recommendation followed by the non-EA charity space anyways.

- "this type of giving reflects the giver's priorities" :very sarcastic voice: really??? charitable giving is decided on the basis of the giver's interests? yeah no shit, it's my money. The whole point of EA is "Do *you want* to do the most good?" This is inherently anchored to the giver's value system.

- "this type of giving strangles local attempts to do the same work" see, I know the examples this is referring to, but this is one case where it would have been worlds better to give at least one example, because as written this is beat-for-beat equivalent to political arguments actually used to abolish literally every social safety net. Stop sucking the government's teat! Starving African ... welfare queen!

- "The EA movement is obsessed with imaginary or hypothetical problems" ... "Stop wanting wrong things for no reason" has literally not convinced any human ever in the history of the planet. I now disagree about the noncentral fallacy - this argument is the worst argument in the world.

- "The EA movement is based on the false premise that its outcomes can in fact be clearly measured and optimized" okay, um, how do I say this, have you read the Sequences? if you can't optimize an outcome, you cannot do anything whatsoever. so like, sure, but absent that there's also no basis for your criticism? How are you saying that the EA charitable giving is *bad*? Did you perhaps model an outcome and are trying to avoid it because it's bad? Yeah that's optimizing, optimizing is the thing you are doing there, as the quote goes, now we're just haggling about utility.

- "The EA movement consists of newcomers to charity work who reject the experience of seasoned veterans in the space" Yes.

- "The EA movement creates suffering by making people feel that not acting in a fully EA-endorsed manner is morally bad" I believe this is called "being a moral opinion", yes. edit: Am I endorsing this? No, I just think it cannot be fully avoided. Moral claims cause moral strife.

And like. Maybe this is uncharitable and the book-length opinions really have genuine worth and value and should be read by everyone in EA. But if they do, none of the value made it into this list! Clearly whatever the minimum length for convincing literature is has to be somewhere in this half-open range.

Maybe submit a book review?

Jul 28, 2022·edited Jul 28, 2022

>So maybe my thoughts on the actual EA criticism contest are something like “I haven’t checked exactly what things they do or don’t want criticism of, but I’m prepared to be basically fine if they want criticism of some stuff but not others”.

This feels like a Motte and Bailey issue. When I read the rules, the preamble states pretty clearly that a wide range of topics is welcome, but the rule minutiae make it hard to address broader paradigmatic issues. The organizers can call the winner The Best Criticism of EA even though they have implicitly limited the potential criticisms they receive, to prevent ones they may be uncomfortable with.

To build off your example, imagine on the next Sunday the pastor comes back and says "We judged 'my voice is too quiet' as the winner of the Criticism of Christianity contest, paid the winner $5,000, and bought me a new microphone". Sure, he asked for criticism and received it, but the overall framing of a contest implies that this was the most important criticism to address, and conveniently an easy-to-address one won. He can say he addressed the biggest criticism while at the same time not addressing the "God isn't real" criticism.


> The universe was thought to be infinitely large and infinitely old and that matter is approximately uniformly distributed at the largest scales (Copernican Principle). Any line of sight should eventually hit a star. Work out the math and the entire sky should be as bright as a sun all the time. This contradicts our observation that the sky is dark at night. This paradox was eventually resolved by accepting that the age of the universe is finite

People still bring this up as an unresolved paradox, which I've never found particularly convincing. But I don't see how a finite age of the universe is supposed to be a resolution. According to this line of argument... why are some stars brighter than other stars? Why is the age of the universe relevant? Are all the stars we can see constantly getting brighter, because the age of the universe is increasing?
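For context, the "work out the math" step the quoted passage alludes to can be sketched like this (my own reconstruction, not from either comment): in a uniform, infinite universe, count the light reaching us from a thin spherical shell of stars at distance $r$.

```latex
% Shell of thickness dr at distance r, with star density n and luminosity L per star:
%   stars in the shell:  4\pi r^2 \, n \, dr
%   flux per star:       L / (4\pi r^2)
% Contribution of the shell to sky brightness:
dB \;\propto\; 4\pi r^{2} n \, dr \cdot \frac{L}{4\pi r^{2}} \;=\; n L \, dr
% Total over all shells out to a maximum visible distance R_max:
B \;\propto\; \int_{0}^{R_{\max}} n L \, dr \;=\; n L \, R_{\max}
```

Each shell contributes the same amount regardless of distance, so the brightness of individual stars cancels out of the argument; with $R_{\max} = \infty$ the total diverges, while a finite age $t$ caps the visible distance at roughly $R_{\max} \approx ct$ and keeps the sky brightness finite. That, at least, is the standard story; whether it fully answers the objections above is another matter.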

Jul 28, 2022·edited Jul 28, 2022

I read the 'evangelizing' comment as being related to some EA practices and found it difficult to imagine it could have been meant to apply to Scott's blog. (though still possible of course)

I agree 'there's no clear line between expressing an opinion and evangelizing' and I also agree that telling everybody about that 'thing' that is so important to you can be misunderstood even if you don't want to convert them ... but I still think it's something like a continuum, with some things (more) clearly on the evangelizing side and some on the 'not evangelizing' side. 'Writing something on your blog that people visit when they want' and 'sending free printed copies of HPMOR to folks uninvited' sure seem different. I'm also not at college, but I lately read a bit of specific criticism of EA's recruitment work, and I can understand why some folks (apparently) could find it cult-like.

Which doesn't speak to the question of whether you find it necessary, or whether it's more or less effective in spreading your ideas.

And I really liked the last sentence.


> but I’m prepared to be basically fine if they want criticism of some stuff but not others.

Fine with me, but it would be useful if the contest were explicit and clear about this. That would have several advantages: people would know what to submit and what they could win a prize for. Contest organizers would have a higher chance of getting what they want. And EA couldn't use one kind of criticism to fend off another, or pride themselves on taking all kinds of criticism when they are looking for something very specific.

A priest who respectfully engages in an open debate with folks who find religion appealing but can't believe in God gets a different kind of acknowledgement from me than a priest who asks how to deliver his sermon in a way to be best heard.


There was no first person to consider abolitionism. There were lots of people who did not want to be slaves. But the grinding poverty of the past meant every political economy on Earth was based on slavery through 1600, though it was a lot less basic to places not invaded recently, like England in 1600. Around 1600 in England, when North African Muslims were raiding Europeans in general and sometimes English sailors for slaves, lots of Brits said this was bad and English should not be raided for slaves. It was taught in schools and preached from pulpits. No Englishman should be a slave. Blah.

And when everyone is preaching blah, some preachers strut their stuff and say blah blah blah. If it sounds studly, other preachers will go along and preach blah blah blah. Not just no Englishman should be a slave, but no Englishwoman. Nobody should be a slave! Over the next couple hundred years it slowly caught on in England. Slowly, because enslaving outgroup was so profitable.

1600's through 1800's British Isles were a perfectly placed pirate base against everyone else in Western Europe. Piracy had a moral hazard, but was so profitable they ended up with a British Empire. Maynard Keynes thought the British Treasury was founded by Drake's piracy. They had to keep it. How to justify the moral hazard? The Black Legend of the Bad Spanish Empire of slavers worked okay. Meanwhile the pirates were taking and trading slaves like crazed pirates, and making bank, enough to shift from raiding to trading, enough to be governed from London. The empire they were deniably building together could point to outgroup's nasty slaving ways, and Brit slavers were ingroup enslaving outgroup.

The 13 colonies of piratical slavers and a lot of British poors wanting a better life prospered and became widely known across Britain as 'the best place in the world for a poor man' (per Bernard Bailyn). When they were poor, London let them handle their own affairs. By the 1750s they were building (I think) a third of the British merchant marine and were worth governing by their betters. George Washington exceeded London's orders (while following Virginia's orders, and supported by Whigs in London's government) and attacked Fort Duquesne, fortified by the French against British (mostly Virginia British) expansion. He lost and was taken prisoner, but was released, not punished by Virginia, and supported by Whigs. The Brits came back and took the fort, now Fort Pitt. Washington had triggered the Seven Years' War between France and England. England won. Now to handle the poors' affairs. The poors liked handling their own affairs.

The colonies revolted and all thirteen fought for eight years of fairly nasty war. Long nasty wars have a high moral hazard and need justifications. The Tory Samuel Johnson, already toasting the success of the next slave revolt in the West Indies, wrote a good polemic against the revolting colonials- 'Why are the loudest YELPS for liberty from the floggers of Negroes?'- and John Wesley stole it. Johnson was happy 'to have gained such a mind as yours confirms me' and the Methodists preached Wesley's patriotic Brit sermons against the colonials and against slavery. For the next hundred years the British Empire preached abolition as a justification for bagging any profitable area that looked easy and, like everywhere, was based on slavery. 'Castlereagh, the name is like a knell' bribed the Spanish Foreign Minister with 100,000 pounds to abolish slavery in Spanish America, triggering a revolt in Spanish America that opened Spanish America to British trade. And ended slavery in Spanish America.

Even the revolting colonials gave up slavery, not least because the moral hazard of slavery made it less profitable as the Industrial Revolution got going. Also the Black Legend of the Evil Spanish Empire helped justify grabbing Florida, and then also the northern wilderness loosely held by New Spain. Everyone has been an abolitionist since.

Not from one pushy evangelist, but from a mix of self-interest and genuine moral choice and a lot of preachers and teachers. Like EA.


Hypothesis: The demand for criticism of EA is larger than the supply of good criticism of EA.


> the trick is evangelizing without making people hate you. I’ve worked on this skill for many years, and the best solution I’ve come up with is talking about a bunch of things so nobody feels too lectured to about any particular issue.

I think this is also related to some of the writing advice you gave in https://slatestarcodex.com/2016/02/20/writing-advice/ especially regarding how to talk about potentially incendiary topics.

Jul 28, 2022·edited Jul 28, 2022

I really like the example of the Priest. But I think the potential criticisms of 'God doesn't exist' or 'buy a better mic so we can hear you better' reflect extremes, and either need additional examples or miss a point.

Again, suppose there is a Priest who wants to see more people attend Sunday service and is also worried that people are leaving. Ultimately he is interested in more people believing more strongly in God. He is asking for criticism and what he can do better.

He's maybe hoping for 'make your sermon a bit shorter' or 'hold service an hour later on Sunday morning and I'd be there'. But instead he may get criticism like 'you're driving this big car while preaching poverty; I don't want to listen to you preaching X while doing Z.' Or: 'I believe in God, but I'm appalled by the cases of abuse that took place in your ranks. Write an open letter to your Bishop to fully investigate those cases, then I'll be happy to attend your service.'

I think the Priest is fine to reject the criticism of 'there is no God' - this is the one thing he cannot give up and still be a Priest. And anyway, those guys will never end up in his church.

The example of 'voice is too low, buy a new mic' found in one of the comments is in some ways extreme in the other direction: the Priest can easily solve this with limited resources; it doesn't require any behavioural change from him, no change in 'paradigms' whatsoever, and also no loss of status or comfort. It's probably not even a criticism he'd feel uneasy about - compare 'you need a new mic' to 'your voice is unpleasant' or 'your sermons are chaotic and can't be understood'. Simple solution and win-win.

But what about the third category? Preaching poverty and love-thy-neighbor while living in prosperity and making use of luxury goods not available to many others? Or not reacting to the cases of abuse in his own ranks? I think those examples are closer to the 'paradigmatic' criticism that we're talking about in EA. It requires real changes in thinking (all the priests I know do it, but is it really okay to drive this big car while preaching poverty? Am I allowed to criticize a bishop?) and behaviour, and it risks losing the support of other important members of the organization. While not giving up on what is the (most narrowly defined!) core of the issue.

I would argue that those are the criticisms the Priest should hear. Or more precisely: I think it's the Priest's decision to ask for improvements to his sermon only and implement them. Arguably that's already more than what most priests are doing. But I think it's a missed opportunity not to listen to the complaints about his affluent lifestyle and the apparent 'sins' in his own ranks. Especially when you care about people coming to your services and believing in God.

As mentioned, I think those examples are closer to 'paradigmatic' criticism of EA. And I think they are worth being heard. Especially if they come from folks being close to the (again, most narrowly defined!) core value of EA.


I'm intrigued by the idea that wild animal suffering is hypothetical or imaginary. Do wild animals not exist? Are they all living idyllic heavenly lives without suffering? You don't have to care about animal suffering I guess, I mean I still eat meat so clearly I don't care that much, but imaginary just isn't correct.


Tbh I think the term "evangelize" is the combat meme du jour. It's exacerbated by cultural propriety around online safe spaces that varies between platforms. So you're accused of evangelism either because:

A. Someone feels their community bounds have been violated.

B. Someone feels personally attacked by your views and rationalizes that by externalizing it into “my community bounds are being violated.”

Reddit and Twitter accuse people of evangelizing a lot because of a need to guard very porous cultural borders with big dramatic virtue signaling. It serves the dual purpose of keeping you "out" of their safe space and warning others like you away from it. People who come onto *your* platform and accuse you of evangelizing are just doing the very human thing of believing they are the only ones entitled to ideological borders.


Y'know what's odd? When I criticize pop culture or our news coverage or whatever, I'm definitely not lying. I genuinely think that the way we process narrative in modern society is unhealthy to the point of doing massive social damage. But also I enjoy doing it. It makes me feel like I noticed something others missed (even when my point is trite and cliche). Criticism always elevates you - at the least, it makes you seem smarter than the thing you criticize.

Zvi's a careful thinker, and I don't think he just wrote a criticism he doesn't believe to write one. But he's also clearly reveling in the recursive nature of this discussion, the fact that there are deeper levels to explore, and also he's hoping to win a contest. To some degree criticism is a form of entertainment - it can show mastery of a subject in a way that a straightforward statement of principle can't. And the more abstracted that criticism is, the broader the mastery feels.

When I read your discussion of psych's "woke" criticism I thought to myself, "That's different. It's just virtue signalling to people who already agree with it all." But I'm not sure it's different. It's possible, I think, that broad criticism signals in-group prestige a lot more easily than the difficult work of building out specific ideas. Especially when I can then just say "oh your specific idea is just a new version of this general, unfalsifiable trend I already discussed. Checkmate, people who try to do things." (See, e.g. TheLastPsych).


>I think there’s something where whenever a philosophy makes unusual moral demands on people (eg vegetarianism), then talking about it at all gets people accused of “evangelizing”, in a way unlike just talking about abortion or taxes or communism or which kinds of music are better

The difference is this. If I’m going on and on about being vegan, my non-vegan listeners see me as setting myself up as their moral superior, someone who rejects their lifestyle in favor of my own more virtuous one. So I’m viewed as arrogant, judgmental, and condescending. But if I’m going on and on about how great John Coltrane is, then worst case I’ll be seen as a slightly boorish fanboy, and best case as a friend who just wants to share some cool music. It’s hard not to see someone regularly touting the virtues of EA as similarly putting themselves on a moral pillar, in a way that an economist defending some wonkish tax policy is not.

Jul 28, 2022·edited Jul 28, 2022

[Re-using a nested reply because it works as my main reply]

As a very long time reader of the blog, I pull Scott. EA pushes me. Baptists push me. US Government actually doesn't push me, except maybe on tax day or when I'm driving 90mph in a 70mph zone.

Sure, maybe there's slick pushing and there's blunt pushing and the former manages not to feel like pushing. Sure, maybe the oncoming bus push or the abolitionist push are worth it, but it's a really, really high bar to clear.

Scott, it's ultimately as simple as that: who pushes, who pulls.

Since a few EA topics are probably abolition level to you, they probably clear your bar. Since they're not to me, they don't clear my bar.

Edit: Comments are all evangelism. My comment was evangelism. Conversation is evangelism. I should have added this exception to my original post: I grant that a certain amount of positional conversation is necessary, tolerated, or even welcome. Having a comments section on a post is asking for a certain amount of positional discussion, and that's cool up to a point. But, note! Even in your comments section, some people have created a general consensus that they push certain specific points too much too many times.


IME Baha'i people really are very good about not evangelizing. They do also tend to be hippies who make a lot of anomalous and possibly detrimental life choices, but they seem fairly happy.


I see Scott's point, that EA, unlike Baha'i, is doing a lot of urgent and important work, and their message must be spread ASAP. I also see Matthew's point, that I almost always feel repulsed by people trying to push an ideology down my throat, however important they claim it is. I pay Scott to read his views. Hence, I don't feel the same kind of repulsion when I read his posts.

Hence, perhaps the only way to evangelize about important ideas without being repulsive is to first become so widely respected that people ask/pay to hear your views.

Jul 28, 2022·edited Jul 28, 2022

>This is actually seriously my point: there’s no clear line between expressing an opinion and evangelizing.

I think the correct answer is to bite the bullet and admit that, yes, everyone who expresses opinions publicly with the intention of changing other people's opinions is evangelizing, either implicitly or explicitly. BUT one can draw a distinction between different qualities of evangelizing: Am I evangelizing politely or forcefully? Am I evangelizing manipulatively by misrepresenting my arguments or intentions? Am I evangelizing my idea that "all publicly expressed opinions are evangelism" too forcefully, to the detriment of other ideas I also implicitly want to evangelize by writing my reply, such as "maintain norms of charitable argumentative discussion"?

It is not like it is impossible to have opinions and not evangelize: one can keep one's opinions private, revealed only to chosen people on explicit request.

One can step up to minimal evangelism by imparting publicly only the minimal amount of information necessary for the improvement of mankind. An EAian whose message is "it is a good idea to weigh the effectiveness of a charity by experimental methods" is imposing their opinion less than another who tells everyone their opinions on animal suffering, human suffering, optimal sexual behavior, AI risk, and the particular best charities. And a third EAian who restricts themselves to saying only "the experimental method is useful in many domains" is imposing their opinions even less than the first one (because the listener may then think for themselves what the application should be).

Jul 28, 2022·edited Jul 28, 2022

"'The Anti-Politics Machine' is standard reading in grad-level development economics (I’ve now had it assigned twice for courses) -- not because we all believe “development economics is bad” or “to figure out how to respond to the critiques” but because fighting poverty / disease is really hard and understanding ways people have failed in the past is necessary to avoid the same mistakes in the future. So we’re aware of the skulls, but it still takes active effort to avoid them and regularly people don’t. "

No. I'm not going to accept this argument. Consider its elements:

- Experts are familiar with this criticism.

- Experts use this criticism to avoid future failures.

- Experts' failures should be excused because our job is hard.

- Experts cannot be replaced with regular people.

I don't care. This argument can fit any number of fields, from social science to scapulimancy. It is not evidence that your field has any intrinsic value.

Show me the money. What has your discipline achieved? Does your discipline show evidence of a cumulative growth of understanding? Or is it merely chasing one intellectual fad after another? Do basic foundational questions in your discipline remain unresolved over decades, without one faction mustering sufficient empirical evidence to conclusively put the question to rest? Does your discipline act as a priesthood for a political faction, manufacturing justifications for political issues as needed?


Does the EA movement have some FAQ where they respond to Alex's comments (#7)? Because he pretty much listed all the reasons why I don't donate to them...


EA is a counterproductive force that wastes rationalist talent and causes active psychological harm to the compulsively scrupulous. Altruism is an inherently self-contradictory concept, you can't cross the is-ought divide, the only rational outcome of an analysis of altruism is well summarized already... https://en.m.wikipedia.org/wiki/Moral_nihilism#The_scope_question


Was the thumbnail image: "DALL-E, mindfuck me with a kaleidoscope please"?


I think the central reason Baha'i do not evangelize is that they believe there are a fixed number of Baha'is over time. This was explained to me by a friend when I was living in Israel and visiting a Baha'i village; I am not Baha'i though, so take it with a grain of salt.

When one Baha'i dies (the story goes), another is born in a Baha'i family or is converted. If you're born Baha'i, you're always Baha'i, there's no way out as far as the counting goes.

This presents a paradox: populations are growing with time, and if there's a way to become Baha'i but no way for anyone to leave, then any convert will seem to violate conservation of Baha'i.

How do they get around this? I kid you not, the story is that there is some large, undiscovered island with many, many Baha'i, whose population is decreasing with time. So there's a sense in which converting someone to Baha'i is like killing one of the Baha'islanders (I'm not sure if it's phrased quite so starkly by them).

Wild stuff!

Expand full comment
Aug 1, 2022·edited Aug 1, 2022

#8 about evangelizing got me:

1. Yes there are grey zones, but there are some things that are (in my opinion) not evangelizing:

- answering direct questions about my opinion

- participating in a conversation/discussion/debate that everyone involved is enjoying.

- just living/acting on your beliefs without disturbing others more than necessary.

2. So according to 1, Matthew was not evangelizing; he was even explicitly asked to explain why he was skeptical not only of the morality but also of the effectiveness of evangelizing. It just seems like it triggered something in you, Scott.

3. I don't know if or how EA is evangelizing, as I know it only from this blog. But I fully agree with Matthew about evangelizing in general; it fully reflects my life experience. It even seems a universal principle that any push or force provokes defense (flight or fight), and that any fleeing or hiding makes others curious and invites them to follow. This applies not only to humans, but can also be observed in many animals. There is even a whole school of taming horses by slowly following them in a closed area so they flee from you, and then backing off yourself, causing the horse to turn and come to you.

4. When talking about 'philosophy makes [...] moral demands on people', the difference is whether you make these demands of your audience. To stay with your example: I've been vegetarian for 25+ years now and was never accused of evangelizing. I don't make a big deal out of it, just stating the fact when I'm asked or offered some meat. When people ask me why, I explain my case without implying that everyone should be vegetarian, or even explicitly state that this is everyone's private decision, which I respect. This way fewer people will hear my arguments, but the people who do hear them are much more open and let them sink in. Additionally, this does not build up biases and negative stereotypes against vegetarians, or against people making moral decisions in general. The few people who still get triggered either have had bad experiences with evangelizing vegetarians or have a bad conscience about eating meat, which is not healthy; I would like to help them live happily, however they continue their lives.

5. Even if it is something really relevant, like hell or the super-plague, it doesn't help to push hard. People will just be annoyed, call you a conspiracy theorist or a freak, and won't listen but avoid you. It could be much more effective to just give hints and make people curious, so they ask you and really listen. (Some say this is one of the reasons why QAnon is so widespread: they only give hints and make people research and put the puzzle together on their own. The resulting picture is different for everybody, but this doesn't matter much, as these things are hard to verify anyway.)

In the case of hell, you have to respect everyone's decision, as one of our main gifts from god is our free will. In the case of the super-plague, you can make a plan of action with the people who listen and are willing to join you. If all fails, nobody will judge you if you tried your best, so you also shouldn't judge yourself.

6. You won't convince me to see prediction markets as a solution to relevant problems. I suppose reading about them would be more interesting if you didn't try to convince me. But as I'm here by free choice and don't have to read every article, everything is fine; thanks for your great work ;-)

Expand full comment

Bit late to this whole party, but:

> I don’t think a pastor who asked for criticism/suggestions/advice about basic church-running stuff, but rolled his eyes when someone tried to convince him that there was no God, is doing anything wrong, as long as he specifies beforehand that this is what he wants, and has thought through the question of God’s existence at some point before in some responsible way.

I wholeheartedly agree. I also think that if the contest was intended to be taken this way, it should have said so explicitly, and not through the subtext of its fine print.

(I also loosely endorse the comments that TM made before me on this subject.)

Expand full comment

I think the biggest difference between evangelism and persuasion is how "load-bearing" your ideology is.

Going back to religion for a second, before you start believing that it's important to save people from Hell, you first have to believe that:

1. God exists (with all the implications that brings)

2. Life after death exists (ditto)

3. Hell exists (ditto^2)

4. People end up in hell after they die if they don't believe in God or follow his principles

5. You have the right interpretation of what God and his principles are

6. You're sufficiently persuasive or charismatic that you'll be able to persuade people to believe in God and thus avoid hell instead of just annoying them (or you feel you can bear the social cost)

That's a lot to swallow. Try to force all that on someone at once, and they'll probably choke.

Contrast that with GiveWell, an EA organization that I think actually does a good job of being persuasive. Their argument (at least for their top charities) is something more like this:

1. There are a bunch of awful diseases in the developing world that cause people immense suffering, often death.

2. A lot of these diseases have pretty cheap preventative treatments (e.g. vitamin supplementation) or tools (e.g. bed nets).

3. Problem is, people out there are so poor that they can't afford these treatments.

4. But you can. You can buy one of these treatments for (usually) double-digit dollars or less.

5. We estimate that on (very rough) average, if you buy X thousand dollars of this and spread it out over Y hundred people, you'll likely have prevented at least one person from dying (or at least several people from suffering badly).

6. Don't believe us? Here's some in-depth profiles of exactly how we did our math to prove it, as well as information about the rigorous amounts of data we ask from our charities.
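The back-of-the-envelope arithmetic in step 5 can be sketched in a few lines. All numbers below are made up purely for illustration; they are not GiveWell's actual estimates:

```python
# Hypothetical illustration of "X thousand dollars over Y hundred people
# averts roughly one death" -- every figure here is an assumption.
cost_per_net = 5.00               # assumed dollars per bed net
nets_funded = 2000                # nets bought by a single large donation
deaths_averted_per_net = 0.0005   # assumed marginal effect per net

donation = cost_per_net * nets_funded                   # total dollars given
deaths_averted = nets_funded * deaths_averted_per_net   # expected deaths averted
cost_per_life = donation / deaths_averted               # dollars per death averted

print(f"${donation:,.0f} donated -> {deaths_averted:.1f} expected deaths averted "
      f"(${cost_per_life:,.0f} per life saved)")
```

The point of the exercise is that the argument runs entirely on multiplication over empirical estimates; no step requires the donor to accept utilitarianism first.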

Notice something? Even though GiveWell is founded on utilitarian ethics, there's barely any utilitarianism anywhere in that argument, aside from maybe the (optional) last step. You don't need to swallow utilitarian theory whole in order to be convinced.

The only time utilitarian theory might be important to your decision-making is if you're considering donating to GiveWell *in lieu of* another charity. That's when you have to start making the arguments about which charity does the most good.

But that situation is often irrelevant. The average person doesn't give to charity *at all*, even people who are well-off and could afford it. They don't need to trade off one charity vs. another. They just need an argument that's good enough to justify donating some of their spare money, and a reasonable assurance that they're not being scammed or misled.

By grounding their work in the results, rather than the ideology, GiveWell opens itself up to people of all sorts of moral codes, so long as that moral code allows for something like "It's good to prevent people from suffering or dying of horrible, easily preventable diseases."

So what does this mean for other more speculative EA cause areas, like AI safety? Probably that people need more convincing, concrete evidence that it's actually a near-ish-term problem. Less "Think of all the trillions of people that may never live" and more "Look, GPT-whatever has already been proven to have X or Y failure mode, here's a convincing paper to show it. That's not a problem now because this thing is small, but if you scale it up and put it in charge of some important part of business or government it's going to end up causing this kind of problem or worse on a much larger scale."

(This also may explain why so many people are concerned about things like AI racial bias as opposed to bigger problems like superintelligence. It's a lot more tangible, and is already causing us some small-to-moderate problems.)

Most people do not "shut up and multiply". In fact, they're resistant to it: the whole "torture vs. dust specks" thought experiment where that was coined is notoriously controversial even among the Less Wrong crowd. Making that the load-bearing part of your argument is arguably evangelism, and is also probably setting yourself up to fail.

Expand full comment
Aug 5, 2022·edited Aug 9, 2022

This is my response to your post: https://twitter.com/LinchZhang/status/1555007124949704704

Expand full comment