19 Comments

Worth noting that there are precedents for some of this in rationalist- and EA-adjacent spheres, especially in some of Benjamin Ross Hoffman's writing; a post that pushes back on some of what you've said or implied is 'Why I am not a Quaker'. Link: https://www.lesswrong.com/posts/6XvnqW28e2twiv6ww/why-i-am-not-a-quaker-even-though-it-often-seems-as-though-i


A different response (this is mine, nothing to do with Hoffman) is that Quakerism 'got it wrong' on the biggest moral question of the 20th century: the Second World War. We should be *scared* of a movement that confronts such a monumental issue head-on and comes down confidently, squarely, and with few reservations on the side (viz., pacifism) that we today regard as having enabled evil. Camus wrote of that era: 'These are the moments when everything becomes clear, when every action constitutes a commitment, when every choice has its price, when nothing is neutral any more. It is the time of morality, that is, a time when language becomes clear...' Language becomes clear when facing Nazism, yet Quakers could not see that clarity.


While the morality of fighting World War 2 is definitely settled inside the main Overton window, in the context of minority viewpoints like EA and Quakerism, with their strong commitments to reasoning out morality, I don't think it's necessarily settled.

The atomic bombs, the military-industrial complex, the permanent militarization of the US, the war on terror, and Manichean justifications for numerous immoral political actions, coups, and wars are all consequences of the activist position on World War 2. It's at least debatable whether we're free and clear of the shadow of that war, or in the black as far as our choices go.

A worthy book that's *not* outside the mainstream Overton window is https://www.goodreads.com/book/show/1948985.Human_Smoke . It details some of the lost opportunities for pacifism and level-headed thinking in the 30s and 40s, and is worth a read if one wants to really test one's commitment to the "WW2 was right" position.


Yeah this is a great point. It's related to my point that they shouldn't run nations, which is a notable limitation. I wouldn't want to exist in a Quaker world, but we are overall absurdly enriched by their contributions. Being pivotal in the industrial revolution and the ending of slavery (amongst much else) is a legacy very few movements can point to.

In other words, we can find usefulness and emulate without directly copying (I do not want EAs to decide as a whole group that all conflicts are verboten - and I know a few who, for example, funded Ukrainian arms for a while). We should improve on the blueprint - not copy it blow for blow.


It's hard to be a successful minority. Everything points towards growth or decline. Something that "only works when we're a minority" runs into this problem eventually.

That isn't to say it's useless, but it's hard to acknowledge that you can't grow too much without entering into decline. Especially if a universalist ethic is your justification.


I'm not quite sure what you mean by "you can't grow too much without entering into decline". Could you outline a little more what you mean by this?


The traditional Christian gospel is one where the message and its actualization are accessible and gifted to everyone. In theory, if everyone followed Jesus's teachings correctly, the world would be a utopia.

But the message above is that EA is something that only a select group can get, and even if more people could get it, it probably wouldn't scale. Much like Quaker pacifism: it's great so long as someone else exercises the monopoly on violence for you (the same was true for early Christians).

It's very difficult to be in a position where you need to constantly convert people, but not too many people, to a message you think doesn't really work if too many people believe it. Especially in the modern era, where groups like Quakers (and, I'm guessing, EAs) appear to have low fertility, meaning you're probably going to decline without constant conversions.


I don't think it's long before EA starts emphasising having children (indeed there are moves that way already).

I also think it's perfectly sustainable to have a belief that converts others, has lots of children, but still doesn't become universal. Mormons, most sects of Protestantism, and indeed Quakerism have survived and flourished for centuries without being the religious hegemony. You don't need to worry about "not converting too many people" for quite some time (if at all), as it would take a long period indeed for EA (or any belief) to become the central, influential or defining belief for a particular (or particularly important) state.

I'm also uncertain that EA is *definitely* not scalable to leadership etc. - my point is that EA is unlikely to have to consider the ethics of that for some time, and can effect good change even as a minority group.


+1 on the point about universalism.


"No. I am unsure that a state run by Quakerism could survive - much like Constantine in his conversion to Christianity, the theory of just wars was necessary to defend a state, and Quakerism would have to come to terms with such compromises - in other words, would have to become less Quaker in order to sustain a state"

You might be interested to know (and even edit in) that there are several substantially different branches of Quakers in America, and that Richard Nixon hailed from one of those branches. It adds some anecdotal evidence to support your conjecture, because, yes, a Quaker has "run the west" in some sense, and he was already compromised before he got there, and he further compromised once he got there.

You might find this book illuminating: https://www.goodreads.com/book/show/1889380.The_Quakers_in_America

Also, I'd say the idea that Quakers simply talked other people into good actions is a motivated or reductive reading of history. And the Quakers would have very sharp disagreements with EA consequentialism and utilitarianism, so it's a strange one to attach to EA.

But I think you captured the basic comparison that Quakers were peaceful, thoughtful, seriously committed and idealistic, and I enjoyed finding out that Cadbury is a Quaker company, so thank you!

[Epistemic status: I'm not a Quaker. I'm just a guy who reads about them. Also, I have mixed feelings about Richard Nixon, including some positive ones to go with the negative, so this isn't an attack on Nixon.]


Really interesting points, thank you!

By "run the West" I more specifically meant that I don't think Quakerism could survive uncompromised as a strategy for a state, in the same way as a dominant religion like Catholicism or Anglicanism has. Nixon being a Quaker is a good example of Quakerism's ability to attract successful people, but - as you say - Nixon's Quakerism is very difficult to unpick (and I'd like to learn more about it, in all honesty!)

Re. attaching Quakerism to EAs: I'm not saying that EAs should adopt Quaker theology, but rather that they should consider adopting Quaker practices and behaviours. EA and Quakerism are not the same in beliefs, but they are meaningfully similar in ways which make Quakerism a close corollary to EA (as I outlined above). This allows EAs to look at a movement that was successful across various domains and seek ways to emulate it - including in ways that might not look, a priori, particularly important (is silence in meetings, for example, of some value we currently don't consider?)

I think, particularly, that EA is one of the first strong examples of a belief system that we wouldn't classically consider a "religion". However, there is a vast amount of discussion on how the community should move away from religious practices - I disagree. It seems to me that history provides us with a lot of examples of successful belief systems, and they have very particular similarities. Given this, if EA wants to survive into the long term, it should look at what it looks like when humans believe things in groups (religions), work out which of those groups looks most like EA (I think Quakerism is a very good candidate), and emulate some of their practices/behaviours.


Thanks, likewise!

I should have done a better job explaining that Nixon was a sort of "by-extension" example; if an individual Quaker was compromised when running the biggest thing, it gently suggests that Quakerism might not survive being the dominant system, but I admit it's a very weak bit of evidence for what is already a very intuitively appealing thesis. Still, examining it in detail is useful for understanding on a deep level how the religion actually interacts with the real world.

I got what you were saying about EA adopting Quaker methods rather than Quaker beliefs, but I think I was addressing that. The shallow isomorphism seems very nice, but I think it's not clear that the practical Quaker choices survive coherently when separated from their religious motivations, nor that they fit in a deep way into EA, and I believe it would take a deep examination of differences to allay this concern.

For instance, Quaker pacifism is very close kin to non-evangelism, and indeed the most stereotypically Quaker strains of Quakerism were not very evangelical at all, whereas EA is *very* evangelical. Maybe evangelism is incompatible with the sorts of epistemic humility and practical universal respect that were at the root of Quaker theology (I personally think it is). In that case, these things wouldn't fit EA as well as they seemed on first glance, and adding them might cause a lot of strain.


That's a really interesting point, thank you. EA is certainly evangelical, and it would be harmful to step away from that. It is a genuinely very interesting question how much evangelism would conflict with other Quaker practices. Or - perhaps - it is worth looking at how the Quakers evangelised, because they certainly did recruit in numbers, and particularly recruited impressive, productive people.


Because EAs tend to believe they are just "following the science", there is a huge risk of ideological purity spirals. I know a lot of people who "followed the science" during COVID to a total disaster. You can complain all day that they didn't follow "the correct science", but even for someone trying their best, I think it's hard to know the correct "science". It would be better for them to think that a portion of their deeply held beliefs were actually "faith", so that they would be more humble about them.


I am unsure that EAs would claim that they are 'following the science'. EAs are pretty open about how much of their reasoning is formed from intuition, as well as from data and research. I don't think this transparency of baseline assumptions, reasoning and analysis is best described as faith, though I can see that some aspects may be closer to that than others (certainly, the more intuition or uncertainty, the closer it is to reasoning from something akin to faith). I don't think, however, that they should be more humble - most EAs I've met are very humble, giving, interested and smart people who just want to improve the world as best they can. Seems we should encourage that!


I probably don't know as many EAs as you, and I'm not one myself. So maybe my definition is just off. But when I hear EA I think:

1) Radical Universal Utilitarianism

2) Malaria Nets as quintessential example of that

3) Making as much money as possible and donating all but the absolute minimum needed for your survival to malaria nets (the radical part)

I would say that Universal Utilitarianism is the vague background ideology of the current elite, but a mixture of genuine doubt and human weakness keeps them from following through to the conclusion above.

EAs, I think, tend to emphasize that through math or reasoning they've figured out how to perfect this ideology (buy malaria nets, not some other dumb thing). I think such insights can be really good, but I'm not convinced this is a slam-dunk change in reasoning.

Take something simple. Why are malaria nets utility maximizing? Maybe they will just lead to overpopulation and suffering. Earlier generations of utility maximizers came up with very different answers, and it's essentially on faith that we've rejected them as a violation of human rights.

Why donate nearly all of your income? People donating nearly all of their income can't really afford to have kids. Smart people not having kids is probably bad for utility in the long run.

You're sort of taking on faith that if you solve the immediate problem (malaria) that the rest will sort itself out and that this is the best possible use of resources.

And that's fine. In the case of malaria nets, it's about as safe as it can get.

But the bigger EA gets, the more it's going to move out of the realm of malaria nets. And that's where I see the danger.

Let's say I'm an EA and I'm existentially worried about Climate Change. In fact climate change is more important than malaria nets. It's so important in fact that I feel fully justified in supporting radical political action that forces my view on everyone. This is justified on utilitarian grounds of course, I've done The Science. I dunno, maybe that person is right. Or maybe they are wrong and think they are right. Kind of like the CDC.

Maybe all the EAs you've met are humble about this, but generally I'm skeptical of people that think they have a novel moral code and a superior reasoning ability that nobody figured out before.

I'm kind of a boring guy on this. I think the historical 10% charity is probably optimal. I think you should mostly focus on local problems and people you probably understand. I'm very suspicious of top down attempts to fix systems, no matter how much you think you've figured it out. I like Chesterton's fence. I think investing for profit has a track record of doing immense good and that's a fine way to advance society.


I think you're right that utilitarian calculations can land you in bad places. However, I don't think this is really a cause for concern for EA at the moment (though it may be in the future - though I do reserve my concerns here, as the beliefs of most groups can be construed or steered onto bad or ineffective paths, or just have high negative externalities).

I think the "earn to give" side of EA has declined recently, in part due to realising that jobs that previously were considered "ineffective" can actually be wildly effective at crucial moments (like policy positions etc).

I do agree that EA risks bad reasoning, and bad actors, the more it expands, but that is part of the cost of expansion. The best you can do is to reduce negatives by encouraging good reflection, open communication and a general willingness to understand and take on board criticism. This is something EA excels at compared to most social, political or religious movements.

I'm not 100% sure that EA is about top-down solutions, as you seem to suggest. Indeed, I'm also much more pro market changes - I just think it's generally pretty amazing that we can work out how to most effectively spend £1 or $1 to save or improve the life of a child somewhere in the world (for instance). It's pretty incredible that we give away 10% of our income, have real positive effects, and spend the rest of our time trying to effect better changes where we see low-hanging fruit (be that policy, or start-ups, and so on).

I'm unsure why giving locally is preferable to doing so internationally. I confess I am not a Malthusian and do not believe overpopulation is a risk - I take the general view that it's probably good to save the lives of children we can save, and that we should do that. I'm happy with a 10% rate for charitable donation to do that, and I think that's fine practical reasoning, justified through historical practice (but refined with more solid, transparent reasoning).

You are right that there is always a risk of ill effects, but that is just to say that "we can't accurately model every outcome". This is true, but we can make good estimates (and this criticism applies as much to "give locally to people you understand" as to any other solution). I think this practice seems totally acceptable as it is, and I'm willing to bet (through my donations) that it'll have good immediate effects, and likely good long-term effects. In the rest of my time, I hope to be productive and to advance society in other ways as well - which sounds like something you're also hoping to do - which is very cool!


addendum: thanks for the book recommendation!


The monstrous god of Quakerism shows no altruism towards non-Christians.
