8 Comments
Manuel del Rio

Loved the post. And, unironically, I am a person who can stake a bona fide claim to calling myself EA-Adjacent: I first discovered EA *as a consequence* of the FTX scandal (in fact, I learned the basics by perusing Caroline Ellison's old blog). Since then, I've engaged more with its theory (read Singer and MacAskill, joined CEA's The Precipice reading group and its Introductory EA program), posted a little on the EA Forum, met some EAs in real life (few; there aren't many in my region) and a few more online, and got to donating 5% of my income to effective charities. I would still describe myself as Adjacent, though: I have really big intellectual qualms with Utilitarianism, have never been to an EA conference (and don't think I ever will go), am not much interested in vegetarianism, and am not planning to revamp my career (I'm forty-something anyway).

I can see why EAs would have had such a traumatic experience after FTX. I also remember that the months after that saw a fairly steady stream of further bad news: newspaper hit pieces, a couple of harassment controversies, and, about a year ago, the OpenAI board drama. None of that has changed my basic appreciation of the average EA as an honest, well-intentioned, socially clumsy, mathy and nerdy do-gooder, and of the EA movement as a net positive for the world. And I share your argument that they should be really straightforward, for both ethical and pragmatic reasons, the pragmatic one being that they are generally really bad at lying effectively (in which case trying to act like a corporate glibmouth will be self-defeating; if you want to be a liar, you'd better make sure you'll be good at it).

Noah Birnbaum

Sad that this had to be written, but it did.

Nathan Young

I agree that much EA comms is mediocre and they'd be better off either being ruthlessly honest or savvily pragmatic.

Nathan Young

I feel mixed. I don't think the comments are accurate, for the reasons you state, but I don't think they are literally untrue.

To me they seem to miss huge amounts of context. I assume that's on those who gave them, but I don't know. It seems possible that Amanda gave an honest quote, knowing they knew she used to be married to Will, and they framed it in the worst way possible.

Nuño Sempere

"EA" is ambiguous as to whether it's a community, a question, or a set of philosophical positions. But as the word is actually used, it seems to ultimately refer to a specific community and a specific set of leaders and organizations. After leadership fucked up the EA community & brand, that brand, owned by unelected leaders, doesn't get to envelop a person just because that person shares its values, when the person has come to not endorse the institutions and has stopped wanting to spread the memeplex (as it exists in practice, embedded in fallible human institutions). I see your argument as arguing for why the EA community can "steal the valor" of, e.g., Amanda Askell, and I don't agree with it.

In this sense, EA and EA-adjacent seem similar to "Catholic" and "Christian"; having some way to differentiate seems valuable.

This is also complicated by Open Philanthropy essentially not wanting a term for the Moskovitzsphere, and EA filling the gap for that term, in a way that ultimately seems reasonable as well.

Compare with a political party where there are primaries. In Spain, many people voted for Pedro Sánchez and thus were causally connected to him doing various surprising and unpopular shit. Not so with EA leadership (see, e.g., https://forum.effectivealtruism.org/posts/GcvEdYJADH3vMqk3F/suggest-candidates-for-cea-s-next-executive-director, where leadership was chosen by a cabal rather than by popular vote).

Matt Reardon

I don't think the community can or should steal Amanda's valor or the valor of similarly situated people. I think those people – all of whom understand both the ambiguities you've pointed to here *and* the likely-maximally-inclusive functional definition most people adopt for EA – should take the two sentences it requires to be clearer about the facts of what they've done and said in the past. It's good for all involved. I'd be perfectly happy if they said "but fuck Matt Reardon's EA-ass in particular" at the end.

Nuño Sempere

> likely-maximally-inclusive functional definition most people adopt for EA

I disagree that most people adopt a maximally-inclusive functional definition for EA.

Matt Reardon

In my opinion, a random outside journalist probably just wants to know "do you run in those circles?", and it seems like the answer is mostly yes.
