So some folks who are in the Effective Altruism movement, & horrified by its recent behavior, wrote a piece about how EA can fix itself.
Reading through it, all I can think is “Ohhhh yeah. I remember this stage of trying to break out of a cult.” 🥲
Here’s the Thread Reader link so you don’t need Twitter access; it’s a good read:
Thanks! I’ll copy it here as an extra measure.
Dr Sarah Taber (@SarahTaber_bww), 2023-01-22
So some folks who are in the Effective Altruism movement, & horrified by its recent behavior, wrote a piece about how EA can fix itself.
Reading through it, all I can think is “Ohhhh yeah. I remember this stage of trying to break out of a cult.” 🥲
Doing EA Better — EA Forum: “PREAMBLE It’s been a rough few months, hasn’t it? …” https://forum.effectivealtruism.org/posts/54vAiSFkYszTWWWv4/doing-ea-better-1

Folks, if a group you’re in:
-Promises that only it can save you from existential threats (hell, extinction, etc)
-Has a few prominent leaders at the top who control large flows of cash & behave badly,
-Shuns you for pointing out those leaders’ bad behavior,
-That’s a cult.

If you feel so disgusted by this group’s behavior that
-You feel compelled to write giant thinkpieces about Fixing It From The Inside,
-But you also write it anonymously because you know the group’s leadership WILL destroy your career over it,
-That’s a cult.

And I say this with all the love & sympathy. It’s a very rare cult that actually meets the pop culture stereotype of chanting people in robes.
(I say this as someone who was born into a pop culture stereotype cult with chanting & robes; that’s weird behavior even for a cult lol)

Cults aren’t limited to religion. The world is full of co’s, nonprofits, frats, military units, academic institutions, etc. that use cult tactics.
If you’re in a group that makes you feel shitty & you don’t know why, the BITE model is a great resource!
Steven Hassan’s BITE Model of Authoritarian Control: https://freedomofmind.com/cult-mind-control/bite-model/

“But EA & longtermism make some good points about the need for long-term thinking!”
- Yes. ALL cults make a few good points. That’s how they get you! You can’t get people to sit down for a 6-course bullshit meal without good appetizers.
- Long-term thinking isn’t a new idea. It’s a normal thing for ppl to do. The main obstacle isn’t stupidity. It’s wealth inequality.
You can’t fix entrenched, multigen’l inequality w philosophical movements or charity.
You do it by taxing oligarchs & enforcing the rule of law.

I just think it’s kind of funny how EA champions “Let’s just do philanthropy better!” as the solution to humanity’s woes.
Can’t do philanthropy without oligarchs to sponsor it!
So yeah the reason EA behaves like a top-down, ideological control movement is because it is. The fact that EA props up a few pestilent, powerful gatekeepers isn’t a bug. It’s a feature.
EA behaves badly bc it wants to. You can’t fix that from within. Or at all.
There is no reforming a cult. There is only taking your sincere desires & acting on them somewhere else. And it’s painful because lots of folks got interested in EA for sincere reasons.
That’s not because you’re stupid. That’s because that’s how cults operate. They take the best parts of us [genuine concern for the future] and hijack it to serve leadership’s financial interests.

Remember when the scandal broke about the MIT Media Lab taking money from Epstein?
I spoke with a lot of MIT researchers in the aftermath. Their experiences resonated a LOT with my own, breaking out of the reactionary church I was raised in. There’s no such thing as being too smart to fall for a cult.
We’re all fundamentally just monkeys who are freaked out bc we know too much [that we’re going to die someday]. That gives us totally normal existential fears that are very, very easy to exploit.

(Beneath the rationalist window dressing, EA’s whole schtick is essentially identical to Christianity’s. It promises collective & personal salvation from death.
What do you think that “bajillions of consciousnesses uploaded into AI to live forever” schtick is about. Come ON.)

(Few of the world’s religions are as obsessed w the individual’s fate after death as Christianity is. That’s a very Christian thing. EA’s whole moral value proposition is extremely, extremely just “hellfire & damnation Christianity rebranded for atheists.”)

Just in case it isn’t clear, it is valid & accurate to be concerned about humanity’s future.
EA just ain’t the vehicle to address those concerns. It’s a remora on the shark of existential terror.

As this paper points out, EA is also good at redirecting all criticism into endless philosophical debates re: utilitarianism.
That’s a key cult-leader tactic: confine conflict into arenas that leaders can control & win. It’s not hard to keep an inaccessible niche high ground.

Why does EA control arguments? Again, the best way to guarantee we’re trapped in short-term decisionmaking is to have society run by a few wealthy people at the top.
In other words, the top existential threat to humanity is
billionaires.
The very people EA props up as saviors. EA is an ideology created by & for wealthy people to position themselves as the solution to humanity’s problems. When they are, in fact, the problem.
That’s why EA behaves like a cult.
The whole point of it is to control info & resources. It will NEVER change. 🫥

If you want to know more about how to actually break cycles of short-term decisionmaking in a political structure, the answer isn’t philanthropy.
Read The Dictator’s Handbook. Philanthropy is the worst possible answer to dysfunctional political systems.

Hopefully this thread makes sense. I am hella caffeinated rn, & one of the key control tactics for the cult I grew up in was banning coffee to keep people tired & dependent on the org for logistical support
my liver still doesn’t know what to do with caffeine half the time lmao
Just a reminder that Twitter still doesn’t allow reading replies to a tweet, or earlier tweets in a thread, without logging in. If there’s additional context beyond the quote that is actually interesting, it’ll get missed.
are there still any good twitter mirrors that capture the whole thread, or has musky boy killed all of them? if they’re all dead, our best bet might just be to attach a couple screenshots to threads here about Twitter content
Nitter mirrors are generally back up: https://nitter.fdn.fr/SarahTaber_bww/status/1617194799261487108#m
I’ve no idea what mirroring tools still work, or how long they will keep doing so if they currently do. Maybe the simplest advice is “screenshot and/or Ctrl-C, Ctrl-V”. At this point, I trust our lovably janky little server to be more reliable than Twitter…
linked EA forum thread: https://forum.effectivealtruism.org/posts/54vAiSFkYszTWWWv4/doing-ea-better-1
For me at least, that cuts off after “hijack it to serve leadership’s financial interests”.
dammit archive.today