r/SneerClub archives
The "EA Forum content might be *declining* in quality" because: (https://pbs.twimg.com/media/Fc4ik37XgAA4FZ7?format=png&name=900x900)

Pretty obvious what’s going on.
EA is the rationalists' most successful attempt at selling themselves to progressive people.
Their Reddit is experiencing an influx of left-leaning people since MacAskill released his book and did his press tour.
Effective Altruism is now shocked that there are people asking for anti-globalist politics, workers' rights, and abortion rights.
Those new people, you know, thought it was about helping people, and actually believed that.

(I’m also laughing at this MacAskill hagiography https://www.newyorker.com/magazine/2022/08/15/the-reluctant-prophet-of-effective-altruism where he claims he was super-duper left wing and an ecologist, but he also liked emissions trading).

It's also about having new people in who don't use obscure jargon and haven't spent the last decade steeped in LessWrong posts. This is ironic, because one of the biggest sources of error in EA is all the dodgy assumptions and customs they smuggled in unexamined from the rationalist community.
It'd be cool if EA could be entirely co-opted by new users in that direction. I'm a college student and everyone I talk to about EA or from the EA club has that sort of vibe (not rationalist).
Idk, that direction sounds like effective altruism without any of the effective altruism
So a straight improvement, then.
So, altruism that is effective in something other than the name?
Not sure this is true tbh. Didn't they have stats that the average EA person was turning more and more to 'we must stop AGI' and other weird projects, and that the support for stopping poverty stayed at the same level (still one of the highest)? I recall a prev sneerpost linking to internal EA research about that (which iirc only went up till 2020 btw). If more progressives joined, you would expect poverty etc worries to go up.

[And Thomas Kwa's post](https://forum.effectivealtruism.org/posts/7mTTzXutkgkzJuM3e/thomas-kwa-s-shortform?commentId=ALEFMbGpaWrYKCvBq), while being prime funny content, actually contains no facts and is just feelings about how the new college kids are wrong and aimless. It gets even funnier: he is complaining about the bad takes of people who have only been on the forum for less than a year. His account's creation date? 2020. And he has written posts like 'the case for infant outreach' and 'How dath ilan coordinates around solving AI alignment'.
Here's the [2020 survey](https://forum.effectivealtruism.org/posts/83tEL2sHDTiWR6nwo/ea-survey-2020-cause-prioritization) that was linked last time. The key part is the engagement vs cause graph. The super active members are getting aboard the AI train at an increasing rate, while the casual members are still in the global poverty camp, presumably only popping in every now and then to check up on what GiveWell recommends or whatever. The recent publicity around EA seems to be coming from MacAskill's book and its associated news; it'll be interesting to see where the newcomers end up.
Thanks for looking up the link! And yeah, it will be interesting; wonder if they will continue the surveys. (Esp with people like [Scott Alex pushing](https://astralcodexten.substack.com/p/criticism-of-criticism-of-criticism) for 'we EA are actually way too critical of ourselves, we should look inward less'.)
I dunno. It was started by rationalists, so to me, from the outside looking in, the move towards longtermism doesn't look like an actual change, rather the implicit agenda coming out. And the foreign aid stuff is still focused on neoliberal technocratic principles: cost-effectiveness, cutting overhead, laying off employees and flooding the markets with vitamin A pills, malaria nets, school textbooks, deworming pills or what have you. On top of that there are guys like Scott Alexander, who thought Congo under the Belgian government was kind of okay.

Like, I argue a lot that EA is just LessWrong, and I get these counterarguments... But they look like re-affirmation to me. Maybe that's my problem tho, idk what would convince me of the opposite.

>It gets even funnier, he is complaining about the bad takes of people who have only been on the forum for less than a year. His account's creation date? 2020.
>
>And he has written posts like 'the case for infant outreach' and 'How dath ilan coordinates around solving AI alignment'.

Okay that is hilarious.

>If more progressives joined, you would expect poverty etc worries to go up.

And okay that is very interesting.

EA Forum is turning into a place primarily optimized for people to feel welcome and talk about EA…

The horror.

[deleted]

Every forum that continues to grow will hit its Eternal September eventually.
Effective September.

If they ban discussions of EA on the EA forum, I may have to join just to be in the one place where I know they will not be talking about it

“The economy of takes.”

we're losing our competitive edge on takes to the chinese
Mr. President, we must not allow a hot takes gap!
I just shorted the EA take market. When they tank I'll have massive intellectual profit.
I find it endlessly amusing that they’re now unironically using nomenclature that Left Twitter would have made up to dunk on them
I do like to refer to the general atmosphere on Twitter as the "hot take industrial complex"

As seen on Emile Torres’s Twitter here

2 astronauts.jpg

Saw this on twitter and came here immediately lol