I’m a high school senior about to decide what direction I want to take my career, and I recently discovered Effective Altruism and 80,000 Hours. It’s actually been pretty helpful, and I like the idea of going into a field where I could maximize the positive impact I could have. The EA movement seemed great at calling out the hypocrisy of spending huge amounts of effort on certain issues that are salient but not as important, while ignoring issues that are important but far away.
I had no idea that EA was related to LessWrong and the rationalist stuff. I’m not too terminally online, but I know a bit about the rationalist community (Roko’s Basilisk, racism, and the like). I’m honestly kinda disappointed.
I’m a lefty, so I always thought that the billionaire-worshipping and lack of criticism of capitalism from people in the EA movement was odd. It’s also really confusing to me that a movement based on weighing impacts impartially and helping the developing world could have such an overlap with race-realists and tradcons.
Basically, what the fuck?
Edit: also there’s this lol
[deleted]
I would say that despite the prevalence of bad ideas, effective altruism is by far the least bad thing we cover here. It does quite a lot of good by directing more funds to third-world poverty causes in ways that tend to avoid the counter-productive elements of other charities. Unfortunately, thanks to the rationalists, it also results in a ton of money being thrown into glorified tech think tanks due to overblown AI fears powered by groupthink and a chain of dodgy assumptions. I feel like these people would have blown their money on something else anyway, though.
I don’t believe there’s anyone out there who disagrees with the notion that we should be effective in how we use our available resources to benefit others. The worst of it is how much effort is put into galaxy-brain takes which lead so-called effective altruists to conclusions that are anything but the stated goal of the movement, such that mosquito nets to fight malaria lose out to quasi-science-fiction doomsaying about the ever-imminent rise of a malevolent AI god. I can’t say whether the worst of it is a ‘loud fringe’ sort of situation, but what I do come across in my limited slice of the internet is more often reflections on hypotheticals of existential risk and future persons than development of actionable solutions to concrete harms for living persons.
I have no particular view on EA, but tangentially, this is how I would have gotten the most out of college if I could do it over again:
People fucking love to give college students advice, and most of it is either received wisdom or has more to do with the advice-giver themselves. You’re matriculating at a particularly strange moment in history, and the received wisdom will be even less relevant than usual. Take all advice that people give you about Your College Career (including mine) with a massive grain of salt. It’s your life and you’re the one who has to live in it.
Continue to avoid becoming Terminally Online.
College might be the last time (unless you go to graduate school) that you have easy access to a bunch of professors who kind of have to talk to you. Develop relationships with the ones who are working in areas that you might want to go into yourself.
Leave yourself open to discovering new things about yourself: you might run into unexpected challenges in areas that previously came “naturally” to you, or find a knack for things you thought you weren’t good at or had no prior interest in.
Most of the skills that have helped me in my working life (basic logical reasoning, public speaking, writing, editing, harder-to-summarize social skills) were developed as a byproduct of my high school and undergraduate academic work (and job experience, and just generally Being Alive); in retrospect, the specific content of that work was almost beside the point. Hone your self-presentation and the way you communicate your ideas, both in writing and in speech.
This is the one I’m having the hardest time articulating concisely, but I’ll try to sum it up as “find a weird niche”. Look at the people around you who seem very certain of their future paths in life (particularly if those are pretty traditional/accepted paths: med school, law school, management consulting, etc.) and try to distinguish yourself from your own “herd” in some way. Groups like “pre-law” will tend to have certain gaps in their knowledge and skill set, on average. You can get a lot of mileage out of being “the pre-law person who also knows a lot about data visualization” or “the poli sci major with an intimate knowledge of agricultural tech” or whatever. Knowing another language can be extremely helpful and is more difficult to pick up later.
Other social spheres are essential: e.g. “Model UN nerd who only knows other Model UN nerds” is not a life I would wish upon anyone. Spend time getting to know people outside your major and your main extracurriculars. If your ultimate goal is to work at a nonprofit, it’ll help to have plenty of experience talking to different types of people, understanding their values and needs, and getting them to like and trust you.
I absolutely did not follow this advice at your age, but: don’t drink too much, don’t do too many drugs, and get a good night’s sleep as often as you can.
This is a problem I’ve been struggling with as well. I think there’s enough good stuff in EA that it’s worth participating in, while being open about your reservations with some of the associated fringe weirdness and trying to shift the center of gravity towards global poverty/animal rights and away from AI Woo etc. Also, try to bring in squishier factors when people start doing utilitarian calculations: “Will that course of action change you as a person? Would it work if we all did that? Where do you think the money ultimately comes from?” You don’t actually have to argue for virtue ethics or deontology or anything, because those factors can all be considered in utilitarian terms as long as you’re not afraid of considering ideas that don’t come with numbers attached.
The kids are gonna be alright, imho.
Is effective altruism that shit Peter Singer (?) was on about back in the day? Not to malign the guy because it may not have been him. Where it’s like:
If that’s the one, I don’t find it as disgusting as certain rationalist undercurrents. Think neoliberalism vs. fascism.
How Effective Altruism works in practice:
yeah, sounds obvious and sensible
this is pretty sound reasoning actually, I can get behind this
wait what
what the
fuckin
“Effective Altruism is mostly about giving money to Yudkowsky” “Yes but we can’t actually say that out loud”
If you’ve already decided you want to be altruistic, effective altruism seems like an effective way of going about it, as long as you steer clear of the kind of people who are naturally attracted to an ethical system that values the lack of normal human empathy.
But planning your entire career around it, i.e. making as much money as possible so you can donate more, seems like a bridge too far. I think that whole branch of EA exists mainly to make overpaid brogrammers retrospectively feel like they’re saving the world by becoming filthy rich, the latter part of which they were already trying to do anyway, now that their day jobs no longer give them that feeling. Frankly I think the higher you try to raise your income, the less possible it is to do so ethically, and you wouldn’t e.g. rob an orphanage in Switzerland to give 10% of the take to orphanages in Burundi.
Follow your passions and skills, not just lucre, and donate whatever you can spare plus a kidney.
FWIW I think it’s fairer to say EA came from Peter Singer, who is at least an actual philosopher, and the ratsphere merely rediscovered and colonized it. I suspect he doesn’t know them and wouldn’t like them. Of course there are things about him we can sneer at too.
ascended braingod answer: utilitarianism is the infernal logic of capitalism and anything that preserves utilitarianism justifies capitalism so yes
meme answer: yes yud sux lol
actual non-nuanced answer: just don’t be racist and ur fine
nuanced answer: ur fine, but it’s worth considering why/how an ostensibly selfless, charitable, altruistic movement breeds so many repugnant ideas. is it just how the movement’s managed? are you just seeing a gross loud very online contingent? or does EA somehow justify/lead to those ideas? possibly, possibly not. it’s hard to argue against the immediate benefits of unconditional cash transfers. but again… it’s genuinely really hard to say and you should think about it before you get in too deep.
Here’s an idea, if you haven’t heard this one already. Once you graduate from college, give this a try for a few months: https://labornotes.org/2014/02/organizers-worth-their-salt
I think one of the most valuable skills a lefty can develop is the ability to talk to strangers across class and race lines. And it’s important to get out of your bubble as much as possible. Plus, you’ll get the satisfaction of directly attacking a structural problem.
By and large, it is a way for bourgeois individuals to soothe their egos about choosing to prioritize the accumulation of wealth over personal sacrifice in the cause of social change.
The real problem is that there is no valid metric of morality or good. This is problematic because Singer wants to base Effective Altruism on utilitarian ethics, which requires quantifying and comparing outcomes. Kant, in contrast, asserts that ethical decisions are purely binary.
The fact that Effective Altruism lacks any structural or systemic critique should be a big red flag. What I’m trying to say is that donating 10–25% of your income to charitable causes doesn’t make you a saint. At the same time, there are more ways to do good than organizing a protracted people’s war to foment revolution.
You don’t need a rational framework to be a decent moral person.
Here’s an exercise for the would-be effective altruist.
Read this essay by William MacAskill, one of the founders and big names of EA, purporting to debunk criticisms of Mark Zuckerberg’s philanthropy LLC.
Read the links MacAskill purports to debunk, and compare them to what he says in his essay. Ask yourself: What are the main points of each of these links? How do they compare to MacAskill’s summaries? If you are looking to engage with quality arguments, why would you look at a random four-sentence post on the subreddit r/self? What reason might MacAskill have to use shoddy argumentation to talk up what a good person Zuckerberg is?
Read some news articles about what Zuckerberg and Facebook have been up to in the half-decade since, and compare to what MacAskill says about Zuckerberg in this essay.
[deleted]
I have nothing constructive to add or advice to give, but I want to ask what your opinion is of this
Specifically the section about the hedonic rat farm
The EA stuff and the stuff talked about here are pretty far apart imho. Sure, there are some cray folks (every org has them), and MIRI (or was it renamed recently?) is a bit weird, but as far as I know there are no EA people advocating for the 14 words so we can save the planet.
Extreme EA is silly or weird. Extreme ‘rationalism’ (I’m assuming rationalists would deny this is part of it) leads to neo-Nazis, HBD, NRx, PUAs, etc.
The ideas behind EA aren’t that bad, even if they aren’t that effective.
E: and don’t forget sneerclub is basically about the online presence of all these things; offline it’s probably all different. Offline it’s also a lot easier to determine whether somebody is actually racist/sexist or just made a stupid remark, and to see whether people actually understand what you mean (via tone of voice, body language, etc.), especially since people tend not to put much emotion into serious text :shrug emoticon: (interactions between non-neurotypicals and neurotypicals make this a little harder, of course; it also takes up more time, and is more one-on-one).
EA is a chimera, just one more fruitless attempt to quantify the unquantifiable. It would be harmlessly stupid (albeit annoyingly presumptuous—as if they were the first people to ever think of applying reason and evidence to altruism!) if it were confined to the clique of navel-gazing philosophers who launched it in what one suspects was a half-baked effort to morally justify their own lives of cushy irrelevance (“What if the most important work of all were puzzling out what the most important work is?!”).
Instead, it’s become downright pernicious thanks to the efforts of cretins like Yudkowsky and Bostrom, who’ve recalibrated that self-justifying navel-gazing to their puerile interests, come up with “The most important way for humanity to spend its resources is a preemptive strike against Skynet,” and actually managed to persuade a bunch of gormless zillionaires to cough up millions for that notional purpose. (And meanwhile those selfsame zillionaires move heaven and earth to shelter the far greater remainders of their fortunes from taxes, and Yudkowsky et al. just go on whistling Dixie, and the tide rises, and the rainforests burn…)
[deleted]
EAs most often tend to be moderately left-wing, except for the fact that they love billionaires.
You have an overlap between EA and LessWrong, and an overlap between LessWrong and SSC, and an overlap between SSC and race realists, but in the end it really doesn’t add up to an overlap between EA and race realists.
If you engage with the EA community, you will run into people willing to entertain Roko’s Basilisk. You will most likely not find any endorsement of racism. You will find Silicon Valley libertarians telling you that the most effective thing to do is to develop new cryptocurrencies. You will most likely not find any tradcons.
RemindMe! 2 days
ea is a career? how does that work?