r/SneerClub archives

It seems to me that arguments against longtermism (and it bothers me that “caring about what happens in the future” now needs to be qualified with “but not in that way”) can be boiled down to two words: utility monster.

Most of LessWrong just boils down to nerds discovering Ethics 101 critiques of utilitarianism and panicking
"what if utility monsters were real? well, I mean, utility monsters *will someday become real*. then it's our ethical duty to work as hard as we can right now to ~~invent~~ pre-emptively serve them"
Oh no, Roko's utility basilisk. D:
time to put on my big hat and become a utility monster hunter
LessWrongers are the kind of mfs to learn about the repugnant conclusion and decide that their life goal is to bring it to reality
I mean we’re just so far past that event horizon, culturally, there’s no going back for common sense
There is also a certain overlap with the Supreme Court's recent take-backsie on abortion. Longtermism is not about preventing some harm that future people might come to; it is not even about ensuring that they won't get dust specks or whatever in their eyes. It is about ensuring that they exist at all, which is only the same thing if you also think that abortion and contraception are sins. In that regard, it is even worse than how Stalin justified his gulags: the existence of future generations was taken as a given then, but their quality of life was to be improved by any means necessary. Now it's all about the existence itself. It is an important enough distinction. For longtermism, the well-being of humans does not matter at all; the 10^50 future people will be in a utopia simply because they are in the future. All we have to care about is maximizing the probability that there are 10^50 people rather than 0 people.

[removed]

There's no type of incompetence more dangerous than the kind masquerading as expert advice. I feel like a lot of these guys wrote so much Foundation fanfiction that they became convinced they were literally Hari Seldon.
why the hell was the above comment removed?

Effective Altruism: telling rich people that getting richer is, effectively, altruism.

As always, longtermism is the kind of argument that’s easy to float if you are absolutely certain that no matter what, nobody you know or care about will die due to the issues that you think we should be ignoring today.

Also, please ignore how many effective altruists have their lives funded by the “charities” that do “research”. There is no comfortable upper middle class lifestyle behind that curtain!

Yeah, and let's only think about ourselves, and our own countries, and our own political tribes, and our own races and genders... I mean, who cares about future people? What have they ever done for us??
A boring, tired assertion that opposition to longtermism is tribalism and hostility toward all forms of aid and forward thinking.
Ok, fair enough. So I take it that you oppose longtermism. I'm trying to figure out why people are really so opposed to the idea of longtermism (as laid out, e.g., in Will MacAskill's "What We Owe the Future"). What I've come up with so far from reading some Reddit comments are the following kinds of things:

1. A misunderstanding of what longtermism advocates are talking about.

2. Some kind of ad hominem attack, e.g., "Sam Bankman-Fried liked that idea, and now he is disgraced, so the idea must be wrong", or "Rich people use that idea to justify being so rich, so the idea must be wrong", or "A lot of Effective Altruists think this idea is important and they are all sci-fi loving geeks, so the idea must be wrong", etc.

3. Some kind of straw man thing like the OP posted, e.g., "longtermism advocates don't care about the suffering of living people right now" (whereas again, that is a misunderstanding of what longtermism advocates are talking about... but yeah, granted, my message above was a jokey straw man kind of thing as well haha).

4. An argument from incredulity fallacy, e.g., "longtermism advocates worry about things like Artificial General Intelligence being an existential risk for humanity and that is just crazy and/or stupid".

So what is your reason for opposing longtermism? Does it fit into one of the 4 categories above? Are there more categories of opposition I should be aware of?
This is not a debate sub so I won't "prove it", but if you'd posted here when the thread was fresh you probably would have gotten some effortpost replies. Simply put, the criticisms fall into your 3, "longtermism is deliberately ineffective at addressing the failures of society", and 4, "AGI X-risk is a fiction and a grift".
Hey, thank you for responding u/sexylaboratories 🙂. Yeah, I'm not really trying to debate anything; I am sincerely just interested to know why there seem to be so many people opposed to the idea of longtermism, and I'm trying to figure out why. In terms of posting earlier, sorry about that, I'm really new on Reddit and still very much trying to figure out how it all works.

I guess you'll probably just ignore me now, but I'm still puzzled how people think that "longtermism is deliberately ineffective at addressing the failures of society". Why would people like Nick Bostrom, Will MacAskill, or Toby Ord be interested in putting forward something like that? Why would anyone think that they would want to do that? I get why a lot of people might think that "AGI X-risk is a fiction and a grift", but in my view they will soon change their minds on that as the dangers start to show themselves more visibly, or as the reasoning around the concern becomes more widely talked about (although, yeah, people have been talking about it for a long time, e.g. here: [https://www.youtube.com/watch?v=8nt3edWLgIg](https://www.youtube.com/watch?v=8nt3edWLgIg)).

This article about MacAskill's book also claims that the climate scientists MacAskill contacted, in his attempt to show climate change isn't anything to worry about, had never heard of him: https://thebulletin.org/2022/11/what-longtermism-gets-wrong-about-climate-change/

Pretty damning!