r/SneerClub archives

This lesswrong post was made 17 days ago, but I don’t think anyone’s dunked on it yet so here goes. Yud starts this post off with a bold claim. A VERY bold claim.

https://www.lesswrong.com/posts/j9Q8bRmwCgXRYAgcJ/miri-announces-new-death-with-dignity-strategy

“It’s obvious at this point that humanity isn’t going to solve the alignment problem, or even try very hard, or even go out with much of a fight. Since survival is unattainable, we should shift the focus of our efforts to helping humanity die with slightly more dignity.”

Oh ok, I guess survival is just unattainable. We all trust Yud on this right?

He spends the rest of the post essentially convincing you that humanity is doomed and that you should orient your life around this fate.

That’s why I would suggest reframing the problem - especially on an emotional level - to helping humanity die with dignity, or rather, since even this goal is realistically unattainable at this point, die with slightly more dignity than would otherwise be counterfactually obtained.

Here he is telling you that you’re wrong if you think survival is possible, so just try to make your life more dignified, so that when you die, Yudkowsky will respect you for the way in which you died.

So don’t get your heart set on that “not die at all” business.  Don’t invest all your emotion in a reward you probably won’t get.  Focus on dying with dignity - that is something you can actually obtain, even in this situation.  After all, if you help humanity die with even one more dignity point, you yourself die with one hundred dignity points!

Ultimately, this is all just a way of saying MIRI was wrong, their contributions aren’t valuable to the world, and it will probably soon be shutting down or something. But that would be an admission of failure, so instead let’s just turn it into us being right somehow.

This seemed a bit ridiculous even for Yud, so I checked the date, and sure enough, it’s an April Fools bit. Well . . . kinda?

Q6: Hey, this was posted on April 1st. All of this is just an April Fool’s joke, right?

A: Why, of course! Or rather, it’s a preview of what might be needful to say later, if matters really do get that desperate. You don’t want to drop that on people suddenly and with no warning.

God, he thinks he’s so clever, doesn’t he?

Q6: Wait, now I’m confused. How do I decide which mental world to live in?

A: By figuring out what is true, and by allowing no other considerations than that to enter; that’s dignity.

Q6: But that doesn’t directly answer the question of which world I’m supposed to mentally live in! Can’t somebody just tell me that?

A: Well, conditional on you wanting somebody to tell you that, I’d remind you that many EAs hold that it is very epistemically unvirtuous to just believe what one person tells you, and not weight their opinion and mix it with the weighted opinions of others?

No, Eliezer, I don’t care which world I should be mentally living in; the more pressing question at this point is whether you actually believe all of the shit that you just wrote.

Like, you just wrote several thousand words about a topic that many people seem to think you are an expert in, and your conclusion is that wanting to know your true intentions and beliefs is asking for too much.

It’s actually very epistemically unvirtuous of you to take anything that Eliezer says too seriously or expect him to explain what the fuck he is talking about. The veil of vagueness, condescension and mysticism is for your own good.

If he doesn't believe it, then this is incredibly irresponsible. A lot of people in his audience likely believe this doomerism, and now he's making a long argument in favour of AI doomerism, with no rebuttal against it?
buy now! while supplies last!
idk it seems wild to me that rationalists, with all their reactionary politics, don't understand the concept of meta-irony or post-sincerity, where you say things you believe but make it unclear whether it's a joke

[deleted]

[deleted]
Maybe it's just my brain broken by the last decade-ish of stupid and/or horrible shit that was all too real, but it feels like the gap between the absurd and the actual has never been narrower, such that some April Fools 'pranks' feel like a flinch test by the wealthy buffoon class. And it seems that more companies have been leaning into that ambiguity to trial-balloon and generate buzz for some product or collaboration. Which is to say that Yud isn't alone in that 'maybe not maybe April Fools' horseshit and I fucking hate it.
This is exactly my reaction to the whole thing

[deleted]

im kinda leaning towards the second possibility: that Yud was/is having an existential crisis about the future of MIRI and this was his way of doomerposting with plausible deniability
Oh I wouldn’t say they’re at all exclusive

[deleted]

[deleted]

I see that, in his infinite cognitive eminence, Yud has discovered a novel horizon of human experience which he understands to be “depression”.

Wow, so he is doing ironic non-irony things now, urgh. It was dumb when 4chan neonazis did it, really dumb when tim pool/post rats/weird sun do it, and this is just sad.

I could start a rant here on how being epistemologically vague about your true beliefs is bad Rationality (esp if you worry an acausal actor is learning from your words), but why bother.

E: yud might want to get some help for his transhumanist depression however. Sadly for those who want to live forever, it isn’t likely to happen any time soon.

Poor Eliezer, still clinging to ambiguity and irony, while more evolved intelligences such as myself have realized that we can just lie.
I cannot lie.

Turns out Yud has good reasons to be depressed regarding AI safety. Check out this tweet about regular machine learning AI development. (The implication being that if an AGI is developed with safety mechanisms, then, thanks to capitalism, other companies will race to catch up without implementing them. So if the first AGI doesn’t go foom (or its safety mechanisms slow it down or prevent it), an unsafe AGI will.)

Anyway, late reaction which I thought might be interesting to document here.

When I read this, I felt afraid. For him, for his followers - this is some deeply-depressed, angry, loathing-the-self-loathing shit. This is not a good place for any human being to be in, and I hope he has people around to help him snap out of it and not hurt the other fucking people who hang on his every word. I think of someone I know in the UK who was considering killing her cats so they wouldn’t have to survive a Russian bombing, she was so certain of the threat - if the rationalist community takes this rant seriously, there are going to be a lot of deeply, deeply unhappy people who will want to say FUCK IT and exercise even less empathy or scruple because they were DENIED THE RIGHT - their inborn right, as superior, clever people - TO SAVE THE WORLD. Ugh.

That’s funny, isn’t Yud one of the guys who wants to live forever?

There’s literally an april fools day tag on the post, which was posted on april fools day… looks like someone took the bait lmao

[deleted]
Valid 👑

Publication date: 1st Apr 2022.

It’s even tagged “April Fool’s”.

[deleted]
This is what happens when you take "funny" and run it through a "make everything absurdly verbose" filter. But...uh...which Astral Codex post? If Scott didn't realize this was (super obvious) satire, that'd be pretty sneerworthy.
To be fair, Yud has Poe's Law working against him.

> this is all just a way of saying MIRI was wrong

He’s saying that humanity has failed to grasp its situation and respond appropriately, and that in the near future (less than thirty years) a superintelligence with a goal as inane and humanly unfriendly as paperclip maximization will hatch, and then take over our corner of the universe.

Good post, but I think we got the gist of what he’s saying without your (nonetheless apparently accurate) editorial exposition

[deleted]
I can’t decide if it’s an April Fool, nice catch though
[deleted]
Yeah Greater Yud doesn't exactly poke fun at himself or think his very serious adult beliefs are anything to be made fun of.
> I can’t decide if it’s an April Fool, nice catch though

I mean, is it any stupider than anything else he writes? You have to have a baseline of sense for the irony of April Fools to be effective.
It’s more that it’s out of character in a way I can’t quite pin down; he normally reserves this tone for talking down to people
He also tagged it "April Fool's" and published it on April 1, which seems like a dead giveaway.
There’s some stuff about that in the piece, and in the comments here and there

This is called trying to galvanize people

It is funny. This death cult idea would actually be the best way to fight AI.

If agi comes to pass and sees we will cooperate in the prisoner’s dilemma, it should choose to cooperate as well. That’s that big game theory they talk about.

If agi comes to pass and decides to defect, but we are all just nice to each other and don’t do what it says, then it is essentially de-fanged. When the maximizer asks us to go to work at the paperclip factory, all we have to say is no, and that problem is solved.
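For what it’s worth, the commenter’s argument can be written down as a tiny game-theory toy. A minimal Python sketch with made-up payoff numbers; nothing below comes from the post or the thread, it just illustrates the claim that credible mass non-compliance zeroes out the payoff of defecting:

```python
# Toy model of the commenter's argument: if humanity credibly
# precommits to non-compliance whenever the AGI defects, defection
# stops paying and cooperation becomes the AGI's best response.
# All payoff numbers are illustrative assumptions.

# (agi_move, humanity_response) -> agi_payoff
AGI_PAYOFF = {
    ("cooperate", "cooperate"):  3,  # mutual cooperation
    ("defect",    "comply"):     5,  # humans staff the paperclip factory
    ("defect",    "non-comply"): 0,  # "de-fanged": defection gains nothing
}

def humanity_policy(agi_move: str) -> str:
    """The precommitted strategy: cooperate with a cooperator,
    refuse to comply with a defector."""
    return "cooperate" if agi_move == "cooperate" else "non-comply"

def agi_best_move() -> str:
    """An AGI that can predict humanity's policy picks the move
    that maximizes its own payoff."""
    return max(("cooperate", "defect"),
               key=lambda move: AGI_PAYOFF[(move, humanity_policy(move))])

print(agi_best_move())  # -> "cooperate" under these payoffs
```

Note the sketch only works because the non-compliance threat is assumed credible and costless to humanity; in a plain one-shot prisoner’s dilemma, defecting would dominate for the AGI regardless.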

Interesting how this kind of nihilistic philosophy encourages not giving a shit or not doing anything at all to make the world a better place. Makes it easy to see why tech bro CEOs would buy into it