r/SneerClub archives
Yudkowsky is still wound up about the nuke thing, and would like to reemphasize that we're all going to die (https://www.reddit.com/r/SneerClub/comments/12g9eor/yudkowsky_is_still_wound_up_about_the_nuke_thing/)

Yudkowsky has accepted that we’re all going to die at the hands of Terminator robots, but he will not accept being misunderstood, and so he has come to LessWrong to elaborate upon his recent Time article.

Yudkowsky’s Time article addendum 1 (kind of boring)

Yudkowsky’s Time article addendum 2 (less boring, talks about nukes and death)

For the last time: he’s not saying we should use nukes, he’s saying that we should be okay with getting nuked. Get it right.

It’s true that his logic technically follows. By virtue of basic arithmetic, half of humanity getting nuked is preferable to all of humanity getting pulped by Terminator robots. But nobody is going to get pulped by Terminator robots, so this is still the ranting of a madman and it is appropriate to mock it without belaboring the details of the matter.

However, pointing out that Yudkowsky is wrong in the comments section is verboten: the mods have chimed in in the comments to reiterate that dissent on basic matters regarding the robot apocalypse is not allowed except in designated areas.

The rationale that they give for this is that they expect people to be very familiar with the prior work on a topic before trying to engage with it.

Oops, I forgot to include this quote from Yud’s second addendum in which he explains what he thinks about how realistic his policy proposals are:

I do not expect that policy proposal to be adopted […] This is not what I expect to happen, now that we’ve been reduced to this last resort. I expect that we all die.

I wonder how long it’s going to be before someone takes him at his word and koolaids their way out of the problem for good.

> I wonder how long it's going to be before someone takes him at his word and koolaids their way out of the problem for good.

I've been disturbed by this because I think it's entirely predictable, especially when combined with his "dying with dignity" rhetoric. At first it was hard for me to imagine anyone taking him seriously, but seeing the reactions people have been having to this stuff, it does seem like it's more a matter of "when" rather than "if". I'm past the point of being disturbed though. Yudkowsky is culpable in the sense that he should know better than to say things like that, but everyone else is culpable too because they should know better than to listen to him or to amplify him. Being disturbed by a mass hysteria feels as pointless as getting angry at the ocean.
> but everyone else is culpable too because they should know better than to listen to him or to amplify him.

Honestly, the person, or group of people, I feel most comfortable assigning direct blame to for this whole thing is journalists like Time's that platform Yud and treat him as a serious intellectual. Yud himself, and his adherents, are doing a cult dynamic, and we really don't have (AIUI, I am not a psychologist) a good set of tools for preventing or resolving those. But we can damn well say that "give the cult leader a giant megaphone and treat him as a subject matter expert" is not a smart thing to do.
It is unfortunately consistent with journalistic trends of treating every crank and whack job^1 as someone who must be seriously considered and debated, instead of being properly dismissed as the contemptible freaks they actually are. It's the same instinct that resulted in 60 Minutes *interviewing* Marjorie Taylor Greene rather than taking the correct approach of basically saying "look at this psychotic bitch, what the fuck is wrong with her and her voters?"

1 - As long as they're right wing.
This right here is why comedy-news shows like Stephen Colbert's and Trevor Noah's were so useful during the Trump years -- they didn't treat him like a serious, non-narcissist politician the way a lot of the traditional news outlets did, while still keeping people informed about the latest bs he was doing so people could figure out next steps properly.
He sounds like Jim Jones… these folks don't realize that they are in a techno-apocalyptic cult.
> especially when combined with his "dying with dignity" rhetoric.

Wait. Wait. Wasn't the *entire point* of his Harry Potter fanfiction that he found such an idea to be the ultimate in moral repugnance? That literally any alternative is better? What changed?
Have they officially reached their doomsday-cult moment, where the predicted apocalypse doesn't happen and then they all make up excuses as to why? Those usually include:

1) We prevented the apocalypse by our faith

2) That guy wasn't a real prophet

3) The human analysis had an error, and now we have the new corrected date

4) We have passed a test of our faith and will be rewarded for it

https://slate.com/technology/2011/05/apocalypse-2011-what-happens-to-a-doomsday-cult-when-the-world-doesn-t-end.html
I'm really curious how Yud will react a few decades from now when his AI apocalypse just ... doesn't happen. GPT-4 doesn't take over the world, no-one's been consumed for raw materials by diamondoid mini-terminators, things just ... keep going. Given just how invested the dude is in his crankery, how will he even react to that?
*taps forehead* Can't be tortured for all eternity if your brain state is irretrievable to the alien God
Don’t tell him that we will all die eventually…
Sounds like something a deathist would say smh

This, not because I dislike this ultimate last resort… though it is horrible… but because I don’t expect we actually have that resort. This is not what I expect to happen, now that we’ve been reduced to this last resort. I expect that we all die. That is why I tried so hard to have things not end up here.

So he's thoroughly a failure in his fantasy-world struggle. The future robot god, not Yud, has won their acausal single combat for the future of humanity. It would be an otherwise reasonable expectation that he'd finally, then, just shut the fuck up about AI safety. It's too late; close up and turn out the lights. But this is Yudkowsky. I'm sure he's just getting started in his campaign of clout-chasing.

I mean, clout chasing is preferable to him becoming the leader of a suicide cult, I'll be honest, and I fear that's the direction he might be going in.
A suicide cult based on Pascal’s wager for nerds no less. What a time to be alive.
There’s also the worse possibility that he clout chases for now and then converts over to a suicide cult later rather than sooner, killing more in the process.
Really hard to say at this point. Typically a doomsday/suicide cult makes some kind of promise of salvation for true believers, which Yud hasn't made, at least not yet. I imagine, as the worst outcome, true believers radicalize into stochastic neo-Luddist violence focused on AI development in general, not just large GPU clusters. Closer to ELF-style "monkeywrenching" of AI facilities than a cult.
I mean, the Jehovah's Witnesses manage, and most of them literally can't go to heaven cos there's more than 144,000 of them or whatever. Edit: And actually, now that I think about it, he *kinda* has. In that if they succeed at their neo-Luddist terrorism campaign, they can wait for Yud to solve the alignment problem, and *then* make their AI. In that case, since they served glorious Yud, they get 2^32 copies of their brain cultivated and treated to infinite pleasure in order to provide acausal incentive.
JW has a very exclusive doctrine of salvation, yeah, but a doctrine of salvation nevertheless. Apart from giving Yud international influence above and beyond nuclear disarmament or climate change mitigation, which even Yud knows won't happen, there's no doctrine of salvation for anyone, at least at this point. That may change as time goes on and the prophecy grows old.
Their AI god is their salvation/reward.
> Typically a doomsday/suicide cult makes some kind of promise of salvation for true-believers which Yud hasn't made, at least yet.

The promise that they're the most important people in human history has a lot of draw even without the promise of an afterlife. Besides that, it seems to me that a lot of the ethical-AI + simulation stuff has implications for a lot of these people that amount to an expectation of an afterlife.

Citing Robert Heinlein as the source for your political philosophy – never a good sign.

“We’re all going to die”

Promise?

I mean given the nature of the human body it’d be extremely strange if it was false
Maybe for you. Seeing as we are expected to believe that the Terminator franchise is prophetic it stands to reason that, in a final fit of hypocrisy and hubris, Yudkowsky will cheat death by >!merging himself with the AGI that ultimately kills off the rest of us!< (Terminator Genisys)
https://m.youtube.com/watch?v=9JfnFXdkSTI
My god it fits right down to the obsession with nanotechnology.
Hey, don't insult Deus Ex like that. Unlike Yud's fictional writings on AI, that game is actually *good*. ("Electronic old men, ruling the world. A new age!" "Private citizens should have weapons to defend themselves. Take my sawed-off shotgun." "Yeah, some of my flights were in Area 51. I didn't see any aliens, but what I want to know is why they keep laying more fiber optic cable." \*The bum screams at you\* "Want to know why? It's all in the numbers. Number one: That's terror. Number two: that's terror." "A silent takedown is the most effective way to eliminate resistance. Just in case though, remember that we're police." "JC Denton, in da fresh (/in the flesh)." And the Mole People quest: "Get back! I've got a bomb! Now give me some drugs." "The city doesn't know that we have water, and they don't need to know. So go to the water valve and turn it back on. You'll need some explosives." "Want to buy some Zyme? Vials are 250.")

Part of this is that (spoilers) the techno-cult that seeks to rule the world, and controls its members with quasi-mystical indoctrination, is the *enemy* faction. "Once my own augmentations are complete, and ready to merge with Helios, I will burn like the brightest star!"

"The Doctrine of the Mighty. It is a commonly held precept that two are stronger than one, and that four are stronger than two, and that sixteen are by far stronger than four; with this, there can be no argument. This then is the true calling of those who would be mighty and join They Who Rule the World in Majesty: to shun all that is empty fame and glory; to eliminate weak thoughts, weak hands, and weak ideas; to give up vain individuality and instead become part of something that is glorious and strong. This is the First Secret, that by surrendering that part of you that is the least, you are elevated to the Most. The First Secret shall set you free, and those who know their duty will find in it the keys to immortality." "The Second Secret can be explained to all, but truly understood only by those who have submitted to the first, Body and Soul..." (/nerd-out)
You also missed the best quote: Ivan: "I SPEEEL my DREENK"
What a shame.
It's incredible, and uses the sci-fi elements of the setting to crystallize and examine some really interesting questions about society and technology and power and humanity. As opposed to Yud who doesn't want to see past "but the torment nexus!"
It also touches on some more recent phenomena outside of the LW-sphere. The director of FEMA: "This plague ... the rioting has intensified to the point where we may no longer be able to contain it." Bob Page: "Why contain it? It's cool."

Okay so all this shit… it’s basically sci-fi. Right??

Like the reasoning is on the level of the coarsest and poorest sci-fi??

Even better-done sci-fi kind of makes a mockery of e/acc people. Consider Accelerando by Stross. The conclusion is that superintelligent AI civilizations become inward-focused and die a stagnant death. Basically no entity wants to be far away from the core lest they lose the latency advantage, and there's basically no advantage or purpose to exploring the universe.

Ok, back to Big Yud's dilemma… he's worried that we can't get AI to prioritize human life. Okay. Well, guess what, buddy: we can't even get capitalism and humans to prioritize healthy human life. We are doing a number on the planet without needing AI.

So isn't his criticism really a veiled critique of capitalism? Or is that getting a bit too radical for these people??

> So isn't his criticism really a veiled critique of capitalism?

Yes, but not in the way that you think! His criticism of capitalism is that *he's not the one with all the capital*. His Time magazine piece is completely explicit about this: his solution to the supposed problem of "AI alignment" is for him to personally dictate what other people are allowed to do with economic resources, and to kill as many people as it takes to guarantee this outcome.
Well, that must be a solution that has never been tried, or one that has literally no downsides. Surely Big Yud wouldn't be suggesting something crazy like making himself world dictator....

Deviating from the approved scriptures and not demonstrating proper obeisance to the patron saints will get you kicked out of most cults.