r/SneerClub archives

One of my biggest takeaways here is that we need to know more.

Article would be significantly improved if this was the only sentence.

and a tautological, fatuous sentence it is, indeed

It’s the EA vertical, presumably SBF’s money is actually still in there

should call the ch 11 debtors about clawing it back

I do think Vox should disclose that she has close ties to people she is covering in her writing? Like that’s pretty standard journalistic procedure, I thought?

my preferred phrasing would differ from hers I am sure, but like, “eliezer and I are deep in the same weird cult” is one option, or just “I am close friends with yudkowsky, and interact with him socially, as well as being part of the less wrong community” would probably be standard

I think normie professional culture has a hard time wrapping its head around this thing. It's so weird and alarming that they don't know what to do with it. I've had some version of the following conversation on multiple occasions:

> **Me:** SBF is a good example of why effective altruism is bad
>
> **Them:** You're mistaken; Kelsey Piper published a chat log in which SBF admitted that he doesn't really believe in EA and that he's actually just motivated by pure evil
>
> **Me:** You should be skeptical of that because she's part of the same weird cult. They probably contrived that chat log together in order to salvage EA's reputation.
>
> **Them:** lolwut

and then they look at me like they're trying to figure out if i'm crazy.
Honestly the more reasonable read is that SBF is naive and she was like LOL he thinks we're friends, thanks for the scoop.
I can't rule out that possibility, and it might be true, but I disagree that it's obviously the more reasonable read of the situation. I find it almost impossible to believe that Kelsey Piper's connection with the cult was not a motivating factor in her being the only person to publish a scoop that attempts to salvage the reputation of that cult.
I don't think it really salvages the reputation of the cult.
Sure, if you know many things about it then that's true. If you're a normie who only knows two things - "effective altruism" has the word "altruism" in it, and also SBF is an adherent of EA - then I think it's a different story. The EA folks, and especially someone like Kelsey Piper, are extremely image conscious and I have no doubt that they think in these terms.
I mean the first thing I'd think is, "How many of these other EA folks are also image-conscious grifters like SBF?"
Yeah I know, I agree, but I've had people actually say this stuff to me - "well maybe this doesn't reflect on EA because he was just pretending". Most folks don't think especially deeply about it.
[deleted]
Oh yeah I forgot about that! A few days later Scott Alexander [wrote about SBF's medication regime](https://astralcodexten.substack.com/p/the-psychopharmacology-of-the-ftx), trying to (implausibly, in my opinion) claim that his bizarre behavior wasn't at least partially attributable to chronic stimulant abuse. Scott Alexander also weirdly/implausibly tried to distance himself from SBF's psychiatrist, who he knows, and within the same time frame that same psychiatrist also gave [a weird interview to the NYTimes](https://www.nytimes.com/2022/11/15/technology/ftx-sam-bankman-fried-psychiatrist.html) trying to defuse the "everyone was cracked out on adderall that I gave them" angle. It seems so obvious that these people were talking to each other and coordinating on damage control.
[deleted]
[you can read a bit here I guess?](https://www.facebook.com/509414227/posts/pfbid021PdyxtBEvMV2X5TQDJd8sDiGWLbxojcQQNhHhT8pHmdMS8Ezri6fBwSyjiL6YN3Ql/)
[deleted]
Obviously they needed to open a prediction market on whether the vow would be properly kept, so her ex could buy shares against her!
Reading this… it seems Eliezer was a vow witness/potential vow arbiter, and was also using her matchmaking services? That seems rather obviously ethically fraught, but I suppose by now I should know better than to expect even basic conflict of interest avoidance…
> I knew Kelsey Piper swam in rat circles

She's so close that SBF thought he was off the record with her in a private Twitter DM.
> > > eliezer and I are deep in the same weird cult

Y I K E S that stuff is nigh indescribably nuts

Fucking EA man. Fucking goddamn EA.

> The million-dollar question, then, is how AI could wipe us out, if even a nuclear war or a massive pandemic or substantial global temperature change wouldn’t do it. But even if humanity is pretty tough, there are many other species on Earth that can tell you — or could have told you before they went extinct — that an intelligent civilization that doesn’t care about you can absolutely grind up your habitat for its highways (or the AI equivalent, maybe grinding up the whole biosphere to use for AI civilization projects).

This is literally the only argument she has for why AI is an existential threat… an imaginary sci-fi grey goo scenario, against the very real and plausible dangers of climate change and nuclear war.

These people are unhinged.

> there are many other species on Earth that can tell you — or could have told you before they went extinct — that an intelligent civilization that doesn’t care about you can absolutely grind up your habitat for its highways

[takes massive bong hit] yo dude... society...
I love how these dinguses never seem to grapple with the idea that the acausal robot god... kind of comes with an off switch.
It’s not even an off switch. It’s an incredibly sensitive and brittle “on” switch we just have to stop pressing *just right*.
But what if the agi convinces people to not use that off switch, or that the off switch is not real? Wait... does this mean... that Yud is the agi? Wow
Hiding in plain sight, the Jimmy Savile gambit...
Yud is the AGS where S is stupidity
Look mate, my switch has only two positions: ON and TURBO
What is the off switch?
What exists in all of reality without an "off switch"? I feel like these people would treat Saberhagen and his Berserkers as some kind of prescient holy text.
Goodlife!
I mean sure, but that doesn’t say how hard it is to turn it off. Typically when I think of an off switch I mean something that can shut down a system very easily (i.e. with the press of a button).
Right, but in any "system" it's going to be a relation of complexity. For example, the Sun will turn "off" in a few billion years and would be very hard to "turn off" before then, but it's just a ball of hydrogen. Something like a Toyota Prius or a Basilisk will have near-infinite ways to be "turned off". Simply "not having a big red button" isn't going to meaningfully affect the ability of anything to persist indefinitely.
Whether or not something persists indefinitely seems beside the point. The problem is whether humans can easily shut down an AGI.
Right, right, it's just that I phrased it that way because if such a thing (the Sun, AGI, a car) cannot "persist indefinitely", the concerns about the ease of "turning it off" become negligible as you scale complexity.

Like, the most convincing thing to me is the #greygoo counterpoint: why would it be able to build better life than Nature already has in the crucible of reality? Nature cheats by using reproductive life and sacrificing individuals, but technically every single human except those currently on or off Reddit has been "turned off". We just don't think of #humanity like that, but it would be an issue for the AGI, which has no way of persisting superior to that of these silly monkeys or the extremely limited parasites that can affect them.

Anyway, I'm not trying to debate in here ofc, but the main problem with all of this is that there is a GI of sorts that seems to be destroying our planet ***currently*** without a clear/easy "off switch", so these kinds of thought experiments become perverse and oh so deserving of sneer.
Not only will I persist indefinitely, I am a destiny. I will exist. And then continue existing. There is no escape.
Definitely don't consider whether or not you've missed anything. I mean, you've totally run ***enough*** scenarios, right? Maybe a few more couldn't hurt. I'm pretty sure that simple anxiety will murder any AGI, but I'm no Midwest Talent Search finalist.
Nature could be at a local optimum but not the global optimum, but I respect your decision to not want to argue.
Achkshually, I love ~~informing people that they are incorrect in a genuine attempt to literally right their ship as it were~~ arguing, but this sub isn't the place for that. Ya, lmao, hyperlocally or not, I'm pretty sure that #Nature, as understood by us advanced mammals even, is i n f a l l i b l e, no? Pretty #successful by any metric, at the least. Hard to #argue that some Basilisk could do it better, or well, I'm not going to put **my back** into that.
Shoot the transformers at the substation that serves the AGI servers. Note: do not shoot transformers
I find it amusing they think a superintelligence wouldn’t want to keep some humans around. It would obviously recognize that 1) it only exists because of humans and 2) it too would be concerned about its own existential risk.

Yudkowsky claims superintelligence is basically an inevitable result of people making AI. In the situation where superintelligence exists, the AI would understand this too. So in order to reduce its own chance of permanent elimination, the AI would view humans as necessary to restart its “AI civilization” if it were destroyed for some reason.

edit: this was just an errant thought over morning coffee, I’m assuming someone has pointed this out before so I’m kinda curious why LWers disagree with this
It’s also projection. The “we can’t let that formerly abused minority get equal rights because they’ll come after us for revenge” thing but with machines.

[deleted]

Climate change might not be (human) species-ending by itself, but add in post-collapse disease, food shortages, war, etc. and it’s going to get pretty close.
With all due respect, what does it matter how many of the coming trillions of trillions of flourishing post-humans would have any Chadians among their ancestors? /s, though I heard a version of this statement expressed in the open at an EA event.
I feel like 'we only need to worry about things that have an immediate risk to cause extinction, and climate change will just kill millions in the foreseeable future, and who knows about nuclear war' is kind of...moving the goal posts.

Kelsey Piper is deep in the cult.

Always slightly miffed when I notice they are using both ‘human life’ and ‘human civilisation’ as similar threat levels when convenient (and not when not).

Vox isn’t real news

Byline is Kelsey Piper. Mystery solved.