r/SneerClub archives
Your favorite Basilisk got interviewed for this article: "Silicon Valley’s Obsession With Killer Rogue AI Helps Bury Bad Behavior" by Ellen Huet (https://www.bloomberg.com/news/features/2023-03-07/effective-altruism-s-problems-go-beyond-sam-bankman-fried)

> Sonia Joseph, the woman who moved to the Bay Area to pursue a career in AI, was encouraged when she was 22 to have dinner with a 40ish startup founder in the rationalist sphere, because he had a close connection to Peter Thiel. At dinner the man bragged that Yudkowsky had modeled a core HPMOR professor on him. Joseph says he also argued that it was normal for a 12-year-old girl to have sexual relationships with adult men and that such relationships were a noble way of transferring knowledge to a younger generation. Then, she says, he followed her home and insisted on staying over. She says he slept on the floor of her living room and that she felt unsafe until he left in the morning.

holy FUCK

> One rationalist man introduced her to another as “perfect ratbait”

vomit

For those without a Bloomberg subscription: https://archive.ph/sLihW

EDIT: finally read it. Very solid article overall. I think it is still too credulous when it comes to the scientific validity of “AI safety”, though. Things like this deserve more elaboration:

> Larissa Hesketh-Rowe […] says she was never clear how someone could tell their work was making AI safer.

Like many religions, the core tenets of Rationalism include beliefs about the supernatural. It’s hard to tell if “AI safety” work is productive because it consists of diagnosing and solving problems in machines that don’t actually exist and which, depending on your definition of “superintelligent”, cannot exist.

That might seem like a lesser or separate problem from things like sex abuse, but I think these could be related issues. If you’re a professional computer scientist who believes impossible things about how computers work then maybe you’re going to have other beliefs that are untethered from reality too. Respecting other people’s boundaries necessarily requires identifying and connecting with a reality that is separate from your own imagination.

It’s the same problem with EA, too - they treat everything as abstractions and in doing so they become disconnected from reality.

Thank you much! I meant to post that link too, but forgot.

this article goes in fucking hard, and it doesn’t even get to the race and IQ stuff

Ellen did a great job. I wish she'd had more space; a lot of stuff had to be cut, but it's an article I'm happy to be a part of.
if you're the person with the rad bookcase o' games, congrats on your good taste
Naw, there are no pictures of me in the article, but I do have several bookcases of excellent games.

I feel like the ending still takes Yudkowsky and Bostrom’s specific AI concerns too seriously, but this is probably the best one-stop summary of these weirdos and the specific ways they broke their brains.

I do like the idea that a lot of existing institutions are already paperclip optimizers (like, I dunno, capitalism).
That bit is brilliant.
It's a key point that really unravels their whole raison d'etre.

> At dinner the man bragged that Yudkowsky had modeled a core HPMOR professor on him.

…is this Michael Vassar, bragging about Quirrellmort being based on him? I wonder if he didn’t realize Quirrell was (still) Voldemort and super evil, or if he knew but didn’t see why that’s a bad thing to be proud of.

Well, since you said it first: that’s my best guess for the unnamed guy too.
TIL that Michael Vassar does not have a Wikipedia page, but does, inexplicably, have a Wikiquote page: https://en.wikiquote.org/wiki/Michael_Vassar
It’s like a goldmine of sneerworthy comments! I think it deserves a post all on its own…

I really like the ending. Conceptualizing modern AI safety researchers as a human example of a poorly defined paperclip maximizer is perfect. It’s both a super clear metaphor, and one they would fully understand.

[deleted]

[deleted]
If a tree falls in the forest but you didn't hear it because you only listen to sounds that come from other Rationalists, what is the tree's epistemic status? I like how their response to an article that accuses them of myopia and insularity is to deliberately retreat into myopia and insularity.
The tree's epistemic status: complete ash
> While I am generally interested in justice around these parts, I generally buy the maxim that if the news is important, I will hear the key info in it directly from friends (this was true both for covid and for Russia-nukes stuff), and that otherwise the news media spend enough effort to do narrative-control that I'd much rather not even read the media's account of things.

anyone with the barest amount of either self-awareness or media literacy would see the gigantic problems with this approach, but...
[deleted]
It's a lot easier to feel like you know something nobody else does, when you have no idea what anyone else knows or doesn't know.
> FWIW, I'm a female AI alignment researcher and I never experienced anything even remotely adjacent to sexual misconduct in this community. (To be fair, it might be because I'm not young and attractive; more likely the Bloomberg article is just extremely biased.)

lmao
> the Bloomberg article is just extremely biased

what does that even mean? that all women interviewed are straight up lying? Like, what's more likely: a male-dominated community full of socially awkward nerds being full of sexual misconduct, or women just straight up making shit up? cult-like levels of cognitive dissonance going on
I don't super want to read through them myself, but I'll read whatever selections get posted here.

“[Bankman-Fried] who invested close to $500 million in related causes before dismissing effective altruism as a dodge once his business fell apart.”

Good gravy. I knew he’d invested a lot, but that is really silly money for a group that has produced close to nothing. No wonder they’re buying castles.

> He started writing about AI in earnest in the 2000s, well after HAL 9000, Skynet and the Matrix had entered the public consciousness

This is a pretty good sneer and excellent summary of Yudkowsky’s contributions.

I think you mean “co-favorite Basilisk”.

From a rationalist: https://fredwynne.medium.com/an-open-letter-to-vitalik-buterin-ce4681a7dbe

Yeah, as someone who dated Michael Vassar among others, this tracks with what I know of them.
Oof, boy things have really gone off the rails over there.
This one too: https://sinceriously.fyi/brent-dill-confessions-full-redacted/
Damn. Dill and Ziz are both far from reliable sources so I’m skeptical…but this is nuts if true. I knew Eric a little, and it would be incredibly tragic if somebody triggered his latent schizophrenia on purpose.
It would seem I know who Ziz is. :/

hats off to those responsible for the decision to release this on the eve of international women’s day (a fact no doubt lost on the ratsphere)

…and, how deliciously ironic, they keep talking about Bayes and base rates of abuse, as though “holup, it’s not like our abuse is significantly worse than the rest of the world generally” is exculpatory even if true. keep digging, bros

I really hope this stuff gets followed up on.

HPMOR scared me off within a few chapters by the author’s need to keep sprinkling rape references into his fanfic based on a relatively tame middle-grade novel series. My friend who was promoting HPMOR defended it by saying that the story makes it clear that rape is BAD.

I also picked up on a subtext that good men are rational and good women think like rational men. Not that “rational people have common attributes regardless of their gender” or that “rationality is good in any gender,” but that “rationality is masculine and good.”

The Gnostic Gospel of Thomas, rejected as an authoritative text in the codified New Testament, includes this verse:

> Simon Peter said to him, “Let Mary leave us, for women are not worthy of life.” Jesus said, “I myself shall lead her in order to make her male, so that she too may become a living spirit resembling you males. For every woman who will make herself male will enter the kingdom of heaven.”

I swear, Rationalists have similar assumptions about gendered cognition.

You're missing a critical point here: good women either think like rationalist men or submit to them. Otherwise you're right.
How could I forget the dovetail with tradwife culture
Roko's Harem Tradwife strikes again
Oof

Admittedly I have a hard time taking AI doomsday concerns seriously, but I really can’t imagine someone or multiple someones getting wound up enough about it to have a psychotic break – that’s got to be mostly the drugs’ doing, right??

If they’re only talking/thinking about doomsday all day, living together, working together, maybe not taking breaks or eating/sleeping enough, I could easily see this happening to vulnerable people even without drugs. I have a stressful tech career and I can absolutely put myself in a pretty dark place emotionally if I don’t take care of myself. When work is your hobby/passion you can get really in your head about it, and it’s even harder if your friends are your coworkers and equally passionate. That being said, they’re definitely all on drugs.
Tldr https://mobile.twitter.com/visakanv/status/1633335577012576256
In my experience on places like /r/askphilosophy, it's not super rare to find people, sometimes quite up front about being diagnosed with an anxiety disorder, who will grab onto some skeptical and/or doomsdayish speculative theory which becomes a fixation of their anxiety. Some person is fixed on the possibility of being a Boltzmann brain, another that they die every time they sleep a la teletransport paradox, etc. Sometimes fixation on such an idea is not the cause of a mental crisis but an expression of one.
The sleep guy has got to be a troll, he used to post every morning.