r/SneerClub archives
In response to Time magazine's recent coverage of alleged sexual misconduct within the community, Aella instigates an EA struggle session (https://www.reddit.com/r/SneerClub/comments/118tlb4/in_response_to_time_magazines_recent_coverage_of/)
93

EA Forum post: People Will Sometimes Just Lie About You

You may have wondered about the origins of Aella’s previous, seemingly unprovoked Twitter diatribe against the lying liars who conspire against her. Wonder no more; she has elaborated at considerable length upon those initial thoughts in the EA forum, and it turns out that the source of her consternation is the recent Time magazine article about allegations of sexual misconduct in EA.

Come for the forum post, stay for the comments. In service to the community I have summarized Aella’s thesis so that interested readers can focus their attention on the more interesting material in the comments. Her reasoning is as follows:

  1. People have lied in the past in order to damage Aella’s reputation.
  2. The allegations of misconduct within the EA community are damaging to its reputation.
  3. Therefore approximately 80% of the allegations in the Time magazine article are untrue.

The comments section is a mixture of earnest concern for the EA community, solipsistic personal grievance, and paranoia. Eliezer Yudkowsky, Rationalism’s newly-ronin prophet of doom, makes a special guest appearance.

One woman pushes back (very gently) on Aella’s skepticism, noting especially that Time’s coverage reflects her experiences as a woman in the EA community. Her perspective is not convincing to the skeptics, and Yudkowsky suggests that if she believes there are problems within the community then she should do a thorough investigation and produce a report.

In the same response Yudkowsky also objects to steel-manning perspectives that are printed in mass media, on the grounds that mass media is the enemy and he feels personally aggrieved by it. The same sentiment is echoed elsewhere.

Another commenter offers a spirited defense of media coverage, but Aella and Yudkowsky are having none of it. Judging by the comment vote count, neither are other EA forum users. Paranoia and grievance carry the day.

More paranoia and grievance

A sexual abuse survivor thanks the other commenters for validating her feelings about the Time magazine article. She too was skeptical of it, and she had felt concerned that her skepticism indicated that she was still struggling with dysfunctional attachment styles.

Sir, this is a Wendy’s: one commenter has a hard time understanding the relevance of Aella’s personal grievances to the allegations of misconduct within the EA community.

And there’s more yet, but I’ll stop here.

Yud reveals his research methods

I would poke around online until I found an essay written by somebody who sounded careful and evenhanded and didn’t use language like journalists use

Serious “Do YoUr OwN rEsEaRcH, tHiS iSn’T a pyramid scheme boss babe 😘” energy

But also don't listen to other people who do their own research, stick to the ones who will accept my narrative at face value.
I like how they've doubled down on, "We will treat your discourse as behavior unless you behave in a way that pretty much guarantees you will come down on our side." Because anybody finding sexual misconduct to be a problem in the community is definitionally not evenhanded.
[deleted]
Meanwhile: time to steelman the literal fascists!
I only trust sources with a 1997 geocities aesthetic. "Website under construction" gifs are the beacons that will guide you to the truth.
"blue ribbon campaign", "web ring" and "burning sconce gif"
That's basically Reddit: in a large enough thread, whoever writes the longest comments with the most academic language gets upvoted. Regardless of the content.
I spent a month losing sleep over what Yud was pushing about being dead in 5 years from AI. Then I saw this comment and I feel a lot better.

Time didn’t even consider the base rate of harassment!!

these fucking people

responders: “well yeah >80% of the Time article is true, but. They were so rude about it!!”

this reply elsewhere details how dogshit the EA community response to its predators is

Scrutinizing the EA community for sexual misconduct will harm 10^27 future lives by delaying my inevitable coming.
On the plus side, I feel like I've seen less: "man it sucks that *some* of the people attached to the EA movement suck, because it's such a great idea at its core" from normie liberals who (rightfully) don't pay that close attention to them. Although that could be because nothing has made quite as big of waves in the news cycle as SBF; still, in my media consumption, which includes the types of mainstream center-left publications that used to eat this shit up, there's been nothing but embarrassing news coming out about them.
Holy fucking shit!
How do you keep finding these comments
i'm cursed, *obviously*
If journalists describe your community as being full of evil shitbags when it's got 5% shitbags, maybe they have a point. Unless most places are at 10% and they're just as bad.

someone did do a real investigation, and a real writeup

then they published it in time magazine

why does yudkowsky ignore that he’s responding to an investigation and ask another commenter to do one

That doesn't count because it's "the media", rather than a more reputable and trustworthy source such as a random guy's blog. One thing that remains obscure to me is what they think the media's motivation is for being evil. What does Time magazine stand to gain by fabricating accusations against effective altruism specifically? And why doesn't the mass media get any credit from them for all of the obsequious, fawning coverage that it's given EA in the past? "The media should only take someone's stated beliefs at face value when they are praising EA" - EA people, apparently.
This is the same arguably-a-cult that believes themselves to be the most important movement in human history (by a sufficiently large margin that they don't have to acknowledge any others). Of course The Establishment is trying to suppress The Truth and can't be Trusted. Listen only to the words of the Prophet, I guess.
At least with Christianity the devil has interesting motivations. I'd really like to see the Rationalists flesh out their rogues gallery better.
> "The media should only take someone's stated beliefs at face value when they are praising EA" - EA people, apparently. rationalists in general
I was laughing my ass off about this as well. That's really indicative of some bizarre cognitive dissonance, not only in the rationalist community, but in many such internet communities in general. They're perfectly comfortable with responding to a book-length academic treatment of some subject, whose conclusion completely contradicts their own take (that they simply fabricated out of thin air), with utterly bizarre responses like "but where's the evidence?". Well, literally in the 500 pages of what you're responding to? What do you think is in the book, and why do you think someone has brought it up? Why don't you open it?

That's what people in environments outside of blogs need to do when they want other reasonable people to adopt their thesis: they collect reasons to think that their thesis is true, then lay it out for others to see. It's like this is legitimately a foreign concept to them, so foreign that they don't recognize it when they're literally responding to someone doing it. Like the only way to substantiate some claim is to write lengthy blog posts about it.
he distrusts the outgroup because they keep calling him a fool, so he only trusts the ingroup

All of this is so fucking unhinged. Just take thirty seconds to look inside yourselves and like…apologize

how would that help them manipulate more vulnerable young people into putting out for them though
> Just take thirty seconds to look inside yourselves and like…apologize

If they had this ability, would they even be a part of this community in the first place?
All I can think in response to all of this: Goddamn these people.

The part where Eliezer misattributes the ‘Gell-Mann Amnesia Effect’ to Richard Feynman, the physicist, and is corrected that it was in fact coined by Michael Crichton, the hack science fiction author, seems indicative.

Just want to clarify something:

Several of the women who spoke to TIME said that the popularity of polyamory within EA fosters an environment in which men—often men who control career opportunities–feel empowered to recruit younger women into uncomfortable sexual relationships.

This is abuse, and would be recognized as abuse in the wider polyam community (at least, in the non-shithead corners of it). Not surprised to see that Rationalists are getting polyam wrong, along with everything else they do.

Another [commenter on an EA forum] said it would “pollute the epistemic environment” [to divulge concerns about sexual harassment in the movement] and argued [against] it [as it] was [therefore a] “net-negative for solving the problem.”

I love EA insider spaces, they actually say what they are thinking.

“So like, vibe check, if someone is super fucking creepy and coerces people to have sex with them up to and including supplying mind-altering substances is that acceptable behavior IF they also donate millions of dollars into pipe-dream pseudoscientific “research” into how to make the artificial intelligences that we don’t have and possibly will never work or at least not how we pretend they will not make infinite paperclips? Asking for a friend.”

The question is how young this all got.

[deleted]

[deleted]
LOL it's amazing they're criticizing them for that instead of realizing that people at Oxford are colleagues of some big *institutional* EA people and therefore have exposure to them and the institutional ability to maybe get some accountability going. Whereas some random group home in the Bay Area is exposed to nobody outside the cult and accountable to nobody, even if the median income of the inhabitants is 3X that of an Oxford professor and they are very influential in the scene.
[deleted]
Well yeah, but more to the point both MacAskill and Bostrom are faculty at Oxford, it's not just about chilling in their backyard.
[deleted]
Future of Humanity Institute at Oxford, funded by some weird zillionaire, was long one of the sponsoring institutions for LessWrong and supplied its academic heft.
lol that motte thread

> Tangential, but Rachel Haywire was supposed to write some expose of Silicon Valley rationalist big deal types, as she had been to their parties and such, but never did. Curious what she would have revealed if anything... that would be fun, or "fun" at least
Now that's a name I haven’t heard in quite a long time…
seems to have done very little publicly in the past few years
She took the Potato King's v-card. 😳

[deleted]

Choose Your Own Sneerclub: 10,000 endings, all the bad ending

That Duncan Sabien fellow was friends with Brent Dill. 👀

punch bug

I scrolled through her post, and wow, the narcissism in jumping into a community’s discussion of sexual harassment to complain that someone said you were a bore at parties. It also seems unavoidable or dare I say rational that people will say and believe crazy things about you if you make yourself notorious for both throwing sex/nudity/drug parties and arguing against all taboos.

"The actual damning point he attempts to make is that I’m incapable of being interested in any topics besides myself, and I read the implication here as being that I’m thus incapable of caring about anyone else besides myself, and that this lack of care is the dangerous thing." Lol yes the average person understands why it's bad to care about no one but yourself

Isn’t Eliezer supposed to be working on alignment right now? Like… given his stated priors and the state of the world, is there any reason for him to spend time on anything else other than alignment, eating and sleeping?

alignment isn't even a problem that can be solved, it's a vague philosophical idea like qualia to be endlessly played with in a trillion different ways. the only way to learn to control an ai is to build one and see what does and does not work, but that is dirty empiricism and intrudes on the purity of pure reason so yud keeps himself deliberately ignorant of the nitty gritty of ai.
I think that his total lack of formal education means that he has malformed beliefs about what useful intellectual work looks like. That sounds elitist, but think about it: how can you tell the difference between a lunatic scribbling nonsense on a whiteboard, and a scientific expert who is making important discoveries? If you've never been to school then you probably can't tell the difference, and so even if you earnestly attempt to produce important scientific results you might accidentally end up just scribbling nonsense on whiteboards instead.

I guess I take Yudkowsky at his word: I think he feels that he's done everything he can do to avert the apocalypse, and now he's burned out. The rest of us think he's just been slacking off and doing pseudointellectual masturbation for 20 years, but from his perspective it might really have felt like serious labor.
Maybe my first thought that started pulling me away from the "TESCREAL sphere" was that I couldn't reconcile why EAs - if they really believed what they say they believe about alignment - wouldn't consider more options. Like why wouldn't an EA throw soup at a glass-encased painting to spread awareness about unaligned AI, if unaligned AI is really really really really really much much much much worse than climate change and much much much much much more imminent? [Jaron says that the kind of machines that run these LLMs at this scale are really only possible for Google, Microsoft and some Chinese companies](https://www.youtube.com/watch?v=uZIO6GHpDd8#t=5m43s). Would you not, at least, try to talk them out of it? Even if you think it might only save a few months of human life. If you can convince Sundar, Satya and Xi, you've at least bought some time, right?

I think EAs have some cognitive dissonance, where on one level, yeah, it seems like they believe it. But then there's a revealed preferences sense where it seems like they don't. They're as concerned about stopping unaligned AI as George Lucas was about making the Star Wars prequels - they'll do the best they can do while sitting at their monitors and drinking coffee. All in all, that's probably for the best.
I agree with the revealed preferences angle, but it seems like it is of limited utility because everyone is like that to some extent. Our stated beliefs are mostly approximate inferences that we make about ourselves, and their accuracy is belied by our actual behavior. Rationalists are interesting mostly because of the way in which they are unable to agree with that picture of the world: their stories about how they make decisions are clearly just absurd post-hoc rationalizations, but their belief system almost definitionally does not allow them to see things that way.

I think there's a strong element of selection bias that explains the discrepancy between their beliefs about themselves and the truth of the matter. The people who can and would take effective action against the robot apocalypse are not going to believe in it in the first place, because those people are smart enough, mentally healthy enough, and have sufficiently good social skills to see that it's nonsense. The people who *don't* have those qualities are the ones who become enduring Rationalists.
> TESCREAL

i just had to look this up and thought, all it needs as well is N for neoreaction and B for bitcoins

Yudkowsky and Aella have an Epstein/Ghislaine relationship. Prove me wrong

I really think there is some kind of blackmail shit going on.

Just realized I'm one of the people being bitched about. 👋