posted on April 10, 2023 04:04 AM by u/saucerwizard
33
u/InferiorGood · 38 points · April 10, 2023
When I read the bit about how “if these allegations were made public it would hurt the cause area” I can’t help but think the cause area is AI risk or maybe de-aging. Nobody who works in global health is worried about the cause itself being “discredited.”

It’s a pedestrian observation maybe, but the doomsday/savior causes within EA/rat are the ones that most directly drive the cult environment.
I think when they refer to hurting causes, they might be specifically referring to EA work on those causes.
Like, nobody disputes the validity of global health as a cause area, but EAs believe that their specific approach to solving those problems is objectively the most correct approach. In their minds, "being part of the EA cult" and "effectively solving serious problems such as world health" are indistinguishable ideas.
They're very explicit about this: it's how they rationalize spending charitable donor money on things like buying castles or fun vacations for themselves. Revealing a culture of routine sexual assault would hurt their ability to buy castles, which would (in their minds) have a serious negative impact on humanity's ability to e.g. solve world health problems.
I think the robot apocalypse stuff is a consequence of the cult stuff; the cult environment is what allows and encourages thinking that is detached from reality.
That's a great point: ultimately the meta-cause of paying for our own cushy lifestyle requires defending regardless of professed cause
See also paying for propaganda, e.g. Kurzgesagt, 80,000 Hours ads on Veritasium, etc.
Consider my priors updated
It's a classic case of the Weinberg quote: "With or without religion, you would have good people doing good things and evil people doing evil things. But for good people to do evil things, that takes religion." If you sincerely believe that your group is instrumental in reaching some higher goal, like saving all life on earth from destruction, this behavior starts to be perfectly justified. It's also why I think consequentialism is bad. Anyone can delude themselves that they are working towards an important cause and excuse immoral behavior based on that.
> It's also why I think consequentialism is bad. Anyone can delude themselves that they are working towards an important cause and excuse immoral behavior based on that.
Speaking as a consequentialist, isn't a better approach to say "the future is chaotic and your error bars grow with time, scale, and level of assumption, and you should apply heavy discount rates to things that involve lots of all three"?
The problem imo is that consequentialism is almost impossible to apply irl to any question whose answer is actually interesting. It's either applied to hypotheticals involving magical perfect knowledge or applied to situations that already have a pretty obvious answer.
Or, as the above commenter pointed out, used to justify immoral behavior.
Isn't the value that it allows you to reduce, in some sense, a lot of moral questions to empirical questions one might then study?
> It's either applied to hypotheticals involving magical perfect knowledge or applied to situations that already have a pretty obvious answer.
Isn't this a problem for...pretty much all ethics?
Kantianism has different problems, but takes as a starting point precisely that we can never have perfect knowledge of the consequences of our actions.
I don't think assuming perfect knowledge is a prerequisite to consequentialism, though. Just because the anglo philosophical tradition tends to root itself in the tabula rasa doesn't mean we *have* to do that.
“One survivor reported that her assailant told her he expected submissiveness because of her race.”

With the amount of red pill jargon all over the rationalist space, this is not at all surprising. Red pill is filled with men who promote ‘asian women are submissive sex slaves’ crap. Things like: Western women are degenerate, find yourself a proper submissive asian wife, you can do what you want to her.
Not saying you need red pill to be a misogynist, but you can take a person and feed them a bunch of red pill jargon present in rationalist spaces that justifies misogynist views: hypergamy, IQ differences, math pets, etc. Then they get recommended red pill as dating advice, the jargon is familiar, and they are more likely to accept it and move from one cult to another hate cult. It normalizes it, and serves as a gateway to further radicalization.
There are a lot of autistic men in the rationalist spaces. And they seem attracted to the systematized ‘dating advice’ that red pill promotes.
Not to mention the incels that have been defended in rationalist spaces as just lonely men.
It is a numbers game; all I'm stating is that it isn't surprising to find this in the rationalist community, given the number of younger men in these spaces that can be sucked into this radicalization pipeline.
If you want to read up on the problems that autistic people face dating, please do. Many of them seek dating advice, and if you are taking your dating advice from the rationalists, you can get fed red pill. I'm not going to write a giant thesis on this, but here are some example links:
[https://www.reddit.com/r/exredpill/comments/mjkdva/the_red_pill_targets_autistic_people/](https://www.reddit.com/r/exredpill/comments/mjkdva/the_red_pill_targets_autistic_people/)
[https://www.reddit.com/r/aspergers/comments/n1xkms/why_is_it_that_many_young_men_on_the_spectrum_get/](https://www.reddit.com/r/aspergers/comments/n1xkms/why_is_it_that_many_young_men_on_the_spectrum_get/)
[https://www.youtube.com/watch?v=qt2SqmgBMEI](https://www.youtube.com/watch?v=qt2SqmgBMEI)
Also, ad hominem attacks aren't appreciated or helpful.
Recently, there’s been a spotlight on EA through the media. Not every story/accusation I’ve received or detailed above is related to EA, but some are. More specifically, I have stories of leaders in the Bay Area and London/Oxford accused of fairly egregious sexual assault and misconduct.

In February 2023, I calculated that I personally knew of/dealt with thirty different incidents in which there was a non-trivial chance the Centre for Effective Altruism or another organization(s) within the EA ecosystem could potentially be legally liable in a civil suit for sexual assault, or defamation/libel/slander, for their role/action (note: I haven’t added the stories I’ve received post-February to this tally, and I’ve gotten several stories since that time). Of course, without discovery, investigation, and consultation with legal counsel, this is a guess/speculative, and I can’t say with certainty whether they’d be liable or not without legal advice.
Sorry having some problems with editing atm, here's the last paragraph:
>I am exercising caution around survivor confidentiality and personal liability in this piece. Yet, I strongly believe the secrecy and hiding of rape in these communities must end. I’ve been public with my story of rape in the past. I’m sharing it again, not to call attention to the person I accused, but to illustrate the patterns I’ve described in this piece. I’m Asian-American, and my well-being was not considered when I was assaulted. When I spoke out about my rape in 2016 and 2017 in these communities, I was called crazy, troubled, vindictive, manipulative, destructive, and more. The person I accused had been accused twice before and once after. The community members that supported him tried becoming friends with my friends, and sent messages to those friends of mine to tell them I was crazy and advising them to stop being a friend to me. After weeks of saying no, an untrained non-professional convinced several people to pressure me into an unprofessional mediation. **I agreed to one evening conversation, was not allowed to share my story or perspective, and the mediator texted me to encourage me to commit suicide the next morning. This isn’t the only story from within these communities of an untrained mediator encouraging a survivor to commit suicide.** This isn’t the only time an Asian-American woman was silenced by a majority white tech-ish community. And it’ll take a long time and a lot of work before it’s the last.
The dynamics being described are so freaking textbook subcultural abuse that it’s not even funny.

Like, in any subculture around any subject you’ll have people with social capital leveraging that power for SA, and the surrounding figures will move to protect them.

It need hardly be remarked that if these big brain ultra utilitarian ubermensch were as smart as all that, they’d be able to observe the familiar dynamics of abuse in their own space; the answer to why they didn’t, of course, is that they don’t give a shit.
Every cult ends up being a sex cult.
Well that is all pretty horrible.