r/SneerClub archives
Former AGI acolyte describes CFAR society pretty similar to how I experienced it (https://twitter.com/QiaochuYuan/status/1353849853118304256)
42

I know very little about the guy; I remember him as one of the top contributors to Math.StackExchange.com and several years into a maths PhD, until he burned it all away to join LW. He always gave me very creepy vibes, constantly posting RedPill-adjacent shit on Facebook in the Brent Dill style, and AFAIK he's still friends with the worst people of the Bay, WebdevMason, Aella, Quillette people; he's still deeply enmeshed in the Californian Ideology. The only way I can square this is that it's similar to a Trotskyist split: there are factions inside the community who hate each other's guts, but from the outside they all still look exactly the same.

oh shit I think I know this dude

EDIT: some of the CFAR stuff is very culty in hindsight.

At the workshop, a crowd fave was 'againstness', where Michael would lead you thru a highly embarrassing memory, at which point he'd guide you thru a breathing exercise to calm yourself down from being 'triggered'. You stand in front of people and then you have this kind of shared experience of being embarrassed in front of each other. Shared embarrassment = cult retention tactics.

In another activity, everyone thinks about a thing that's difficult for them, then does a field trip to the mall to 'expand the comfort zone'. Then people are encouraged to discuss their… uh, flaws and embarrassments.

A lot of the other activities involve ‘personal improvement’, which is essentially discussing flaws and then collecting feedback and ideas.

Other things were talks by EA 'friends', emphasis on x-risks, and discussion of group homes and ways of really doubling down on the 'rats' lifestyle.

It's a whole cult. And at that time, now like 8 years ago, a lot of it was anchored around the, uh, attractive head of CFAR at the time, I felt. Although of course the nerds couldn't be like 'oh, you're attractive'; instead it was this silly coy game.

I did not have the time for any of that nonsense. So instead here I am, totally successful, and I have not contributed to MIRI.

I'm very sensitive to personality disorders. Met Michael (assume you're talking about Michael V) briefly and found him to be a raging case of NPD. Later found out he was kind of infamous as a prominently endorsed person they later regretted - for precisely that reason. Considering his total lack of genuine care, I find the idea of going through an embarrassing memory alongside him to be projectile vomitous. I also think wanting to never die is something certain neurotypes can completely trip out on, and it's part of how they get you. Underneath the topic of race realism, they strike me more as life extension types who have expensive cryo reservations.
Yup that is the person I'm talking about. He puts off... a good sensation of really caring, although... these interactions were limited and ultimately skin deep. I think a lot of it was implied social credit from an organization you were already into; after all, you don't attend a CFAR workshop if you don't trust them to some extent. At least trust them as smart people - which they are! But being smart isn't an excuse for everything.
I think CFAR attracts a lot of people whose social radar is particularly unable to detect fraud and personality disorders.
> > Met Michael (assume you're talking about Michael V) briefly

> Yup that is the person I'm talking about.

V as in Vassar or Valentine? I suspect the above comment is talking about Vassar and you're talking about Val, but could be mistaken.
I meant Vassar.
> At the workshop, a crowd fave was 'againstness', where Michael would lead you thru a highly embarrassing memory, at which point he'd guide you thru a breathing exercise to calm yourself down from being 'triggered'. You stand in front of people and then you have this kind of shared experience of being embarrassed in front of each other. Shared embarrassment = cult retention tactics.

Ugh. Reading this is giving me some very strong emotional reactions. I'd like to think that if someone tried to pull this on me now (in a non-clinical, non-religious setting) I'd tell them to fuck off and storm out. But there have definitely been times in my life I would have gone along with it. Nerds don't need to be taught to be rational nearly as much as they need to be taught that sometimes causing a scene is actually very good.
What nerds need is therapy. All the emotional benefits can be gained via therapy. But then I guess you won’t find a cool group house to live in where you can practice phased sleep or whatever.
I think the problem is that this cult act mimics what people think therapy is, but in a group setting. Especially with the added social group pressure, I can see how people who need professional therapy get goaded into CFAR group cult sessions that smell a bit like therapy. It works for Scientology. E: for all the sneering we do at Scott Alexander, at least he knows that therapy shouldn't be done by people you look up to and know, and he puts this into practice. But otoh, as far as I know, he never spoke out against the CFAR practice. (No surprise, as he seems very averse to in-group conflict.)
This is precisely what Scientology, NXIVM, and so on do. As known from other cults, it goes beyond shared embarrassment; it's used for digging up dirt on people and blackmailing them. NXIVM is particularly relevant because they also used to sell "rationality classes", and at the same cult age as CFAR is now, there were the same red flags while the evidence of criminality wasn't public yet. The bottom line is that as cults like this age they get increasingly careless, but it takes a long time until the top people actually get themselves in legal trouble, and they are still in the red-flags-only phase.
I knew that Scientology and so on did similar things, but I didn't drop it in there, and I'm glad you brought it up. While I don't think CFAR intended to be a cult... they also aren't not a cult, and original intentions don't really matter against outcomes, now do they?
I think they researched other cults and copied cult techniques intentionally. As for the whole fuzzy question of whether they "intended it to be a cult": cult is a bad and derogatory word, and cult starters are not exactly the kind of people who are likely to think negatively of themselves. I'm sure they can read about cults, and implement what they like, with very little if any reflection on what they're doing.

We were in our PhD program around the same time! I’m glad to see this.

BTW: things were really crazy in the math department then. CFAR&co were running a highly aggressive recruiting drive and he wasn’t the only one they hooked. There but for the grace of god go I, etc.

He ended up being Creepy and really "Therapy-Adj" Online

He makes such a great point about cultlike obsession with ideology (a point that definitely applies to rationalism but also to other ideologies) in this other thread of his he links. I really feel this. https://twitter.com/QiaochuYuan/status/1271512725277958144

[deleted]

AGI = Artificial General Intelligence
CFAR = Center for Applied Rationality
MIRI = Machine Intelligence Research Institute

If you wanted to keep him anonymous you’d have needed to remove his Twitter handle as well as his Twitter name.

Jeez, the term AGI-pilled is so… cringe.