r/SneerClub archives

This might be a little bit unorthodox for discussions on this subreddit, but I wondered if anyone had any practical advice for encouraging someone to leave the cult and Rationalist group houses.

I presume you’re talking about somebody who’s really into it and it seems to have a detrimental effect on them rather than a well-grounded person who just thinks, “Okay, these nerds seem interesting to hang out with.” I don’t have specific informed guidance for you, but here are some minimal thoughts:

  1. Arguing somebody out of something rarely if ever works. You can’t change other people.
  2. A lot of cults work and maintain themselves by social control. Simply being there for them as a friend who is outside the cult can be helpful.
  3. If they do ask, you might want to focus on maintaining a life and social structure outside of Rationalism/EA/whatever rather than the more correct take that people like EY are crackpots. Not that I’m suggesting you lie about your opinions of EY or something, but focus on things that are objectively harmful and concerning rather than things healthy people can disagree about. I know and actually respect some people who are extremely involved in this nonsense, but they’re involved in a way that I’m not concerned for their health and safety, you know?
> Arguing somebody out of something rarely if ever works. You can't change other people.

I am by no means an above-average arguer, but I don't think this is true.

Firstly, I really don't like the "you can't change other people" aphorism. It's only true in the purely literal sense, in that people can never be made/forced to change if they truly don't want to themselves -- the decision will always ultimately be up to them. That doesn't mean at all that people don't change in general or often, or that when they do, it isn't often the result of the influence of others. It's absolutely possible, and in fact common, to exert influence on someone that then prompts them to want to change themselves.

Aside from that, while there are plenty of situations where it's impossible or useless to argue with people, there are plenty of situations where it's important and worthwhile. Some people's minds can be changed with argument, in some contexts. Here are some things that I think are true about the efficacy of argument:

1. Some people are temperamentally indisposed to ever (or almost ever) changing their minds. It's pointless to argue with these people.
2. Some people have beliefs which clearly aren't even remotely connected to "reason" -- e.g. conspiracy theories, or other unfalsifiable beliefs. It's mostly pointless to argue against these beliefs, because reason won't dislodge a belief that at least some semblance of reason was not used to acquire.
3. Most people who *are* having their minds changed by argument don't admit in the moment that their mind is being changed, if they're even aware of it at all. When arguments succeed, they rarely succeed in a quick or visible way; they usually succeed in a protracted, slow way that isn't clear to either party.

Overall, I think that rationalists are actually one of the cults that is easiest (as far as cults go) to target with effective argument. That doesn't mean most of them can be convinced, but at least aesthetically they pretend to care about reason. If they're in rationalist group houses, I mean, that's pretty rough/far gone. But I definitely wouldn't give up on actively trying to convince them (depending on the circumstances, of course). If someone makes it their whole deal to always try to be as rational as possible, then I would tend to think that one of the things most likely to save them is an argument that actually makes them reconsider some of their cult tenets. Again, at least in theory, one of the tenets of the cult is to be able to "update" your beliefs when provided with new evidence, and to try to be open to doing so.

The other points you make, 2 and 3, are of course very important, probably more so than any argument. People are not rational in general, so I won't claim that reasoning with them will ever be more effective than being there for them as a friend or creating alternative social structures outside the cult ecosystem. I just wouldn't discount the value of trying to reason with a fundamentally reasonable person, especially given a long period of time to do so and the vantage point created by a friendship as your form of access to them.
The fact that one of the tenets of the cult is that you should “update your beliefs” doesn’t actually make people who are deeply ensconced easier to convince. Rather it serves as a sort of inoculation against “incorrect” arguments made by outsiders. “We are Rational. Outsiders are deluded.” Now, I’m saying this specifically about people who are deep enough in it that one would actually need/hope to intervene. Sure, people on the fringe who don’t “believe” may be easier to convince. But that’s not who we are talking about.
I know, I want to be careful about what I'm claiming here. I don't want to claim that rationalists are easy to convince of their wrongness. There's definitely a huge element of overconfidence basically inherent in the cult, which can make argument more difficult than it otherwise might be. But I also don't think it's as pointless to argue with a rationalist as it is with, e.g., a QAnon guy or a flat earther.

The inoculation effect is definitely real. But every cult has a mechanism designed to inoculate its followers against de-radicalization; otherwise they wouldn't be cults. For some cults, this takes the form of the fundamental tenet that every seemingly convincing counterargument against the cult only *seems* convincing, due to the existence of a conspiracy specifically designed to make those arguments seem convincing. Any argument that this conspiracy doesn't or can't exist is said to be just another arm of the conspiracy's deception. Other cults work by having the main set of beliefs be unfalsifiable in the first place, e.g. that a supernatural being exists that can't be perceived in any way except through phenomena with strong alternative explanations. These beliefs are *literally* impossible to argue someone out of, assuming they've bought into the initial premise.

Rationalism is different from this. True, a fundamental tenet is that they are more rational than you, i.e. smarter, more insightful, etc. This makes arguing with them *harder* and *more frustrating* than arguing with the average person, because they're needlessly prejudiced against counterarguments presented by "irrational outsiders." There's an annoyingly high bar to getting through to them. But it doesn't make them literally immune to reason in the same way that a flat earther will claim that every argument and piece of evidence that the earth is round is only made to seem convincing by the conspiracy covering up the truth of the earth's flatness.
> Firstly, I really don't like the "you can't change other people" aphorism. It's only true in the purely literal sense, in that people can never be made/forced to change if they truly don't want to themselves -- the decision will always ultimately be up to them. That doesn't mean at all that people don't change in general or often, or that when they do, it isn't often the result of the influence of others. It's absolutely possible, and in fact common, to exert influence on someone that then prompts them to want to change themselves.

I certainly don't mean to contradict this. What I meant is more that the following just doesn't work: "Bob is X. I want Bob to not be X. I will do Y to make Bob not X." And further: if "Y" is "have a rational argument about why X is bad," that's probably not a recipe for success. You can be there providing an alternative to X, and they may or may not be interested in it. And at some points you might argue.

people rarely change their mind because of careful, reasoned argumentation; they change their mind because they come to feel differently

it will likely take time and you must become exceptionally patient and kind

but I’m just some fuck with an internet connection. acknowledge and disregard this and ask a psychologist or someone similar

Been trying to pull my Mom out of the Q cult since 2016. Time sure flies by when you’re under constant stress.
Jfc I’m so sorry dude, that must be brutal
> people rarely change their mind because of careful, reasoned argumentation;

While this may be true, people can in fact change their minds when shown reasoned argumentation. ([Via this post](https://pluralistic.net/2023/05/04/analytical-democratic-theory/#epistocratic-delusions)):

> "Myside" bias: Even when people strongly identify with a group, they are capable of filtering out "erroneous messages" that come from that group if they get good, contradictory evidence:
>
> https://www.hup.harvard.edu/catalog.php?isbn=9780674237827
>
> Majority bias: People are capable of rejecting the consensus of majorities, when the majority view is implausible, or when the majority is small, or when the majority is not perceived as benevolent. The Asch effect is "folklore": yes, people may say that they hold a majority view when they face social sanction for rejecting it, but that doesn't mean they've changed their minds:
>
> https://alexandercoppock.com/guess_coppock_2020.pdf

I read this pluralistic article and was wondering if some of the sequences would have to be rewritten. Of course, none of this will be easy. And personally I do think that a personal connection with the person you are talking to, and who you are trying to get out of a cult, is very important, but that is feelings (it certainly helped with me when I was more into r/ssc. Not that people really tried, but it was more random remarks which made me look differently at it all).
> "Myside" bias: Even when people strongly identify with a group, they are capable of filtering out "erroneous messages" that come from that group if they get good, contradictory evidence

As seen in how many self-proclaimed devout Catholics maintain that the pope is definitely wrong about XYZ.

Thanks for all of your responses. Yeah, the person involved is very bright, but their involvement has shifted from “these nerds are smart but delusional” to “these are the only people who understand me”. I’m also concerned about the constant and intense emotional toll their involvement in group therapeutic practices is taking.

I intend to just be a breath of fresh air to them, but it’s difficult when Rationality convinces you that personal narcissism is in fact objective scientific truth.

Getting your friend into reading anti-Scientology stories might help. There was a lw post on joining Scientology courses which had some good comments from dgerard on this subject under it. It won't get them out of the cult, but it might open their mind a little bit to cult practices and how irresponsible (naive??) lw can be about this stuff. https://www.lesswrong.com/posts/qwdupkFd6kmeZHYXy/build-small-skills-in-the-right-order
Some of the arguments from the rationalists are interesting, but why the fuck would anyone want to hang out with any of them in person? They are grim to look at and be around.

Steven Hassan is a leading expert on cults and the developer of the BITE model of high-control organizations: https://freedomofmind.com/

[deleted]
that was ah, an unforced error, yes

The issue with rationalist ideology is that it places intellectual understanding above any other type of understanding, and specifically places your own intellectual understanding on equal footing with the experiential understanding of the rest of the world.

It’s like reading every book about driving theory and deluding yourself into thinking that gives you the same competence as those with experience of driving. 90% of actual driving consists of understanding that people are crazy, and all the ways in which people are crazy, but you are not going to learn that from any theory book; you need to experience it.

Once you understand that most things in the world require experiencing them to fully understand them, the bubble pops. Unfortunately that would require rationalists to spend one second of their life with their heads outside their own asses, so that’s a tough sell.

If what they say about the group sex and polyamory is true, your job may be more difficult than it seems. Not joking btw, it’s hard to leave a community that basically “solves” all of your “needs” for one that probably won’t stack up

There must be nonrationalist polyamorists though

So the trick with deradicalisation is that the person you’re trying to get out has replaced their entire social and emotional support network with cult members, and you have to find a way to replace that network with healthier options.

There’s no magic bullet. It’s hard and takes a long time and a ton of emotional labour.

Well I'm in it for the long haul

David Neiwert’s *Red Pill, Blue Pill* isn’t bad. It’s about right-wing extremism instead of cults, because that’s his field, but it’s applicable. Be warned that a lot of it is about the Vegas shooter as opposed to techniques, but he gets there eventually.

Trusting the opinions of strangers on the internet over people who are professionally trained and licensed on matters like psychology is probably a place to start.

Or ask ChatGPT.

I was never actually a Rationalist, but when ChatGPT3 became publicly available, I decided AGI is almost certainly sci-fi nonsense.

I doubt that real world evidence would easily sway an actual Rationalist motivated to believe in the dangers of robo gods.

There’s some deep emotional need that being in the group houses and the cult is filling, some sort of sense of belonging they can’t get on the outside. Maybe it’s a sense of purpose, maybe they feel like they’ve found the most ethical thing to do with their lives, maybe they’re too socially maladjusted for anything else; you’d know best, being their friend. For them to break out, either the cult will need to stop fulfilling that need for them, or they’ll need to find another way to fulfill it. I think the only thing you can do is keep being their friend, and also keep trying to get them to do things with people who aren’t in the cult, while perhaps gently questioning the tenets of the cult that most appeal to them.