r/SneerClub archives
I need some help to clean my mind from the grip of rationalism [NSFW] (https://www.reddit.com/r/SneerClub/comments/sf3i8s/i_need_some_help_to_clean_my_mind_from_the_grip/)

Hello. I’ve been checking the sub for quite a while now, and decided that I could use some help from you. Made this account just to post this.

I’m a former rationalist dealing with the leftovers of this ideology, which still occupy my mind in a very damaging way. I became attracted to this whole thing some years ago, and for a long time it captivated me. That was until I started to notice the abusive, cultish (and, of course, bigoted) tendencies of the whole sphere, tendencies that have been described in detail by other ex-rationalists in this same sub.

I have some mental health issues that (among other things) cause me to experience an extremely tiresome amount of rumination and intrusive thoughts. And as you can guess, rationalist ideas and rumination are not a good combination.

Even though I left rationalism for good, the ideas I learned during my time in those groups still cause me anxiety and general mental discomfort. The possibility of being a simulation, the infohazards, the implications of the MWI and a multiverse, etc., all still manage to show up in my head and prevent me from having a healthy relationship with my psychological issues.

But perhaps the most damaging thing is that the feelings of inferiority fostered by the ideology are hard to get over. I think it’s the most damaging part because it’s the reason the other ideas are taken seriously at all. As other people have pointed out, the Sequences build a case for disregarding scientists and experts in different fields and instead putting your trust in the power of Bayes as an entire epistemology. After all, it’s hard to argue with math, right?

I think an aspect that has been a bit ignored is not the things the rationalist leaders say, but the things they don’t say. Yudkowsky claims to know several “secrets of the universe”, and if you mix this with the devotion to Bayes he builds in you, you get a strange aura around him and other rationalists, a feeling that they know things you don’t. And they used math to get that knowledge.

I know this post is a bit long and lacks specific questions, but I feel like I needed to vent these feelings a bit. But like the title says, what advice would you give in order to “break the spell” of rationalist ideas and leaders? I’m already seeing a psychologist for my general mental health issues, but I think it helps to discuss this with people familiar with the whole sphere.

EDIT: Thanks to everyone who took the time to reply and offer some help; it means a lot to me. I will check out the stuff you recommended. It could be a meaningful contribution to my overall recovery from these experiences.

> As other people have pointed out, the Sequences build a case for disregarding scientists and experts in different fields and instead putting your trust in the power of Bayes as an entire epistemology. After all, it’s hard to argue with math, right?

Funny, because when I calculate P(H|E) = P(E|H) * P(H) / P(E), with:

  • H = person is right
  • E = person has been credentialed via a formal, rigorous process of study, examination, and collaboration with other credentialed people

or with

  • H = I am suffering from the Dunning–Kruger effect
  • E = I am a layman with very strong opinions about a subject

I end up coming out of such calculations with more trust in experts and less trust in myself. The amount of evidence I need to accumulate in order to justify belief in the hypothesis that “all the experts are wrong” is immense, and rightly so.
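To make that concrete, here’s a toy version of the calculation in Python. Every number in it is invented purely for illustration (which, as the next paragraph says, is exactly the weak point):

```python
# Toy Bayes calculation: P(H|E) = P(E|H) * P(H) / P(E).
# All numbers below are made up for illustration only.

def posterior(p_e_given_h, p_h, p_e_given_not_h):
    """P(H|E) via Bayes' theorem, expanding P(E) over H and not-H."""
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
    return p_e_given_h * p_h / p_e

# H = "the person is right", E = "the person is formally credentialed"
p = posterior(
    p_e_given_h=0.9,      # assumed: people who are right tend to be credentialed
    p_h=0.5,              # assumed: agnostic prior
    p_e_given_not_h=0.3,  # assumed: people who are wrong are credentialed less often
)
print(f"P(right | credentialed) = {p:.2f}")  # 0.75: the posterior favors the expert
```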

The thing rationalists miss out on is that Bayes’s theorem is only as good as the numbers you put into it. Garbage in, garbage out. And so the confidence that rationalists have in their own rationality, by doing very precise calculations using very unexamined assumptions, is the source of all comedic value that fuels this sub.

As people in the machine learning community say, Garbage In Garbage Out. The math is fine. The assumptions they blindly feed into the equations are the problem.
Yep — though always appealing to the same math/method is sus as well, even assuming quality data. Also, from ML: ensemble methods tend to outperform single methods by a wide margin on complex problems.
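(If you want to see that ensemble point for yourself, here’s a minimal scikit-learn sketch. The dataset is synthetic and the exact numbers will vary run to run, so treat it as an illustration rather than a benchmark:)

```python
# Sketch: one decision tree vs. an ensemble of trees on a synthetic task.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=30, n_informative=10,
                           random_state=0)

single = DecisionTreeClassifier(random_state=0)
ensemble = RandomForestClassifier(n_estimators=200, random_state=0)

print("single tree:", cross_val_score(single, X, y, cv=5).mean())
print("ensemble:   ", cross_val_score(ensemble, X, y, cv=5).mean())
# The ensemble typically scores noticeably higher on tasks like this.
```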
This argument reminds me of Folding Ideas’ recent video about NFTs and blockchains, and how people are so goddamn convinced that these technologies are immutable and eternal when they suffer from all the same problems current ones do, but worse.
"But the pretty tulips obviously have inherent value!" Even so, it's nowhere near their current market value in the middle of a giant bubble.

Something that helps me when I get too obsessed with a philosopher or a philosophical idea is to normalize them. The point of the below thoughts is to embrace the idea that no one has the secrets to the universe. The main rationalist ideas occupy interesting (or not…) places in ongoing and totally normal academic debates.

When rationalists present their ideas, they are imbued with a heavy amount of creepy-ass rhetoric. Yud presents the Sequences as the key to rationality and the universe. Not only is this way overselling, he makes totally undefended and questionable assumptions at practically every step (although I understand he deleted a lot of the original Sequences? I haven’t looked at them in years). SSC can’t write an argument to save his life that isn’t obscured by a million random-ass thoughts.

Normalizing rationalist ideas means grappling with them as just normal positions in a complicated academic dialectic outside of any bombastic rhetoric and writing. And to me this means four things.

First: Despite what many rationalists will say, there are lots of smart people who defend opposing ideas very well. So part of normalizing the rationalist sphere is to seriously explore serious thinkers who totally reject core rationalist ideas. Maybe you’ll come away realizing that the rationalist ideas you found convincing were just built off of empty rhetoric. For example, Carlo Rovelli rejects the MWI and defends his own account. And you can even watch Rovelli and David Wallace (a defender of the MWI) debate the different theories on YouTube, very reasonably and without any “secrets of the universe” vibe. If you’re really bothered by their utilitarianism, reading some Christine Korsgaard might help. If the AGI stuff bothers you, lots of philosophers reject the possibility of true artificial intelligence of any sort. Some think intelligence is irreducibly biological (Searle), while others think the whole concept of AGI is based on two illicit moves: anthropomorphizing computers and computerizing the mind (P. M. S. Hacker, Evan Thompson, Hubert Dreyfus, me).

Second: One of the creepy aspects of the rationalist sphere is the way a collection of different ideas is billed as a complete package, and as the only “rational” package to buy. Part of normalizing the rationalist package is to recognize that the “glue” the rationalists claim holds their ideas together can be pried apart. There are people whose work I like who defend some versions of Bayesian epistemology and have nothing to do with the rationalist sphere. Liam Kofi Bright is basically an anti-rationalist and a defender of Bayesianism (I think he has also done work on trusting experts).

Third: A direct corollary is that part of the rationalist bullshit is the creepy vibes and rhetoric. Reading philosophers who agree with the rationalists on some idea but reject the rationalist cult may take away some of the culty psychological attraction their ideas have. Bayesian epistemology is a hotly contested idea in philosophy that needs lots of defending!

Fourth: The rationalists have massive holes in their worldview when it comes to really important parts of human experience, specifically race and gender. So I would recommend reading feminist and race theorists. Currently I’m reading two feminist books, Think Like a Feminist by Carol Hay and We Are Not Born Submissive by Manon Garcia. With regard to the philosophy of race, I recommend The Racial Contract by Charles Mills.

The goals: I think ultimately my goal here is for you to really figure out which arguments in the rationalist discourse you find genuinely convincing and which you were just duped into finding convincing because of all the bullshit rhetoric. It’s possible that you’ll come away from all this still thinking that some rationalist ideas are correct. That’s fine, because the other goal is to show you that you can believe some rationalist ideas without being a kook. Although I think only kooks actually believe the simulation hypothesis.

I mean, let’s be real, my understanding is that Yud’s only real claim to originality in the academic community is the evil AI stuff. It’s an interesting idea, but in the grand scheme of interesting philosophical ideas, it strikes me as a little on the paltry side.

Ultimately, I think the biggest problem with rationalism is that it’s an insular community masquerading as an open one. So seriously reading anything that challenges their core commitments or addresses their big blind spots might help get them out of your head.

You might consider reading “Cultish” by Amanda Montell:

https://www.amazon.com/Cultish-Language-Fanaticism-Amanda-Montell/dp/0062993151

This book documents the type of abuse of language that you’ve experienced by way of Rationalism. From my layman’s perspective, Rationalism is well on the road to developing the sort of thought-terminating cliches and overloaded language that allows cults to close off members from any external influence.

> thought-terminating cliches

A concept I have seen often enough in Rationalist circles (and the concept itself can even be used as a thought-terminating cliche), which just goes to show: being aware of an idea doesn’t make you immune to it.

> As other people have pointed out, the Sequences build a case for disregarding scientists and experts in different fields and instead putting your trust in the power of Bayes as an entire epistemology. After all, it’s hard to argue with math, right?

I doubt this alone will help, but there’s a world where you can still “trust in the power of Bayes” without being a rationalist.

First, note that Bayes theorem is (roughly speaking) a mathematical way to quantify how one should learn from experience. You have some prior preconceived belief, see some new evidence, and want to adjust your preconceived belief. No issues here.

The issue is in the word quantify. For the record, I’m not saying Bayes’ theorem shouldn’t use numbers! Just that in most “casual” appeals to it, the numbers are highly suspect, and should be the thing that is interrogated, not the use of a mathematical theorem.

This is perhaps best seen as a broader issue with utilitarianism, at the core of many truly nonsensical utilitarian arguments (although there are other problems as well). If I say that enslaving Ben Shapiro would give me 30 expected utilons, and (since I would make him do incredibly funny things) it would also give each person on Twitter 0.1 expected utilons, and would only cost Ben 1,000,000 utilons, then by linearity of expectation we should bring slavery back.

Of course, I made up those numbers and came to some conclusion as a result. If I had made up different numbers, I could have come to a different conclusion via the same process. The step of making up numbers let me smuggle my opinion into the “simple mathematical calculation”, and gives that opinion a veneer of objectivity when it is nonsensical at face value.
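To spell out how little work the math itself is doing, a tiny sketch (the utilon figures are the made-up part, hence the whole problem):

```python
# The "rigorous" step is a one-line sum; the conclusion lives entirely in the inputs.
def total_expected_utility(me, per_viewer, n_viewers, victim):
    """Linearity of expectation: just add everything up."""
    return me + per_viewer * n_viewers + victim

# My made-up numbers "prove" the monstrous conclusion...
print(total_expected_utility(30, 0.1, 300_000_000, -1_000_000) > 0)  # True

# ...and different made-up numbers "prove" the opposite, via the same process.
print(total_expected_utility(30, 0.1, 300_000_000, -(10 ** 9)) > 0)  # False
```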

You get precisely the same issue with Bayes’ theorem. In casual settings it is used to justify quantifying opinions that are made up, and one can change the final number by changing the initial ballpark figures thrown into it. So it really becomes a way (rhetorically, at least) to “beef up” a qualitative opinion (smuggled in via the ballpark numbers) into a quantitative one, which people tend to view as more compelling.
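A quick sketch of that smuggling in action: hold the likelihoods (“the evidence”) fixed, and the final number simply tracks whichever ballpark prior you started with. All numbers here are invented:

```python
# Same likelihoods, different smuggled-in priors, very different "conclusions".
def posterior(p_e_given_h, p_h, p_e_given_not_h):
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
    return p_e_given_h * p_h / p_e

for prior in (0.01, 0.5, 0.9):
    print(f"prior {prior:.2f} -> posterior {posterior(0.8, prior, 0.2):.2f}")
# prior 0.01 -> posterior 0.04
# prior 0.50 -> posterior 0.80
# prior 0.90 -> posterior 0.97
```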

For the record, there are (formally) fucked up things with Bayes’ theorem as well (anyone who supports the idea of improper priors is highly suspect), but I would personally mostly view it as a convenient trick to do the “qualitative -> quantitative” conversion people love to do with it.


“The” MWI is one of those ideas that you can afford to not take seriously, even if you get into quantum physics. I put “the” in quotes because there isn’t just one version, but at least half a dozen mutually contradictory variants. Even the people who do take it seriously can’t agree with each other about how to make a well-formulated, actually scientific theory out of it. And, of course, none of them have persuaded the people who prefer any of the other interpretations. Once you peel back the technical and historical inaccuracies, Yudkowsky’s proclamations on the subject turn out to be empty smugness more than anything else.

The only MWI I care about is the one that begins with Zion and Eden and I don't particularly care about that one. Don't text and drive, people.

Lots of good answers here, but I’ll try to jam one into a little gap that others haven’t touched on as much: find better substitutes for the positive things that Rationalism did seem to offer you. You’re curious and you want to improve your own knowledge and reasoning - that’s great, so don’t give it up! Just find more fulfilling sources of input. For example, do you want to be more rational? Read Kahneman, who will show you the limits of your own intuition (but also the limits of our shared knowledge of how to overcome them). Want to understand the interpretations of quantum mechanics? I recently enjoyed Philip Ball’s Beyond Weird, which saves Everett’s interpretation for last because it’s a weird outlier compared with the others, and because first you have to understand which quantum-mechanical problems it actually resolves, instead of just enjoying the implications for philosophy (and fiction).

Basically I’m saying you fundamentally have to give up on their “get smart quick” scheme of developing an internet-argument-ready level of knowledge about abstruse issues from rambling blog posts written by laypeople in their free time, and read actual books (or at least reputable, professionally edited and fact-checked magazines). For a lot of people that would mean a significant rearrangement of their lifestyle, but nowadays there are many ways to fit it into yours: you can read e-books on a mobile device, even while listening to music, or listen to audiobooks and audio articles (Audm) while doing other things that don’t keep your brain occupied.

Also, as the best comments say, absolutely don’t try to self-medicate your real diagnosable clinical disorders with internet posts. There is no substitute for trained help, even if you have to Rationalize it as finding someone who’s spent years of actual daytime work hours updating their priors instead of someone winging it with a good slogan they overheard from a friend of a friend.

I suffer from a number of mental health issues, but by no means am I qualified to give professional advice on how to approach them. However, I will say this: instead of attempting to completely divorce yourself from the ideas you were exposed to, perhaps there is a way to make peace with them. Many people enjoy thinking about MWI or simulation theory or AGI (me, for one, and probably many people here) without buying into the rationalists’ implications about how these ideas influence our lives or will in the future. I’ve also suffered from an inferiority complex, and one thing I’ve found that helps is to remember the things I do or the experiences I have that those “superior” others may not: the things they are missing out on. Rationalists haven’t really accomplished anything as a group beyond getting rich Silicon Valley types to buy into their nonsense, but they drain a lot of time into these activities. I don’t imagine they “smell the roses” too often.

Others have said great things about the false numeracy that Rationalist-style Bayes calculations deal in. But maybe I can help to defang your fears about Many Worlds.

What is my long-term goal, in the abstract? It’s to reshape the world to be the best possible world that I could live in. I want to fight bigotry and injustice because they make the world a worse place for me to live in, because even when they don’t hurt me personally, they hurt the people I care about. And “the people I care about” casts a very wide net. Seeing people suffer makes me sad.

Now, assuming MWI, what do my counterparts in other quantum branches want? Well, quantum branches occur when there are multiple ways that an individual particle could go. On rare occasions, those quantum effects bubble up to the macroscopic world… but the vast majority of them are utterly and entirely irrelevant to me and to the things I care about. So my quantum counterparts for the most part also have the same goals as I do. It’s just that each of us can only affect our own quantum branch.

My counterparts and I are working together to repaint the future from the bleak place it could have been to the bright place it actually will be. Our shared, strongly aligned desires act as a process by which the deterministic many-worlds future of the multiverse is decided, and our contribution to that calculation ensures that our values are reflected in it.

But is this really so different from my relationship with my fellow human beings in this quantum branch? Many of them also have a vision of the future that aligns with mine. And unlike my counterparts, I can support them and they can support me. That lets us accomplish even more than my counterparts and I could have done alone. There’s no sense putting too much thought into what my counterparts are doing, since we can’t affect each other; I should be focused on what I can change and the power that I do have.

> I think it’s the most damaging part because it’s the reason the other ideas are taken seriously at all. As other people have pointed out, the Sequences build a case for disregarding scientists and experts in different fields and instead putting your trust in the power of Bayes as an entire epistemology. After all, it’s hard to argue with math, right?

Oh yikers, that’s terrifying. I just casually browse this sub sometimes because it’s funny and find the rationalists sus, but I always thought their love of Bayes was probably fine or maybe even a rare point in their favor. As a social scientist I think we often need to be using Bayesian rather than frequentist statistics for reasons relating to the assumptions of the methods and the claims we’re generally trying to make…

but math as an epistemology is definitely a hazard. Others have left a lot of good notes on the important topics here, but what I haven’t seen people talking about is that any kind of math or logic is a socially constructed tool for representing the world, and while these tools can be useful, you can’t put all your faith in them as ground truth. A classic story problem begins “you have two apples…”, but what does that actually mean? Those apples certainly aren’t identical in the real world, so can you really justify adding them as if they were multiples of the same object? What if you take a bite out of one?

Silly, I know, but that is just to point out that math is one way of simplifying the rich texture of reality so that we can make inferences about it, and can cover up details that other forms of representation or description might capture. No mode of representation or analysis like that is going to be perfect, and using a tool of representation like a certain branch of math or logic as your ground truth is bound to get you into trouble.

Additionally, any mode of analysis like that leaves room for subjectivity in how it’s actually applied in the real world. That’s fine (we can’t be perfectly rational no matter how hard we try), but failing to acknowledge subjectivity is something rationalists are really good at. In the case of Bayes, a lot of subjectivity is introduced in selecting priors. The apples example shows that subjectivity is introduced when you decide on a unit of analysis. We also introduce subjectivity in how we select the questions we use Bayes to ask. I could ask “which race commits the most crimes?” and the answer would likely support a racist worldview, or I could ask “which factors make individuals more prone to crime?” and I would likely get an answer that does not.

Idk if I worded this the best, but hope it helps a little.

Maybe look into existentialism to deal with the whole simulation problem. Camus is pretty good.

Also the entire “rational” sphere has big “too smart for school” energy.

> Yudkowsky claims to know several “secrets of the universe”, and if you mix this with the devotion to Bayes he builds in you, you get a strange aura around him

Sounds very cultish

I second this, came to comments to suggest Camus. Absurdism works for me
Life is like totally absurd and stuff
innit

I’ll be honest, I know little about the rationalist sphere, I just stumbled across this sub and joined because I liked the vibe.

I do know a bit about Shapiro and j peterman, and to my understanding part of their schtick is that whole “removal of emotion” from their arguments, a kind of stoic bravado that gives a false impression of being above it all. But in reality they’re not relying on facts so much as leaning into anger, contempt, and fear, which are the only acceptable emotions to show if you buy into toxic tropes of masculinity.

[This might help a bit](https://rationalwiki.org/wiki/LessWrong) if you want to figure out more about the Rationalist sphere. Shapiro isn't a Rationalist, btw (no idea who j peterman is). And some rationalists do argue that [you shouldn't just remove all your emotions](https://www.lesswrong.com/tag/emotions), even if in practice they often come off very much like budget Vulcans, especially some of the people who follow the movement.

What you said about toxic masculinity gets even worse when you come to the spinoffs like r/themotte (content warning). The main attraction of that sub is the culture war thread, btw; don't get confused by the few topics there, and you shouldn't miss the 2k+ comment weekly anger post stickied at the top. r/themotte isn't technically Rationalist, however; it is just a spinoff created by Scott Alexander so his fans could keep talking about the culture war. It is a good example of how ideas of 'centrism' and 'niceness' can be abused to argue for horrible things: people regularly get convinced there of race science, eugenics, how bad trans people and gay people are, etc. The empathy removal training center. The name is a reference to the motte/bailey fallacy (a useful fallacy, but they created it to attack the dastardly postmodernists, who were prob secretly post-structuralists, though nobody is keeping those terms straight except for huge nerds; somebody [more versed in phil](https://slatestarcodex.com/2014/11/03/all-in-all-another-brick-in-the-motte/) might want to check if my intuition was correct).

One of the problems with Rationalism can be characterized by the old joke: don't have such an open mind that your brain falls out. Because they declared bias super evil, they reexamined (and often have to redo this over and over again) all the ideas others have already dismissed (except the far-left ones, odd that). Which also led to their communities filling up with people who were fans of those ideas. To their credit, at least on lesswrong some attempts were made to kick out all the MRA/MGTOW/NRx people ([neo-reactionaries](https://rationalwiki.org/wiki/Neoreactionary_movement), a special type of tech monarchist which basically only exists in Rationalist-adjacent places and Dark Enlightenment places (which is basically NRx), not to be confused with reactionaries (the political leaning) or people who react to things). On slatestarcodex (lesswrong and slatestarcodex are the most popular hangouts of the Rationalist movement, I would say), however, they were welcome as long as they didn't overwhelm the discussion, a pattern slatestarcodex repeats often: white nationalists also get temp-banned from time to time when they are too aggressive, while Marxbro got a ban because he discovered an anti-Marxist was misquoting Marx. (Slatestarcodex/Scott Alexander is also a semi-neoreactionary, but he kept this secret for a long time; semi because he wasn't a fan of the monarchism and thinks they should focus more on the racism. Musk is a fan/reader of slatestarcodex, if you want a bit of an 'oh no' moment.)

Now you know more about the various types of the Rationalist sphere. And do note, I'm biased against these places, and I didn't mention all the bad shit they've done (a huge chapter could be written just about their anti-feminism alone, for example, or the weird thought experiments like [Roko's Basilisk](https://rationalwiki.org/wiki/Roko%27s_basilisk)). Welcome to this very weird subreddit.
J peterman is Elaine’s blowhard boss on Seinfeld, lol. I meant what’s-his-face, uh, Jordan Peterson; just making a funny with that one. Shapiro I always see as placed adjacent to the rationalist/debate-bro space, so he’s lumped in with the others in my mind.

My concept of philosophy honestly begins and ends with Black American cultural theorists and, more globally, postcolonial theorists such as Fanon. I understand that leaves significant gaps, but I feel like it’s the philosophy that is most practical for navigating the current zeitgeist. It also seems to be the philosophy most likely to draw a knee-jerk reaction from so-called rationalists, which you can read how you want, I suppose.

And I don’t mean to blow off your reading suggestions, because I do like to understand cultural currents and how they inform mainstream conversations, but I’m not sure I can actually make it through one of those people’s books, so I might have to skim some summaries instead, lol. Plus I feel like you don’t need to fully understand the space to see the humor in anyone who cites being a philosopher as a reason why you should buy their book. The economic motivation of a cushy media career clearly trumps all.
> J peterman

Ahaha, oh right. I was googling the guy, came across a real-life person with that name, and went 'wait, who is this guy?'. Shapiro and Peterson are more the intellectual dark web types; a different kind of nut, even if their styles can overlap a bit. (I was talking about Rationalists instead of rationalists intentionally, btw; capital-R is actually a specific group of people, and this place is about the capital R, even if most of us here prob also don't like the IDW, small-r types.) And I linked a few times to rationalwiki, so that should be easy enough to read. But I do get why you wouldn't want to read any of that; one problem with Rationalists is that they love to blog. And blog. And blog. All their stuff is so very long.

And you are right, you don't need to fully understand all of it. Considering slatestarcodex moved to Substack for an unknown amount of money, you are not wrong on the economic motivation part either (others, like Yud, get enough money from wealthy donors who believe that the robot god is coming to kill us and he is the only one who can prevent us from being turned into paperclips). Just don't be surprised if you sometimes get a bit of pushback for posting the wrong thing here (usually easily fixed by just posting it in a different place like r/enoughidwspam), or if a few references go way over your head.

E: From a certain perspective there is quite a bit of overlap between Rationalism and the IDW/Joe Rogan types, as they attract a certain type of person who loves to go 'this is why the mainstream is actually wrong about things'. (This obv doesn't apply to OP here, who was interested in the other side of Rationalism, which has eventually percolated down into the IDW types.)
Ah yes, I’m also vaguely aware of the intellectual dark web, but much like the real dark web, I’ve written it off as something that seems to exist mainly for people who are into cp and/or want to buy drugs but are afraid they might have to talk to black people if they do it the old fashioned way
Thank you for your service as a rationalism anthropologist.
Well, that is in a way what people are doing here: amateur anthropology (the results of which are, like all amateur science, bad of course; let's not pat ourselves on the back). Some here even lived among them (have lw accounts ;) ). Prob should stop explaining this long bullshit to every new person who drops in here, however.

> I have some mental health issues that (among other things) cause me to experience an extremely tiresome amount of rumination and intrusive thoughts. And as you can guess, rationalist ideas and rumination are not a good combination.

I have nothing to add other than that I’ve noticed rationalism seems to hook people with OCD-like tendencies; I say that as someone with the condition. Maybe the worry that rationalist-inspired thoughts produce will lessen as you work with your psychologist more.

Read Justin E. H. Smith’s book Irrationality. It might not be a cure, but it might help.

Sorry, can’t really help you here. I always found the whole claiming of secret knowledge and their ‘math’ a bit meh, and I was never convinced by the basilisk or simulation theory. The theory of infohazards has some merit imho, but probably only under my personal definitions, so it isn’t that generalizable, and what counts as an infohazard often varies from person to person (and the concept of infohazards is itself one, for some). For an example of something I think is a general infohazard: people should be careful not to link to far-right extremist literature without any comment, which sadly a lot of media does tend to do, and which is discouraged by anti-extremism researchers (another sidenote: I recall reading research long ago suggesting that to understand something you read, you have to believe it at some level, which can be dangerous). But this isn’t something the Rationalist sphere worries about; apart from the two anti-FAQs, linking to bad shit is done without a blink of an eye.

I do hope you manage to figure it out, however. I once heard about psychologists at universities having more experience with these kinds of intrusive thoughts arriving via logical/math routes; that might help.

What also might help is realizing that apart from funding from various billionaires (who support Rationalism and NRx for obvious reasons), fawning respect from contrarians (who love having their ‘ha, my contrarianism is right’ idea confirmed), a subset of tech workers looking to have their biases confirmed (**), and some science fiction fans, these people aren’t that important. They are also often just plain wrong (see how Yud simply accepts MWI as true because it is good for the other ideas he has) or misguided, and in general wasting a lot of their time (see Gwern’s site). Also, well, their shit simply doesn’t work (*), but that is more about the whole art of Rationality, which they seem to have mostly given up on (when is the last time you saw ‘epistemic status’? fuck, I talked about that more here than they do), and not the mindworms (if you don’t mind me calling them that) you are suffering from.

Hope this helps a little bit.

*: This quote jumped out at me:

> Rationality helps us choose which charities to donate to, and as Scott Alexander pointed out in 2009 it gives clarity of mind benefits.

As you can see, it doesn’t give clarity of mind at all (makes me think of the various self-help books which don’t actually help). The rest of the article is a massive, as the kids say, cope (we aren’t winning, but we could be if we did this one small trick, because we are already so good at things).

**: On the note of biases being confirmed/contrarianism: a while ago Scott Alexander was going all ‘payday loans are good actually’, which turned out to be based on flawed data from an ideological source. It isn’t relevant to your situation, but I just wanted to share this somewhere in sneerclub at least.

> I was never convinced by the basilisk

I'm always shocked that any serious people (or people who even think they're serious) are fooled by the basilisk. It feels like a parody of Pascal's Wager to me, and in that sense it's pretty effective: it succinctly shows that you can make an argument for any random nightmare being possible and claim that you ought to structure your life around it, because this random possibility is so scary that it's only rational to treat it with more weight than literally anything else.
Some years ago I was of the opinion that free information sharing and free speech/the marketplace of ideas works, because I'd look at stuff and go 'nah, nobody will ever believe that'. And I'm shocked at how often I'm proven wrong. (People prob also say this about a few of my opinions.) Nowadays I'm a lot more aware that no, people will randomly believe weird shit. See [also this news article](https://www.businessinsider.com/why-you-believe-everything-you-read-2011-1), which I don't know is true, but as it conforms to my bias here, I can't not believe it (and yes, that article is talking about a different sort of believing).
I totally get that journey wrt the marketplace of ideas. No matter how much I study or look at the world, I feel like I always have too much faith in people's baseline common sense. My girlfriend "oh honey"s me a lot when I assume that some dumb thing has to be a joke, because who on earth would believe it.
> It feels like a parody of Pascal's Wager to me

Oh, [they know](https://en.wikipedia.org/wiki/Pascal's_mugging)
I laughed out loud at this. Weird that in one of the later sections there's a bunch of arguments for rehabilitating naive mathematical reasoning for selecting behaviors, rather than just acknowledging that any "rational" set of decision-making rules can be trapped into making absurd choices, and that subjective discernment as a mediator is pretty unavoidable if you don't want nonsense like the wager or the mugging. It's not even worth salvaging if you're into working on AI, where the most powerful systems don't use hard decision rules and instead discern on the basis of complex reward functions, and the gold standard across disciplines is basically always still some kind of human-in-the-loop system.

Not sure what you need, but you might get some use from David Chapman’s (@meaningness) writings:

https://metarationality.com/rationalism

https://meaningness.com/collapse-of-rational-certainty

I’ve also assembled some of my own critical material: http://www.hyperphor.com/ammdi/pages/Rationalism.html

but honestly Chapman’s is much more systematic and probably better suited to a recovering rationalist trying to deprogram themselves.

No offence to yourself, but Chapman is still a gigantic fucking blowhard

The best way to do it is to read reputable history books and great literature and poetry. When you do that, you’ll feel something. After that point it’s smooth sailing. These people are not actually that smart; they’re just bigoted in a way that’s rewarded by a bigoted society.

Hey. Not sure how helpful it is, but I had a similar personal transition when leaving religious fundamentalism (before the “rationalist” movement picked up steam), and experienced many of the same difficulties in shifting away from the modes and habits of thinking that came with it.

Fundamentalism and internet rationalism both rely on “ideology laundering.” For the rationalists it’s Bayesian probability and for the fundamentalist christians it’s “the words of the Bible”: both use rhetorical card tricks to equate their assumptions and views with “math” or “the text of Scripture,” and frame others as arguing against that “neutral authority.” It’s critical to understand that you do not have to out-math someone to disagree with them or reject their premises because they conflict with your values or bring pain to your life. You might be wrong! But that’s not the end of the world. Trusting yourself is important; you may not be an authority on statistics but you’re an authority on what you value and believe and experience.

Also, if there are friends or trusted folks you can connect with in other communities (online or off) who can support you without being on any ‘side’ of these philosophical debates, it can be a huge help.

Jungian psychoanalysis. Jung dealt with exactly this.

This is a bit out of my wheelhouse, and I don’t want to offer you any bad advice, but I hope you feel better. The only thing I can suggest is keeping an open mind, trying to be humble, and discussing things as much as you can.

Check out subreddit overlap with that community:

https://subredditstats.com/subreddit-user-overlaps/slatestarcodex

They’re heavily biased and in their own little internet echo chamber, don’t put too much stock in their irrationality.

Bayes’ theorem, and probability theory in general, is meaningful only if you have a correct probabilistic model of the reality (or system) you want to model.

Watch this video about Bertrand’s paradox: https://www.youtube.com/watch?v=mZBwsm6B280. Depending on small details of the probabilistic model, one can get very different results. And that is in a pure math context, where things are deterministic and perfectly measurable.
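If you’d rather run it than watch it, here’s a small Monte Carlo sketch of the paradox (my own toy code, not from the video). Three perfectly reasonable-sounding ways to pick a “random chord” of a unit circle give three different answers to the same question:

```python
# Bertrand's paradox: P(random chord is longer than the side of the
# inscribed equilateral triangle, sqrt(3)) under three sampling schemes.
import math
import random

N = 100_000
SIDE = math.sqrt(3)  # side length of the inscribed equilateral triangle

def chord_from_endpoints():
    # Method 1: two uniform random endpoints on the circle.
    a, b = random.uniform(0, 2 * math.pi), random.uniform(0, 2 * math.pi)
    return 2 * abs(math.sin((a - b) / 2))

def chord_from_radius():
    # Method 2: midpoint chosen uniformly along a random radius.
    d = random.uniform(0, 1)  # distance of the midpoint from the center
    return 2 * math.sqrt(1 - d * d)

def chord_from_midpoint():
    # Method 3: midpoint chosen uniformly inside the disk (rejection sampling).
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return 2 * math.sqrt(1 - (x * x + y * y))

for name, sample in [("endpoints", chord_from_endpoints),
                     ("radius   ", chord_from_radius),
                     ("midpoint ", chord_from_midpoint)]:
    p = sum(sample() > SIDE for _ in range(N)) / N
    print(f"{name}: P(long chord) ~ {p:.3f}")
# Typical output: ~0.333, ~0.500, ~0.250 -- same question, three answers.
```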

It is nearly impossible to make a reasonably correct probabilistic model of anything where humans or human decisions are involved. So if someone claims to use Bayes’ theorem in their decision making, they’re lying ¯\_(ツ)_/¯