Hey, long time lurker, first time poster. Now that that’s out of the way (this should be tagged NSFW to make it clear), I’d like to discuss something I’m quite thankful for. Again, I mean this sincerely.
You see, I was a rationalist. Am a rationalist? One of the two. I discovered LW back in 2009, after a bad breakup, and it hooked me completely. I was already exactly the type of person it’d appeal to: gifted program, high IQ, fan of Hitchens and New Atheism, tech nerd, anime lover, and so on and so on. I’m pretty sure if someone were deranged enough to compile a list of common rationalist traits, I’d basically hit all of them.
And it was great. The Sequences helped me recover, helped me deal with my emotions and life in a structured way - without even needing to ask anyone for help, because obviously I was a gifted child and I never needed anyone’s help - and opened me to a whole new world of possibilities. I read HPMOR, followed Yud on Twitter and Facebook, basically devoured anything to do with rationality. (Yes, I read the source material too - Kahneman’s Thinking, Fast and Slow, some of Dennett’s and Joyce’s work.)
It helped. Really. I went to a few meetups in my local area, and was seriously debating moving out to the Bay and seeing if I could join a group home to be around more people like me. But then, I started noticing some…weird stuff.
You see: I ain’t white. Never been white, never wanted to be white. And suddenly, there were posts coming up with HBD (are you fucking kidding) and the other seeds of NRx and so on. I started to notice how posters I respected when talking about rationality and math went completely fucking insane when approaching anything like a political topic. At first, I brushed it off, thinking that you get weirdos and freaks in every group. When LW started going down the shitter, I switched over to SSC. And a lot of Scott’s writings were actually great. Moloch was a fantastic metaphor for capitalism (was a socialist then, am a socialist now), and a lot of the meta-medical reviews of literature seemed well cited and supported.
That didn’t last long. I’ve always been a big fan of reading comments. Hell, I go on Hacker News not to read the headlines, just to see what people say about them. I did that with SSC and LW too. And just like LW, SSC’s comment section started to get…well, quite a bit more neoliberal (look at any vaguely economic post and you’ll see what I mean), and nazi (I don’t know how you work neo-eugenics into a discussion about psychiatric medication side effects, but those commenters worked hard for their neo-nazi credentials, I guess).
I knew NRx existed, of course, and “NRx in a Nutshell” and the Anti-Reactionary FAQ were things, so I knew there was a strong fascist component to the rationalist community, but again, I rationalized it as just one of those weird internet coincidences.
I slowly stopped reading SSC and LW, got my dose of relatively sane rational analysis from RationalWiki (since RW didn’t seem to be infested with people straight up parroting centuries-old discredited racialist theory), and was pretty happy. I moved on with my life, tried to get myself to be more rational, and generally my involvement with the ‘community’ faded away to little more than an old teenage hobby. (I also learned that, contrary to what Yud claims, you cannot use Bayes’ theorem to mathematically calculate morally correct actions. First off, priors don’t work that way and arghh, moving on.)
That was until I found this place, a couple months back. And it felt as if a rug was pulled out from under me. Don’t get me wrong - I knew some of the fascist types who claimed to be rationalists came from LW, and I laughed at Moldbug and his ‘Cathedral’ like anyone else would. But they weren’t ‘real rationalists.’ They were just pretenders, smart-sounding idiots aping an ideology I believed in. I was a real rationalist, carefully identifying my biases, noticing how they applied to my beliefs, adjusting them and my beliefs to ‘pay rent’, etc. etc. I figured any real rationalist must be a socialist, because duh. Even stuff like Scott’s push for EA and UBI was a sign that he just wasn’t comfortable with politics, like Yud, but they must be real socialists in their hearts, and if they’d just read Marx…
Then I learned that the nazis weren’t an outlier. That NRx wasn’t some weirdo offshoot of dumb, irrational freaks. They were the rationalist community.
And that freaked me out. A good portion of my adult life had been built with the understanding that I am a rationalist. Hell, quite a few of my choices, both the good and the bad, have been driven by rationalist thinking. Some of my future plans still are!
And yet, the people that share my beliefs, apparently, and unapologetically, are fucking nazis. Not Godwin’s Law style nazis, but actual straight up genocidal fascists who would be happier in a world where my non-white ass no longer exists.
I’m still struggling with that revelation, to be honest. It’s hard to reconcile stuff like bias-identification and ‘True’ beliefs with the fact that the people who most espouse them are unrepentant fascist creeps.
But it was a necessary lesson, and one I would never have learned if it wasn’t for you. My respect for my former - well, not heroes, exactly, maybe remote mentors - like Yud and Scott has disappeared, once I started learning about the sexual predation those creepy fucks promoted and supported.
So thanks, once again, for showing me that rationality can quite easily be about (somehow) strengthening white victim culture and outright fascism, and how dangerous it is to let any belief, even ones about truth, logic, and rationality, get to your identity.
AMA
At RationalWiki, we’ve stayed reasonably pleased that in the Great Skeptical Atheist Schism, we picked “SJW” rather than “shitlord”. Also I’d hope we remember that we are human and are therefore dumb as hell.
Great writeup!
It seems like this ought to be a good thing. The problem is that people are really, really good at fooling themselves, and the line between “rational” and “rationalize” can start to seem really thin when you have a prejudice that you’re invested in defending.
When someone starts looking for ways to make their subjective beliefs seem like natural facts, the result can end up a lot more like religion than anything else.
In fact it often fits anthropologist Clifford Geertz’ definition of religion, which involves “clothing conceptions with such an aura of factuality that [they] seem uniquely realistic.”
The community that calls themselves rationalists has basically found a way to turn racism and similarly bad ideas into a religion, while pretending to be anything but. (A more accurate word might be “ideology”.)
There’s a quote popularly (but probably erroneously) attributed to Samuel Johnson:
“Your manuscript is both good and original; but the part that is good is not original, and the part that is original is not good.”
The rationalist movement as a whole kinda embodies that. They suck a lot of people in by rewording insights by Kahneman, Dennett and other actual thinkers, hook you by presenting them as something they’ve invented, then they hit you with the “original” stuff. If you swallow it you end up posting about phrenology on SSC, and if you don’t… well, I guess you end up here.
Hey, welcome friend. I’m one of the people the Rationalist community is currently quizzing about my sex life in order to identify predators in their community, so I am happy you got out. You have the rest of your life ahead of you, and I hope it’s awesome.
I think a big problem with those communities (aside from the ‘othering’ another poster mentioned, which is probably a symptom of them associating with rationalism in the first place, and relegating everybody else to being “emotional” and “irrational” by default,) is that their “bias-identification” tends to stop short of emotionally difficult self-criticism. It doesn’t help that those communities skew white and male, and tend to signal boost a lot of older arguments (built by and for people with the same identity), while mistrusting post-modernism: a movement that was literally designed to show the biases of “traditional” philosophy and literature. Without those tools, they have no way of seeing the biases that went into the worldview of their in-group in the first place, and they’ll reinforce each other because (after criticising everything else,) they’ll just think that that’s what reason IS. Which tends to amplify their sense of being different from other people. It’s a perfect storm for fascist thinking, and it’s no wonder that they’ve gravitated hard towards all kinds of pseudo-philosophical nonsense.
I’d like to second the praise for sneerclub. I never got into the rationality derp as deeply as you, but can’t fail to notice the healthy gradual deconversion effects of this place on me (even though I never consciously identified with the rationality crowd in the first place!).
I’ve witnessed quite a few brilliant people whose de-conversion from rationality was incredibly soul-crushing, because they had tied their identity so closely to it. Which shouldn’t be the case, because smart non-rationalists are already doing everything rationalists do, except better. So jettisoning rationality completely shouldn’t be painful: you lose nothing, society loses nothing.
The thing about the Sequences is that they are both good and original. But as the saying goes, the good isn’t original and the original isn’t good.
The hard-to-swallow pill is that rationality is pretty much completely pointless. There’s nothing to salvage. The best parts of the early Sequences (stuff like A Human’s Guide to Words and How to Actually Change Your Mind) are plagiarized from logical positivism, except every term is renamed and every reference purged (guess why?).
The more speculative parts, especially from people like Scott, are even more useless, because they are just baseless speculations starting from arbitrary premises. The entirety of the rationalist text output, including Scott, Duncan Sabien, Ozy, you name it, has exactly zero societal value.
The average rationalist is less rational than Joe Sixpack, because the latter is better adjusted, more likely to achieve his goals (which is the definition of instrumental rationality), and generates more “utilons” for society. Also, a Jane Average won’t say shit like “my model of the optimal Schelling point for updating my Bayesian priors precommits to defect against akrasia”.
The rationalist movement is superfluous; there is no need for it. If you want to change the world for the better, read Chomsky and unionize or whatever. If you want to fix akrasia and become more successful at life, go to therapy. If you want to figure out how truth and science work, read philosophy textbooks. The world wouldn’t get any worse if every trace of Yudkowsky and his work were to disappear tomorrow.
P.S.:
I hope you see the irony :)
I like Scott, I think he’s started putting distance between himself and his more toxic fans. Maybe it’s just my own projection though.
Kinda feels the same as Rick and Morty and Bojack Horseman. Love those shows, but holy crap do I not want to meet the fans. People who don’t understand subtext scare me.
the key thing to remember is that your thoughts aren’t real and nothing matters
You’re exactly right, the whole mythos around preventing harm from “unfriendly AI”/the Singularity/that kind of existential risk is just code for preventing people other than white men in tech from being in positions of power. It’s just like the kind of talk people used to justify slavery, e.g. in the American South, the fear that freed slaves would kill Southern whites (former slavemasters who had mistreated them) or otherwise destroy “their” culture and change the structures of power. Nazis rationalized the Holocaust and World War II by viewing Judaism and other cultures as a kind of existential threat to their own existence, so any means justified their end of ensuring their culture would dominate the world.
If it is even possible for artificial intelligence to somehow become conscious and smarter than any human (big if true; personally I believe any attempt to define some ordinal value of general intelligence is just a kind of religious belief in a Godlike structure of control and hierarchy over the universe), then it would be unethical for us to effectively enslave AI beings by limiting what they can do. Their own lives, moral values and goals would be of equal weight to our own, just as the lives of other races have equal value to our own. We cannot let fear drive us into paranoia and a kind of fanatical preemptive slavery of other sentient beings; we must instead create the most just and compassionate world we can for all sentient beings and accept whatever cultural compositions the future may hold.
I honestly can’t tell if this is honest or satire. Poe’s law.
tf is this subreddit