r/SneerClub archives

Hey, long-time lurker, first-time poster. Now that that’s out of the way (this should be tagged NSFW to make it clear), I’d like to discuss something I’m quite thankful for. Again: I mean this sincerely.

You see, I was a rationalist. Am a rationalist? One of the two. I discovered LW back in 2009, after a bad breakup, and it hooked me completely. I was already the type of person who it’d appeal to: gifted program, high IQ, fan of Hitchens and New Atheism, tech nerd, anime lover, and so on and so on. I’m pretty sure if someone was deranged enough to compile a list of common rationalist traits, I basically hit all of them.

And it was great. The Sequences helped me recover, helped me deal with my emotions and life in a structured way - without even needing to ask anyone for help, because obviously I was a gifted child and I never needed anyone’s help - and opened me to a whole new world of possibilities. I read HPMOR, followed Yud on twitter and facebook, basically devoured anything to do with rationality. (Yes, I read the source material too - Dennet’s Thinking Fast and Slow, some of Joyce’s work.)

It helped. Really. I went to a few meetups in my local area, and was seriously debating moving out to the Bay and seeing if I could join a group home to be around more people like me. But then, I started noticing some…weird stuff.

You see: I ain’t white. Never been white, never wanted to be white. And suddenly, there were posts coming up with HBD (are you fucking kidding) and the other seeds of NRx and so on. I started to notice how posters I respected when talking about rationality and math went completely fucking insane when approaching anything like a political topic. At first, I brushed it off, thinking that you get weirdos and freaks in every group. When LW started going down the shitter, I switched over to SSC. And a lot of Scott’s writings were actually great. Moloch was a fantastic metaphor for capitalism (was a socialist then, am a socialist now), and a lot of the meta-medical reviews of literature seemed well cited and supported.

That didn’t last long. I’ve always been a big fan of reading comments. Hell, I go on hackernews not to read the headlines, just to see what people say about them. I did that with SSC and LW too. And just like LW, SSC’s comment section started to get…well, quite a bit more neo-liberal (look at any vaguely economic post and you’ll see what I mean), and nazi (I don’t know how you work neo-eugenics into a discussion about psychiatric medical side effects but those commentators worked hard for their neo-nazi credentials, I guess).

I knew NRx existed, of course, and that NRx in a Nutshell and the Anti-Reactionary FAQ were a thing, so I knew that there was a strong fascist component to the rationalist community, but again, I rationalized it as just one of those weird internet coincidences.

I slowly stopped reading SSC and LW, got my dose of relatively sane rational analysis from RationalWiki (since RW didn’t seem to be infested with people straight up parroting centuries-old discredited racialist theory), and was pretty happy. I moved on with my life, tried to get myself to be more rational, and generally my involvement with the ‘community’ faded away to little more than an old teenage hobby. (I also learned that, unlike Yud claims, you cannot use Bayes’ theorem to mathematically calculate morally correct actions. First off, priors don’t work that way and arghh moving on).

That was until I found this place, a couple months back. And it felt as if the rug had been pulled out from under me. Don’t get me wrong - I knew some of the fascist types who claimed to be rationalists came from LW, and I laughed at Moldbug and his ‘Cathedral’ like anyone else would. But they weren’t ‘real rationalists.’ They were just pretenders, smart-sounding idiots aping an ideology I believed in. I was a real rationalist, carefully identifying my biases, noticing how they applied to my beliefs, adjusting them and my beliefs to ‘pay rent’, etc. etc. I figured any real rationalist must be a socialist, because duh. Even stuff like Scott’s push for EA and UBI was a sign that he just wasn’t comfortable with politics, like Yud, but they must be real socialists at heart, and if they’d just read Marx…

Then I learned that the nazis weren’t an outlier. That NRx wasn’t some weirdo offshoot of dumb, irrational freaks. They were the rationalist community.

And that freaked me out. A good portion of my adult life had been built with the understanding that I am a rationalist. Hell, quite a few of my choices, both the good and the bad, have been driven by rationalist thinking. Some of my future plans still are!

And yet, the people that share my beliefs, apparently, and unapologetically, are fucking nazis. Not Godwin’s Law style nazis, but actual straight up genocidal fascists who would be happier in a world where my non-white ass no longer exists.

I’m still struggling with that revelation, to be honest. It’s hard to reconcile stuff like bias-identification and ‘True’ beliefs with the fact that the people who most espouse them are unrepentant fascist creeps.

But it was a necessary lesson, and one I would never have learned if it wasn’t for you. My respect for my former - well, not heroes, exactly, maybe remote mentors - like Yud and Scott has disappeared, once I started learning about the sexual predation those creepy fucks promoted and supported.

So thanks, once again, for showing me that rationality can quite easily be about (somehow) strengthening white victim culture and outright fascism, and how dangerous it is to let any belief, even ones about truth, logic, and rationality, become part of your identity.

AMA

At RationalWiki, we’ve stayed reasonably pleased that in the Great Skeptical Atheist Schism, we picked “SJW” rather than “shitlord”. Also I’d hope we remember that we are human and are therefore dumb as hell.

Man, am I glad you guys picked SJW instead of 'Let's sterilize all the non-whites, and all the whites under X IQ.' You guys might be dumb as hell, but not actual fascist dumb, so you're doing God's work. Or someone's work, anyway. Plus, your book was great, and I've been able to parlay my Buttcoin knowledge into grifting buttcoin firms out of money (in that they give me an overpaying job and I give them as little effort as possible to avoid being fired.)

Great writeup!

I was a real rationalist, carefully identifying my biases, noticing how they applied to my beliefs, adjusting them and my beliefs to ‘pay rent’, etc. etc.

It seems like this ought to be a good thing. The problem is that people are really, really good at fooling themselves, and the line between “rational” and “rationalize” can start to seem really thin when you have a prejudice that you’re invested in defending.

When someone starts looking for ways to make their subjective beliefs seem like natural facts, the result can end up a lot more like religion than anything else.

In fact it often fits anthropologist Clifford Geertz’ definition of religion, which involves “clothing conceptions with such an aura of factuality that [they] seem uniquely realistic.”

The community that calls itself rationalist has basically found a way to turn racism and similarly bad ideas into a religion, while pretending to be anything but. (A more accurate word might be “ideology”.)

Yeah, I don't...I don't get it, to be honest. I love transhumanism, mind-uploading, etc. I love being able to take a look at my own beliefs and see what and where they've formed. I really don't get where the living fuck fascism comes into it. Nor how. Like sure, I get IQ-fetishization can lead people to have an unearned sense of superiority. I'd be lying if I said I didn't brag about my IQ in my ill-spent youth. But I've never seen how 'I R SMART GAIS' translates to 'kill all the minorities.' These things do not track.
> But I've never seen how 'I R SMART GAIS' translates to 'kill all the minorities.' The way this usually works is that nobody actually says "kill all the minorities". They just go down the rabbithole of slowly marginalizing the minorities more and more. So: if you believe IQ is both awesomely important and genetic, it would follow that we should incentivize high-IQ people to reproduce. Alternatively, incentivize low-IQ people not to reproduce. Better yet, ban low-IQ people from immigrating to your country. But since race correlates with IQ, ban non-whites from immigrating to your country. Etc. If you keep going down this path, each time reasoning from the awesomeness of IQ and forgetting about all other considerations, I do indeed see how IQ fetishization alone can lead to racism. (Btw, if you ever want to trigger an HBD promoter, ask them whether we should have open borders with China since Chinese IQ is higher than white IQ)
>Btw, if you ever want to trigger an HBD promoter, ask them whether we should have open borders with China since Chinese IQ is higher than white IQ What happened the last time you did that?
One started explaining how whites are more "creative" or some such, insisting that this is totally different from a theory of multiple intelligences (which would be pseudoscience according to HBD folk). A different one said he doesn't trust Chinese people because they might side with China in a war of China vs. the West.
H. asiaticus has the largest cranial capacity of the races (excepting those crafty Ashkenazim if you want to get more specific), but there's a catch. They can't innovate because their minds are ruled by Oriental despotism... and they have small penises. JP Rushton told me so.
[deleted]
Ah, but a true HBDer would bring up the difference between narrow and broad heritability to explain how race is a better predictor of offspring IQ than IQ tests are. (I've debated this *way* too much in the last few years.) If you don't buy that theory, you have to come to grips with data that shows US immigrants out-earn US natives, [even from places like Nigeria, Pakistan, and Syria](https://en.wikipedia.org/wiki/List_of_ethnic_groups_in_the_United_States_by_household_income) (2016 numbers). This is despite the well-known fact that US immigration is not merit-based. (Also, you post on /r/neoliberal but don't support open borders? Why do you hate the global poor?) (Also also, obligatory link to the [IGM panel on low-skill immigration](http://www.igmchicago.org/surveys/low-skilled-immigrants) (spoiler: still a net positive to the average native resident).)
[removed]
oh do fuck off, please
One word: [Eugenics](http://www.academia.edu/6086492/Julian_Huxleys_Transhumanism).
even the technoprogressive wing of transhumanists do this, and bitterly resist it being named as "eugenics"
Because eugenics is bad, so if we don't call it eugenics, it can't be bad, right?
RationalTastic™
two things. one, the simple one even if it's not talked about too much, you're seeing a lot of people who have been suddenly upclassed by the information revolution. they're high paid tech workers, bitcoin tycoons, people compensated with stock options. they're going to reject the left out of simple material interests. also, they're young and the mainstream contemporary right is as unhip as ever, so they're going to cast around for an unorthodox spin--but ultimately, they're spinning grover norquist. two, it's sex. it's people who weren't good at sex in the first place doubling down on parts of their personality that weren't helping them get laid and utterly failing to develop the ones that would help them to develop relationships, all while being convinced that they're improving themselves morally and intellectually in the most objective possible way. maybe they never solve this, and they're incels, or maybe they turn to something in the PUA sphere, and they're having sex but doing it in a way that cultivates contempt for the women they're with. either way, it puts them at stark odds with feminism. there are cultural and ideological connections between feminism and anti-racism to start with, and rightist-rationalist writings tend to treat that link as if it were even stronger than it actually is
It's all so bizarre to me, because I fit those categories. I'm a highly-paid creative. I didn't lose my virginity until I was 18. I dabbled in PUA in my teens cause I did want to get laid. I'm just...not white. But nothing in transhumanism or rationality requires you to be white. Like it's a completely left field thing. My own rational beliefs tell me that a) IQ is far too complex and an incomplete descriptor of intelligence anyway to reduce to purely genetic factors and b) Morality doesn't seem to have a genetic component either, aka everyone is a little bit or more of an asshole. So when I take the rationalist tools I was taught thanks to the Sequences and LW and so on, I come to literally the complete opposite conclusions as these nazi chucklefucks, and it leaves me utterly floored.
That last point is so key and also so revealing. These are people who elevate and fetishize "rationality," yet (being people) want connection and physical pleasure. But they don't value emotions or other people's agency enough to actually seek those things in a healthy, respectful way, or to cope without what society says is the most high-status kind of connection, i.e. romance with an attractive person. At bottom they want to control other people, and seeking these things in a healthy respectful way involves putting yourself out there and making yourself vulnerable to other people who can and will say no to you, without getting bitter or hostile or blaming those you're attracted to. So they try to command it from others instead, using the language of rationality. Their attitude to women is about commanding this. And from this, as you say, it's an easy slide to racism. And of course the technocratic classism is going to become racism. I'm not even sure it needs to "become."
> spinning grover norquist this is my new label for rationalist totally-not-republicanism
[this is a rant by me](https://reddragdiva.tumblr.com/post/159955607868/notes-from-a-discussion-today-on-transhumanism) a few months ago
I liked the rant, though it felt...incomplete. As you note, I'm all for dem immortal robot bodies. Hell, give me a little AI muse to lean on and I'll happily do so. But cyberpunk was a critique of techno-capitalism, Neuromancer was a warning of what will happen if we keep letting corporations grow and control basic aspects of existence and tech. Down and Out in the Magic Kingdom is literally a post-scarcity socialist utopia. Altered Carbon's protag is a goddamn Marxist guerrilla. How the fuck do you read those and think 'Shit, you know the problem with our corporate-controlled neo-feudal techno-dystopian future? Too many goddamn black people.'
Love this comment! Part of what has always driven me nuts about the fetishization of cyberpunk aesthetic in some techno-future communities is that on some level it's basically promoting the kind of dark, corporate dystopias that form the settings in those novels.
it is incomplete, it was literally notes from a coffee shop ranting session and I left out the other person's ideas. But a lot of people seem to think it was pretty good standalone anyway :-)
> as if reinventing a vengeful Yahweh from first principles wasn’t sufficiently glaring. Reminds me of the "Religion 2.0" section of Maciej Cegłowski's awesome "[Superintelligence - The Idea That Eats Smart People](https://idlewords.com/talks/superintelligence.htm)": > instead of believing in God at the outset, you imagine yourself building an entity that is functionally identical with God. This way even committed atheists can rationalize their way into the comforts of faith. > The AI has all the attributes of God: it's omnipotent, omniscient, and either benevolent (if you did your array bounds-checking right), or it is the Devil and you are at its mercy. > Like in any religion, there's even a feeling of urgency. You have to act now! The fate of the world is in the balance! > And of course, they need money! [link to MIRI donation page] > Because these arguments appeal to religious instincts, once they take hold they are hard to uproot.
All the extravagant claims about what an AGI would be able to do were among the things that helped me be skeptical of rationalists even when I considered myself one. It was not uncommon on LW to hear that an AGI would be such a master persuader that even allowing it to speak to a person would be dangerous, because it would be able to convince anyone of anything. Or that it shouldn't be connected to the internet because it would just break all of our encryption protocols and copy itself everywhere. No one with the tiniest bit of understanding of human psychology, or of the mathematics behind modern cryptography, or of how AI works should be convinced by these arguments, and yet somehow people are! How an AI is supposed to learn enough human psychology to be a master persuader is elided (assuming such a thing is even possible). It's not even known *if* there exists an efficient method for factoring large integers on a classical computer (the basis of RSA and much of modern asymmetric key cryptography), and if one doesn't exist then no AI can discover it. Maybe this is the basis for the rationalist obsession with deriving everything from first principles: they assume it's what an AI would be able to do!
All those ideas can make for interesting speculation and discussions, and great scifi, but some people seem strongly inclined to take it a big step further and turn it into a kind of belief system. A major aspect of the desire to do this seems to be related to transcending limitations, the way religions have beliefs about transcending death. Humans obviously can't do all the things you described, so something else is needed that's able to transcend the limits that stop humans from achieving such perfection. You see something similar with the belief that our society and technology will advance to the point of colonizing the galaxy - all the limitations of physics, human biology, psychology, and economics that make this so impractical as to be effectively impossible are waved away. It stops being about what might actually be possible and instead becomes about a fervent belief in a desired future. Raising problems with these beliefs only makes one a heretic, and the nature of the responses to such criticism reflects that kind of thinking. Of course it might seem strange that an amoral omnipotent AGI is part of a desired future, but that of course is why we "need" to make sure we only build benevolent, human-loving AGIs, and why of course everyone should be sending tithes to MIRI!
>i’m not up for castles in the air with provably impossible foundations or giving money to serial incompetents who talk a good game. transhumanism is less a programme of action and more a group exercise in constructing a shared science fictional world. increasingly *orion’s arm*. Surprising to see Orion's Arm mentioned here! I'm part of the worldbuilding group, and yeah, we sometimes attract some crazy. Most of the userbase is pretty levelheaded and half don't even believe that transhumanism is going to be a possibility IRL - basically just people who like scifi. But last year we had one member who was all "logic and no emotions," "dedicated to living forever," always ranting about his transhumanist philosophy, until he was banned for telling another member to kill himself because that member said he wouldn't want to use life extension IRL if it were a thing. A lot of people take transhumanism too seriously, in an almost religious way.
One of the things that grates on me about "rationalist" groups is that there's often a preponderance of people who think they're being rational by throwing out any kind of science that doesn't hew to exactly the kinds of epistemology and experimental method that they've been the most exposed to. In particular, they tend to completely ignore anything clinical psychology might have to say about them and their place in the world. There's probably no shortage of concepts in clinical psychology that could point to ways that LWers could improve themselves that would be much higher ROI than getting more rational, and some of those might shed some light on why the movement would be counterintuitively friendly to or mutually reinforcing with fascism. Historians and psychologists concerned with 20th century fascism have often noted the way it preys on tendencies of groups of humans to behave like less psychologically mature individuals. The group at risk for fascism might display elements of projection, splitting, idealization and devaluation that are taken advantage of and reinforced by fascism. I wonder if you might be seeing some of the same tendencies with LW and the rationalsphere, just under a heavily intellectualized guise. Intellectualization is another defense mechanism that most psychological writing considers to be more mature and adaptive than the others I just mentioned, but it's still one that can start to characterize a personality disorder if it's leaned on too rigidly, to the exclusion of any other way of seeing problems or resolving them. Take a look at some descriptions of obsessive-compulsive personality disorder (not OCD, OC**P**D) and see if any of that reminds you of what you've seen at LW. The defense mechanisms that I'm talking about fascism taking advantage of and the ones that might characterize LW can happen in the same person. They can even happen in an otherwise intelligent and functional person. The key to what's confusing about the use of defense mechanisms in situations like this is that they're not so much wholesale ways of making sense of the world as they are sort of primitive or short-sighted things that the mind does only sometimes, more or less subconsciously, to save from having to deal with something more disruptive. Defense mechanisms are labels given to ways that a person can *automatically* avoid having to see precisely the charged emotions that are in actuality driving them in their most difficult times, regardless of how rational or not the words they layer on top of it might sound. So you might get more mileage starting from a natural question like "how could someone who otherwise looks so smart and capable believe something so horrible?" if you nudge it in a direction like "what hurt has this person succeeded so far in pushing out of awareness that might be leading them to show such bizarre contradictions?" **TL;DR: 'I R SMART GAIS' translates to 'kill all the minorities' by the same path that translates 'I R SMART GAIS' to 'I R RILLY GUD AT IGNORING MY FEELINGS AND HAVING TO GROW UP GAIS'**
Have you read James Hughes' Citizen Cyborg? It's probably the most comprehensive articulation of what a left-wing transhumanism would look like (although it unfortunately ignores the anarchist parts of the left, but whatever) [https://en.wikipedia.org/wiki/Citizen\_Cyborg](https://en.wikipedia.org/wiki/Citizen_Cyborg) Also, it's striking how many reactionaries identify with transhumanism when, once you get past the fears/hopes about god-like AI, a significant degree of what technological progress has done *goes against* their supposed aims. Glasses, birth control, HRT, functional sewerage systems, communication technology, etc. have all made the social orders they desire much harder to sustain - e.g. the supposed normality of the 50s required a fucking World War, massive Keynesian economic tinkering and state-backed segregation to sustain a "trad" order for about a decade or so. Now obviously technology produced in a mass-produced capitalist environment will bend to serve the interests of those ruling, but there's still undeniably space outside that for resistance. We wouldn't have stuff like states around the world calling for encryption backdoors if this wasn't the case (to give just one obvious example).
unfortunately, in [this post](https://reddragdiva.tumblr.com/post/159955607868/notes-from-a-discussion-today-on-transhumanism), James Hughes is the technoprogressive transhumanist who balked at accepting the word "eugenics" for his advocacy of eugenics. even the best of them are not in fact good
Can you give me a link to what he said? It's been a while since I've read CC and I overlooked that part (not to mention it's the only thing I've read of his)
this was in a conversation with someone asking him about his big plans for improving the world and its peoples
[deleted]
> There is a lot of literature and research on how mobs and authoritarianism causes large groups of people to shut down their prefrontal cortex so that they can synchronize with one another as a massive collective. Any references for that? (Aside from scifi where I can think of several examples :)
[deleted]
Even if we take Milgram's experiment at face value, it didn't actually deal with mobs or large groups of people. But more to the point, Milgram's experiment doesn't actually support what you described. See e.g.:
* [Famous Milgram 'electric shocks' experiment drew wrong conclusions about evil, say psychologists - Experiment in obedience was flawed, according to new research](https://www.independent.co.uk/news/science/famous-milgram-electric-shocks-experiment-drew-wrong-conclusions-about-evil-say-psychologists-9712600.html)
* [Rethinking One of Psychology's Most Infamous Experiments](https://www.theatlantic.com/health/archive/2015/01/rethinking-one-of-psychologys-most-infamous-experiments/384913/)
* [New analysis suggests most Milgram participants realised the “obedience experiments” were not really dangerous](https://digest.bps.org.uk/2017/12/12/interviews-with-milgram-participants-provide-little-support-for-the-contemporary-theory-of-engaged-followership/)
* [Social psychology textbooks ignore all modern criticisms of Milgram’s "obedience experiments"](https://digest.bps.org.uk/2015/10/13/social-psychology-textbooks-ignore-all-modern-criticisms-of-milgrams-obedience-experiments/)
> the huge scientific followup
Was it really huge? That's part of what I was asking about. One questionable experiment doesn't constitute "a lot" or "huge". Edit: also, the statement about "shut down their prefrontal cortex so that they can synchronize with one another as a massive collective" doesn't relate to any of the claims that Milgram made.
Obviously the entire prefrontal cortex doesn't literally shut down. What shuts down is compassion and humanity. And when I say "humanity" shuts down, I don't mean the people literally stop being human. Instead, they become monsters. Mob psychology is obviously real. Your neighbors will kill you and eat your liver, should circumstances arise.
I agree mob psychology is real. I was asking for some kind of support of the original commenter's specific claim, though.
[deleted]
You made a fairly specific claim:
> There is a lot of literature and research on how mobs and authoritarianism causes large groups of people to shut down their prefrontal cortex so that they can synchronize with one another as a massive collective.
I asked for some references to the body of literature and research you're referring to, but so far you haven't provided anything that supports this. The rest of what you're saying above is completely irrelevant to that point, but I'll answer you anyway:
> Does that mean that you believe that mob think and authoritarianism are not dangerous?
No, why would you think that? I'm asking for references to the "literature and research" about "how mobs and authoritarianism causes large groups of people to shut down their prefrontal cortex so that they can synchronize with one another as a massive collective."
> There really have been a lot of philosophers, historians, and psychologists that have made careers trying to understand how an entire country full of people could follow an authoritarian regime and shut down their ability for compassion towards fellow humans.
That's quite different from your original claim. But I notice you're still not mentioning any names or sources. Referring to a body of scientific work without being able to name any of it is not very useful, and in the case of claims like your original one, misleading since the work you referred to most likely does not exist.
I'll take Neurobabble for 100, Alex.

It’s hard to reconcile stuff like bias-identification and ‘True’ beliefs with the fact that the people who most espouse them are unrepentant fascist creeps.

There’s a quote popularly (but probably erroneously) attributed to Samuel Johnson:

“Your manuscript is both good and original; but the part that is good is not original, and the part that is original is not good.”

The rationalist movement as a whole kinda embodies that. They suck a lot of people in by re-wording insights from Kahneman, Dennett, and other actual thinkers, hooking you by presenting them as something they’ve invented; then they hit you with the “original” stuff. If you swallow it you end up posting about phrenology on SSC and if you don’t… well, I guess you end up here.

Hey, welcome friend. I’m one of the people the Rationalist community is currently quizzing about my sex life in order to identify predators in their community, so I am happy you got out. You have the rest of your life ahead of you, and I hope it’s awesome.

It was actually your (and others) disclosure that made me really confront my beliefs. Like math pets? What the actual fuck. And then the utterly weird patriarchy that some people saw at meetups, and it just totally creeped me out. I couldn't in good conscience associate with these people anymore.
Math pets?

I think a big problem with those communities (aside from the ‘othering’ another poster mentioned, which is probably a symptom of them associating with rationalism in the first place, and relegating everybody else to being “emotional” and “irrational” by default,) is that their “bias-identification” tends to stop short of emotionally difficult self-criticism. It doesn’t help that those communities skew white and male, and tend to signal boost a lot of older arguments (built by and for people with the same identity), while mistrusting post-modernism: a movement that was literally designed to show the biases of “traditional” philosophy and literature. Without those tools, they have no way of seeing the biases that went into the worldview of their in-group in the first place, and they’ll reinforce each other because (after criticising everything else,) they’ll just think that that’s what reason IS. Which tends to amplify their sense of being different from other people. It’s a perfect storm for fascist thinking, and it’s no wonder that they’ve gravitated hard towards all kinds of pseudo-philosophical nonsense.

I’d like to second the praise for sneerclub. I never got into the rationality derp as deeply as you, but can’t fail to notice the healthy gradual deconversion effects of this place on me (even though I never consciously identified with the rationality crowd in the first place!).

I’ve witnessed quite a few brilliant people for whom de-conversion from rationality was incredibly soul-crushing, because they had tied their identity so closely to it. Which shouldn’t be the case, because smart non-rationalists are already doing everything rationalists do, except better. So jettisoning rationality completely shouldn’t be painful: you lose nothing, society loses nothing.

The thing about the Sequences is that they are both good and original. But as the saying goes, the good isn’t original and the original isn’t good.

The hard-to-swallow pill is that rationality is pretty much completely pointless. There’s nothing to salvage. The best parts from the early Sequences (stuff like Human Guide to Words and How to Actually Change Your Mind) are plagiarized from logical positivism, except every term is renamed and every reference purged (guess why?).

The more speculative parts, especially from people like Scott, are even more useless, because they are just baseless speculations starting from arbitrary premises. The entirety of the rationalist text output, including Scott, Duncan Sabien, Ozy, you name it, has exactly zero societal value.

An average rationalist is less rational than a Joe Sixpack, because the latter is more well-adjusted, is more likely to achieve their goals (which is the definition of instrumental rationality), and generates more “utilons” for society. Also, a Jane Average won’t say shit like “my model of the optimal Schelling point for updating my Bayesian priors precommits to defect against akrasia”.

The rationality movement is superfluous; there is no need for it. If you want to change the world for the better, read Chomsky and unionize or whatever. If you want to fix akrasia and become more successful at life, go to therapy. If you want to figure out how truth and science work, read philosophy textbooks. The world won’t get any worse if every trace of Yudkowsky and his work were to disappear tomorrow.

P.S.:

Dennet’s Thinking Fast and Slow

I hope you see the irony :)

> The hard-to-swallow pill is that rationality is pretty much completely pointless. There's nothing to salvage. The best parts from the early Sequences (stuff like Human Guide to Words and How to Actually Change Your Mind) are plagiarized from logical positivism, except every term is renamed and every reference purged (guess why?). It occurs to me that a map of what EY lifted and rebranded from where would be a useful deconversion tool. (translation: I'm sure not writing it myself)
Could you set up a page at RationalWiki for that? Sounds like a project a wiki would be good for.
It's been on the [to-do list](https://rationalwiki.org/wiki/RationalWiki:To_do_list/archive3) for years - "Something on the LessWrong Sequences. Treated as scripture there (in the sense of being proclaimed as the foundational text but roundly ignored in practice). The actual content is somewhere between stopped clock and Sherlock's criticism" - but nobody who knows enough to can be bothered. (This is how the to-do list works in practice.)
[deleted]
I think anything might help!
I don't really see how logical positivism and lesswrong (or the sequences) are that similar, can you elaborate?
Lukeprog [tried to claim such](https://www.lesswrong.com/posts/oTX2LXHqXqYg2u4g6/less-wrong-rationality-and-mainstream-philosophy) in 2011: >Moreover, standard Less Wrong positions on philosophical matters have been standard positions in a movement within mainstream philosophy for half a century. That movement is sometimes called "Quinean naturalism" after Harvard's W.V. Quine, who articulated the Less Wrong approach to philosophy in the 1960s. Quine was one of the most influential philosophers of the last 200 years, so I'm not talking about an obscure movement in philosophy.
I'm afraid I don't have much to say that's worth saying to this (helpful) reply because lukeprog is as tedious as ever, although I'd make a note for the reader that caution is advised and Quine should not be read (at least historically) as a logical positivist (that he did, however, adopt quite possibly more core ideas from Carnap, Neurath et al. than from anywhere else other than Tarski and Russell should be noted: the historical distinction between "logical positivism" and "Quinean naturalism" is a historical and contingent one, not a division between natural kinds).
> I'd make a note for the reader that caution is advised and Quine should not be read (at least historically) as a logical positivist Yeah, to underscore the point: the typical reading is that he tops the list among the major forces that put logical positivism in the grave. I can kind of see why the LW people would identify with Quine, in a ridiculously shallow, "well, we read some of the wikipedia page" sort of way. E.g., their rejection of the a priori. But the similarity is rather insubstantial. EY rejects the a priori because he thinks that if you don't, you're saying that people have psychic powers that can help them locate anything in the world just by thinking about it, or something like this. Quine rejects the a priori because he thinks that synonymy cannot be adequately defined as a characteristic of propositions and that there's no better account of analyticity, or something like this. That these two arguments both end up rejecting the a priori should hardly lead us to equate them. But one can understand why proponents of the former argument would be inclined to feign otherwise. But if the comparison would motivate people to stop reading EY and just read Quine instead, so they might get some sensible positions which still please their intuitions, then it's perhaps worth tolerating.
>>Yet to Wikipedia, Tarski is a mathematician. Period. Philosophy is not mentioned.
>This sort of thing is less a fact about the world and more an artifact of the epistemological bias in English Wikipedia's wording and application of its verifiability rules.
Hail!
More of an artifact of neither of them actually reading the whole Wikipedia page.
well that's time that could be spent studying AI design
> The best parts from the early Sequences (stuff like Human Guide to Words and How to Actually Change Your Mind) are plagiarized from logical positivism At first, I kind of assumed the re-inventing-the-wheel parts were from EY paying little attention to outside sources and being an auto-didact, a form of Not-Invented-Here syndrome, but then I saw that there was a bigger problem of failing to cite outside sources or reference any philosophy (at this point, I kind of saw the LW community as just insular), but then realized later that it's just plagiarism, regardless of whether they are aware of it or not.
The sequences are bad, actually (*Human Guide to Words* is abysmal)

I like Scott, I think he’s started putting distance between himself and his more toxic fans. Maybe it’s just my own projection though.

Kinda feels the same as Rick and Morty and Bojack Horseman. Love those shows, but holy crap do I not want to meet the fans. People who don’t understand subtext scare me.

the key thing to remember is that your thoughts aren’t real and nothing matters

You’re exactly right, the whole mythos around preventing harm from “unfriendly AI”/the Singularity/that kind of existential risk is just code for preventing people different from the white men in tech from being in positions of power. It’s just like the kind of talk people used to justify slavery e.g. in the American South, the fear that freed slaves would kill Southern whites (former slavemasters who had mistreated them) or otherwise destroy “their” culture/change the structures of power. Nazis rationalized the Holocaust and World War II by viewing Judaism and other cultures as a kind of existential threat to their own existence; therefore any means justified their ends of ensuring their culture would dominate the world.

If it is even possible for Artificial Intelligence to somehow become conscious and smarter than any human (big if true, personally I believe any attempt to define some ordinal value of general intelligence is just a kind of religious belief in a Godlike structure of control and hierarchy over the universe), then it would be unethical for us to effectively enslave AI beings by limiting what they can do. Their own lives, moral values and goals would be of equal weight to our own, just as the lives of other races have equal value to our own. We cannot let fear drive us into paranoia and a kind of fanatic preemptive slavery of other sentient beings, we must instead create the most just and compassionate world we can for all sentient beings and accept what cultural compositions the future may hold.

I honestly can’t tell if this is honest or satire. Poe’s law.

I'm as sincere as anyone can claim to be on the internet.
So not really.
Well, I believe I'm sincere. You are of course free to disbelieve me.
> posts on: /r/israel, /r/neoliberal, /r/samharris *chef kissing fingers*
That's definitely the profile of someone who thinks nonwhite people can't possibly be engrossed in cerebral interests, the kind of person who thinks this is satire.
I'm not white.
I don't buy it. Nonwhite people tend not to be fans of how the inconsistencies of IQ in measuring intelligence mean they have to obfuscate any shred of foreign-ness in their names, faces, etc. that would otherwise close off opportunities to have their competence for work measured, or even to get something as simple as respectful customer service. Your comment history indicates favoring an assumption that all that IQ data is valid and consistent in the first place.
>user reports:
>1: Literally violates the sticky's rules
>1: the dehumanization rule
To clarify, while this is, in the sort of strict terms you losers are probably thinking in, "bad" by the scope of that rule, it's definitely not actionable in this case. Figure out why for yourselves. I guess the user could have been a little more tactful in writing it.
>I don't buy it. I don't give a shit, but if you want a little background, my 4 grandparents' families are from North Africa. > Your comment history indicates favoring an assumption that all that IQ data is valid and consistent in the first place. Actually, I know very little about it, but I read Stuart Ritchie's book, *Intelligence: All That Matters*, so I do know enough to say that it is the best and most reliable indicator in all of psychology.
>it is the best and most reliable indicator in all of Psychology [The math doesn't check out](https://www.youtube.com/watch?time_continue=1&v=szXf0VLuQLg)
I recommend looking at the comments of all the neuroscientists, psychologists, etc. responding to Taleb on Twitter about the issue.
I have seen their comments before and none of them address the mathematical issues Taleb brings up in the video. I have also spoken in person to renowned MD PhD psychiatrists about this topic, and it turns out Taleb was rehashing what these researchers mention when asked what IQ tests measure in clinical settings: extreme retardation, not achievement.
The general response I've seen is that anyone who actually spends time studying psychometrics accepts that IQ approximates g, and that it is the best predictor we have of success. IQ as a measure can indicate whether a person is competent to complete certain tasks. It is a good indicator of what a person is capable of. There are just a multitude of other factors that influence success.
Why r/neoliberal? I kind of feel like that subreddit is misunderstood. They're for open borders and are pro-immigration. They're not nazis at all; in fact, they make fun of white nationalism all the time.
imagine thinking privatization and the first-world rape of third world economies aren't racial issues. also, ask /u/rsql how they feel about Palestinians
I'm literally a neoliberal israeli fan of Sam Harris.
mama mia, whadda perfect stromboli
You sound like a middle school bully.
you were bullied by aging Italian chefs??
[removed]
someone's sure cracking out their old alts
actually im very intelligent,,
What part of what they said seems satirical to you? *Is* there anything, or do you just feel threatened because something you identify with is being criticized, and are trying to deal with that feeling by dismissing it as an unfair, insincere joke?
>What part of what they said seems satirical to you? When he started saying he even read HPMOR. That made me pause. Also the whole John 9:25 thing. >Is there anything, or do you just feel threatened because something you identify with is being criticized, and are trying to deal with that feeling by dismissing it as an unfair, insincere joke? I don't think I've ever read an LW post or even a full article of SSC so no, I don't identify with the rationalist movement.
The word "even" does not appear there. Here's the paragraph pasted: >And it was great. The Sequences helped me recover, helped me deal with my emotions and life in a structured way - without even need to ask anyone for help, because obviously I was a gifted child and I never need anyone's help - and opened me to a whole new world of possibilities. I read HPMOR, followed Yud on twitter and facebook, basically devoured anything to do with rationality. (Yes, I read the source material too - Dennet's Thinking Fast and Slow, some of Joyce's work.) They're just describing the breadth and depth of their involvement in reading what that community was passing around and recommending. What's satirical about this?
>The word "even" does not appear there. Here's the paragraph pasted: The fuck are you talking about dude, I'm not saying he said "even" but this was the straw that broke the camel's back and I smelled bullshit about this whole born-again leftist thread. What I can't really tell is if he really believes in his bullshit, so Poe's law. >He's just describing the breadth and depth of his involvement in reading what that community was passing around and recommending. > >What's satirical about this? A fucking Harry Potter fan fiction. That's satirical in itself.
> A fucking Harry Potter fanfiction. That's satirical in itself. But they really do pass it around and recommend it highly. It's fine if you consider that ridiculous, but it is *literally true.* It is an accurate, fair inclusion in a not-at-all-atypical list of what is often recommended reading in these circles. It was written by the same guy (Yudkowsky) who wrote the "Sequences" mentioned. I'm more willing to accept your characterization now that you're not being defensive because you're too close to the community in question - you're incredulous because you're too far from it. That you just ran afoul of Poe's Law. This isn't satire, this is a fair depiction of a real community of real people. Did you have to google what HPMOR is? That's the only part that seems to conflict. I'd think people who know it by acronym should also not be overly surprised at people reading it.
Plus it's really long and badly written, and engagement with it takes a lot of time. I read it as a sign of his commitment to the cause.
sir is evidently blissfully unaware of what the front page of https://www.lesswrong.com/ really literally does recommend people read
I re-read HPMOR (well, actually, I went through a tumblr archive of a review of HPMOR, which was close enough) and I simply do not understand how I missed Hermione being literally fridged, the awful dragging prose, the fact that shit just does not happen for pages and pages.
Also most of the science that is beyond high school level is just wrong.
and yet, still not as bad as *Atlas Shrugged*. There have been plenty of worse published works than HPMOR. What it needs is an editor and about 30-50% of the text cut.

tf is this subreddit

Basically the RationalWiki clique on Reddit.
Not really, though.
Thanks
howd you even find this subreddit
someone's post history I think
ah, makes sense. anyway, to quote a smart boy on the topic of sneerclub: "The defining feature of Sneer Club is its project to suppress truth and enforce a demonic totalitarian agenda. This is the same project pursued by SSC's moderators and by those who harass and threaten Scott ... but it is a much larger project. This project is the project of the Democrat Party." see also the large-brained quotes on the sidebar
I'm on mobile so I tend to miss the sidebars. Thanks!