I’ll start. I used to be pretty into transhumanism, enough that I helped out with the second Singularity Summit. After I stopped being friends with the dude who got me into it I wasn’t into it nearly as much, but I was a dumb freshman at the time and still not really aware of how problematic transhumanism, the Singularity, etc. were. So I’d still glance at Overcoming Bias, Less Wrong, etc. over the years.
When I was a senior I finally started taking more hardcore AI classes, and between that and taking programming classes, I started becoming extremely skeptical that something as brittle as software could capture the essence of a general intelligence. Especially since most AI is the same bag of gradient-descent tricks. Philosophically speaking, I am a functionalist, and am open to intelligence being supported by a nonorganic substrate, but given how crappy people are at writing software, I just don’t think a silicon-based, Turing-machine-based intelligence is in the works.
Embarrassingly, I used to be one of those “what if there’s a link between race and IQ?” people. I remember reading The Mismeasure of Man and thinking that Gould was wrong because paleontologists were bad at math. If they were good at math, clearly they would have become mathematicians. I grew out of it by reading a lot more about the history of racism in America, the effects of poverty on IQ, and what population geneticists said about the actual genetic diversity within different groups. Funnily enough, even though Jared Diamond is pretty terrible in his own way, he was the one who got the ball rolling. Somewhere in GGS he mentioned that Papua New Guinea alone contained 80% of all human variation. Having internalized that figure, it was pretty hard to keep believing that race and IQ were tightly linked in any meaningful way.
The final nail in the “rationalists are actually really dumb people” coffin was the effective altruism movement. That was when I realized that the rationalists were fetishizing the act of quantification. You can’t just throw money at a problem, or even throw money at a ‘good’ charity, and expect things to work. So what if you save a kid from catching malaria by sending them a net? Is the net still going to be good a year from now? What if a war breaks out? What if an Ebola outbreak happens? You can’t just sit there and feel smug, like you made a permanent difference. Life is not a static math problem.
edit: I do believe in giving to charity and being thoughtful about it. What I don’t believe is that you can use math to definitively prove that your charitable giving is superior. It’s very hard to quantify the positive effect of a donation, and the charity ratings that rationalists use are often based on much shakier science than they think. What I meant to show with my ill-chosen malaria example is that a rationalist will think to themselves, I donated some amount to a malaria net charity, ergo I saved 50 lives. I’m just saying the math does not work like that, due to other factors that you have no idea about, e.g. nearby war, people not even using the nets, the nets contributing to poisoning the water because the insecticide on them gets into a river, corrupt government, drought, etc. You could save a life from X but it’s possible all you’ll accomplish is getting them killed by Y instead. EA as practiced by rationalists is perhaps the first time the reification fallacy made an impression on me. That’s what I’m objecting to, not charity.
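To put the objection in concrete terms, here is a minimal Monte Carlo sketch with entirely made-up numbers (my illustration, not anything the commenter computed): take the naive “lives saved” headline and multiply in rough guesses for the unmodeled factors listed above, and the point estimate spreads into a wide, much lower distribution.

```python
import numpy as np

# Toy illustration with invented numbers: the naive "dollars / cost per
# net = 50 lives saved" figure, discounted by rough guesses for factors
# the point estimate ignores (net usage, delivery failures, side effects).
rng = np.random.default_rng(1)
n = 100_000

headline_lives = 50.0  # the naive point estimate

usage_rate = rng.beta(6, 4, n)            # fraction of nets actually used
delivery_ok = rng.binomial(1, 0.85, n)    # 0 if lost to war/corruption
side_effects = rng.uniform(0.9, 1.0, n)   # e.g. insecticide runoff harm

lives = headline_lives * usage_rate * delivery_ok * side_effects
print(f"naive estimate: {headline_lives:.0f} lives")
print(f"median under rough uncertainty: {np.median(lives):.1f}")
print(f"90% interval: [{np.percentile(lives, 5):.1f}, "
      f"{np.percentile(lives, 95):.1f}]")
```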
The stuff I prefer to give to, contrary to what an EA advocate would do, is not overseas charities where I don’t know anything about the conditions there. (Though I do donate to a couple of overseas charities that I carefully picked.) I prefer to give to worthy causes in my country and in my town so I know what’s actually happening with the donations, and this afaict is not something EA people would ever do.
It was SSC for a consent-loving feminist.
I read all of Eliezer and it didn’t alert me to much. But I was a huge Kahneman fangirl and had nobody to talk about it with, so that was expected. I was also hanging around the Facebook rationalsphere, where I saw the first hints of something off, especially the abundance of PUAs and peculiar framings of sexuality. They also seemed to believe that “being able to say whatever you want without consequence as long as it’s polite helps solve complex problems” (No. It works to get an A in college when nobody else speaks out during the classes, but that is NOT how anything remotely complex actually gets done).
And then I stumbled upon some of Hanson’s cuckoldry-vs-rape rhetoric and saw a bit more of SSC, including Untitled and Against Murderism. And that was it for me: the visceral feeling of discomfort (maybe even panic?) was too strong. It took SneerClub to help me realize that no, I do not have to seriously entertain the idea that popularizing the concept of consent is putting too much mental strain on nerd virgins who try so hard, don’t mean to actually hurt anybody and might save the world someday. And feeling queasy reading most of the rationalsphere commentariat does not make me an immoral and irrational person. So thanks <3. I was young, I doubted and believed, and I needed you.
I still mourn a little bit though. I mourn the feeling I had at the beginning: the audacity of thinking that yes, you can do good well and be flamboyant about it, and if you dedicate a week to Internet research and some journaling then you can uncover ways of optimizing for good nobody has thought of before, and you can create an AI and make everything instantly better… It was all very romantic, empowering and energizing. I was drawn to the rationalists’ anti-cynicism and “intellectual DIY” attitude.
But I’ve been detoxed, so. Back to the grind.
Also: SneerClub is much more playful than SSC, and Internet playfulness is important to me.
You fucking save a kid from catching malaria. That’s an unambiguous good.
There’s a lot to criticize about EA, but we can criticize it without being heartless bastards.
I was a prolific poster on the old LW, still consider myself rationalist-adjacent, and still consider the community an interesting source of ideas, but it got increasingly difficult to do so as (1) a group identity coalesced that felt it had to define, and defend, itself against outsiders and (2) this group identity, at least in places, demanded that you regard out-and-out racial supremacists as your friends as long as they spoke at a college level.
Scott’s comments about the Human Variation crowd are a perfect summa of this attitude - they’re precious and admirable merely because they use technical language and don’t immediately jump to culture war inflammation. I’m comfortable with the idea that we should restrict our arguments against ideas to the strongest ones, for our own epistemic health, and consider ideas that we might find uncomfortable, but this wholesale elevation of form over content is just perverse and leads to obvious evaporative cooling issues. And as others have said, at least in the SSC comments section, you can be quite passionate in your hatred of “SJWs” in a way that isn’t tolerated when the same passion is aimed to your right.
This led to me quitting the SSC comments section and reading SSC itself mostly as a hateread, even though I read many openly-much-further-right sources. I guess I couldn’t stand the residual thought that “oh, these are my people” while reading incredibly tedious comments about HR-enforced political correctness or whatever.
Conversely, I also think that the sneerclub crowd is a little too quick to dismiss ideas simply for being weird. I guess part of my alienation from the diaspora (or parts of it) has to do with the community’s original fascination with new and weird ideas getting steadily displaced by old, standard, demographically flattering ones about its own superiority and/or victimhood.
Many years ago, I was a friendly acquaintance of a guy who went on to become one of the top people in LW/MIRI for a while, and I followed him in.
It seemed like a smart group of people having elevated conversations and trying to improve their thinking. I still am an enormous Kahneman fanboy, and it seemed to be in that spirit, although I always disliked Yudkowsky and never got that into it as anything more than one of a few regular check-in spots on the WWW.
After I’d sort of tuned out from LW, I stumbled on a few SSC pieces (“The Toxoplasma of Rage,” “Meditations on Moloch”) that I thought at the time were pretty sharp and important to share. I came away with a relatively positive impression of SSC, although I noticed the comments section was a sewer. I also noticed that my friends never had much to say about the SSC or The Last Psychiatrist links I was sending them.
At the same time, I was going through a protracted personal crisis and befriended a couple of Social Justice-type people who helped me tremendously and at the same time inspired me to reckon with my own prejudices. I sorted through my unexamined reactionary and fearful beliefs and realized a lot of them no longer suited me.
A lot of stuff online that I had been interested in but uneasy about, I completely lost my tolerance for. It obviously had nothing useful to say about the real world I was living in.
Now it’s 2018, and I’m profoundly disgusted with conservatives in general and Trump apologetics in particular. I read “You Are Still Crying Wolf” and there are just so many holes in it, it’s clearly the work of someone who still desperately wants to believe that green-haired undergrads who are mean to him on Tumblr are what’s wrong with the world. Scott and the IDW are making a cottage industry out of waging the Culture Wars while there are real and desperately relevant problems in the world screaming for our attention.
I got into Chapo Trap House and had a lot of fun for a bit, taking the piss out of all these people I wasn’t sure why I used to respect. From there, I found Current Affairs, and eventually ended up here.
It feels good to laugh again.
For me it was something I inherited. Eleven (or thirteen, I forget) generations ago, my patrilineal relative, a captain in Washington’s army, sneered at the British in the Revolutionary War. And this tradition has been passed down to me from my forbears. This is why I also like revolutions. But it doesn’t explain why I like volcanoes.
I’m a social democrat who regularly reads The National Review, The American Conservative, various other conservative sites, etcetera. It’s probably bad for my mental well-being, but at least I’m not surprised by right-wing arguments like some of my more sheltered friends are.
SSC was interesting, as a place with relatively smart centrist to center-right takes in the comments, but those quickly got flooded away by the skull measurers, free speech fanatics, and haters of any feminism beyond ‘OK, maybe they get to have their own checking accounts.’
A brief summary of my time as an SSC reader:
1 month in: a lot of these people are real shitty but at least I learn something new half the time
3 months in: a lot of these people are real shitty but at least I learn something new a third of the time
6 months in: a lot of these people are real shitty but at least I learn something new a tenth of the time
1 year in: a lot of these people are real shitty and I keep seeing the same stupid shit posted over and over and I haven’t learned anything new in months
I don’t really fit any of the normal rationalist profiles (other than being a white guy). I only know about the Rationalosphere because I have an ex who is really deeply into it and has many opinions I find incomprehensible, so I decided to check out what’s going on with that. I’ve not been pleasantly surprised.
I got linked to HPMOR when it was trending as a ‘wacky new internet thing’ about Harry Potter being more scientific. That took me to lesswrong and I got into it for a few weeks or months, until I figured out that EY’s rationalism jargon was his own made-up system, that zero people outside his circles used Bayes’ Theorem the way he did, and that HPMOR was a transparent and clumsy repackaging of his ideas.
I didn’t really think about it for years except as ‘that weird rationalism cult that lured people in with Harry Potter fanfiction’. I’d run into SSC posts for years and generally thought they made good points, but around 2016 I started reading it regularly and got more and more irked at Scott’s sheer obtuseness and blindness to his own biases (Crying Wolf was probably a big alarm bell).
I am actually still pretty rationalist-adjacent and occasionally post on lesswrong. But a lot of rationalist stuff is very painful bullshit, and sneerclub is perfect for both pointing this out and laughing about it. Politically I’m pretty leftist (by Kraut standards, so that’s probably outside the US Overton window).
At its best, the rationalists produce beautiful analogies that are perfect for me, e.g. “evaporative cooling of group beliefs”. This analogy is only good for people who come from a technical background: if you don’t already have an intuition from physics, then you’re better off listening to people who know what they’re talking about.
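For readers without the physics intuition, a toy simulation of what the analogy claims (my own sketch, not anything from the original posts): when a group is challenged, the least-committed members leave first, so the remaining group ends up more extreme on average - just as a liquid cools when its most energetic molecules evaporate.

```python
import numpy as np

# Toy sketch of "evaporative cooling of group beliefs": members hold
# beliefs of varying extremity; a challenge drives out those with the
# least commitment (near-zero extremity), and the average extremity of
# the remaining members goes *up*, not down.
rng = np.random.default_rng(42)

beliefs = rng.normal(0.0, 1.0, 10_000)    # signed belief extremity
print(f"mean |extremity| before: {np.abs(beliefs).mean():.2f}")

stayers = beliefs[np.abs(beliefs) > 0.5]  # the moderates evaporate
print(f"mean |extremity| after:  {np.abs(stayers).mean():.2f}")
print(f"fraction who left: {1 - stayers.size / beliefs.size:.2f}")
```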
But then there is the endless morass of self-help, cult-building and right-wing proselytizing (especially Scooter in the last year or so). And this is before you look at the comment section and begin to weep. I’m not invested enough in the community to make an effort to push back on this, so I sneer at the BS, sometimes join good conversations, and drop out once the signal-to-noise ratio drops below acceptable. Recently I find myself commenting more on sneerclub than /r/slatestarcodex or lesswrong (which is imo much, much better than /r/slatestarcodex if you ignore 2/3 of the threads, which are easy to filter).
Regarding all this race-IQ stuff: I really don’t get it. Why do people care at all? Why did Scooter have to put up this giant honeypot for racist idiots?
“hmm, this horrible abusive person I know is sharing links from a site called ‘less wrong’, probably doesn’t reflect on the site, eliminating biases is good!”
“Wait, what’s this ‘roko’s basilisk’ thing”
“This yudkowsky dude is insanely overconfident and seems to actually be trying to undermine the scientific method to support overblown AI fears”
“This Scott Alexander guy seems pretty reasonable though”
“Wait nvm he’s actually a total prick”
“Welp, now the horrible abusive person is running the local lesswrong group.”
I stumbled across some SSC blog post from hacker news, probably, and found it interesting. So I did what I do with most interesting blogs I stumble across: read a few dozen things from the archives. Scott seemed insistent that I read something called “the sequences”, so I found a PDF copy and did that too over the course of a long vacation.
I remember both really liking the idea of rationalism and immediately connecting its ideas about overcoming bias to my existing anti-racism and anti-sexism skillsets. At the time the connection seemed so obvious to me that I assumed it must be a common interpretation.
Then I started reading the comments, then participating in the subreddit. It didn’t take me long to realize that the version of rationalism in my head bore only a faint resemblance to rationalism as it is practiced in the wild. That led me to vent my frustrations here, and I’ve been sneering ever since.
I used to be pretty into SSC, but I got increasingly uncomfortable with some of the commenters it attracts and encourages. I realized I had three options:
1. Continue tacitly supporting (or at least, not opposing) reprehensible opinions while engaging with the rationality community in apolitical ways.
2. Argue against people more often, and become known as a virtue signaling SJW cuck. I tried this briefly, but I found that all of a sudden their principle of charity didn’t really apply to me any more.
3. Officially leave.
I ended up going with option 3, which left me with nowhere to go besides /r/sneerclub.
I don’t really belong here though. I’m still more or less the same type of person as most rationalists, just with more left wing views on most subjects. So I’m not meant to be a sneerer, and I’ll probably be moving on soon.
I donate to EA stuff, but wasn’t a rationalist. After reading a few of the better LW/SSC (mostly SSC) posts - Weirdtopia, Untitled, the Anti-FAQs, Toxoplasma, some of his stuff about basic income -
I thought maybe even if I wasn’t rationalist, heading to /r/ssc would be interesting, because at least it would be talking to people who read some of the same material I did.
And, it turns out, it was a little stranger than I expected. More culture war from the right, less examination of structural/social biases. Arguments just got circlejerked into oblivion - but sure, optimistic younger me thought, of course these people are just genuinely looking to get their viewpoints challenged.
That optimism faded with the onset of screeching about liberals/scientific racism, but I figured at least Scott was still a writer worth following.
Until I sort of realized how far right he was starting to go. He went from someone who explained left-wing things to right-wingers in a way they’d understand, and who’d explain right-wing things to left-wingers in a way that they’d understand, to someone who sort of just kept talking about how utterly terrible SJWs were and how Peterson was a misunderstood genius and Chesterton’s fence was the reason we shouldn’t change things and maybe race was tied to IQ -
I don’t really get why he went off the deep end recently. Did he sort of empathize with the types of young nerdy white guys who took Peterson as their new prophet? Has he surrounded himself with loony alt-right types online, so he doesn’t really understand what’s objectionable about them/what’s good about the left, and he needs to pull back?
I don’t get it. I miss old Scott.
I was a teenage math nerd and reddit atheist who took a hard left turn into the humanities and gradually became convinced over the course of my PhD that the Marxists and the post-modernists were of much more value than I’d realised. I used to be much more willing to give rationalists the benefit of the doubt, since I’d started off as one of them, but Scott’s first post about Ashkenazi IQ tipped me over the edge. I didn’t mind them so long as they played by their own rules, but it gradually became clear that the whole thing was just a cheap excuse to bash leftists, women and ethnic minorities, with actual philosophy a secondary consideration.
I disagree with you about EA, I think it’s pretty good (with some caveats obviously). I think the idea of doing the most good possible makes sense and saving people from malaria does more good than donating to a museum.
I never even knew they existed until I got on reddit like 2-3 years ago, and the first rationalist-adjacent group I saw was the ratheists. By that point I’d been through almost all of my undergrad philosophy curriculum and some upper-level and graduate Phil Sci courses, and couldn’t believe how uninformed some of the stuff I was hearing was. I had also done quite a bit of genetics/genomics/evolutionary biology research at that point and was equally unimpressed by most of the rationalist perspectives on those subjects too.
I’ve never really been a regular reader of SSC/LW/the rest, but I encountered them via doing a CS degree, reading Hacker News, bumping into a rationalist or two, etc. I’m demographically similar to a lot of the people there, and had broadly similar interests as a teen, but was solidly left wing by the time I encountered those spaces. I also read GGS as a teen; it may be bunk, but as far as oversimplified models of human history go it’s probably one of the less toxic ones to absorb early on…
The first SSC post I solidly remember reading in full was the Meditations on Moloch one, and I was made pretty uncomfortable by Scott’s willingness to entertain ~~fascism~~ neoreaction as a reasonable set of ideas (strange that he’d want to edit that out…). At that point I was also aware of the whole AI apocalypse cult thing, which didn’t exactly endear them to me. Then at some point around the beginning of the year I bumped into this sub via the link in the badphil sidebar and I like it here. I even occasionally sneak into SSC to try and give the less horrible people there some clues about what’s wrong.
I don’t see anything to be embarrassed about for asking the question, or even for being taken in by the affirmative claims of Murray, etc. It just turns out to be a question we aren’t scientifically equipped to investigate sensibly, embedded in a field which systematically overstates its capacity to interrogate our interior lives.
I had been sneering at creeptocoiners for months when I saw D. Gerard, author of Attack of the 50 Foot Blockchain, post here. Now I’m here too, sneering at the cousins of the cryptoids.
Actually I’ve been a fully formed communist since birth.
I wasn’t aware of the Rationalist movement until I started seeing Rationalist blogs pop up in my area of Tumblr a year or so ago. Not only did they consistently write in the most exquisitely condescending tone, they also constantly locked lips with reactionary Christian traditionalist and various “identitarian”/Nazi blogs, and I quickly realized there was something bad about them.
Yud said, by denouncing his critics, that he was beyond criticism. I disagreed.
I have a very high opinion of the rationalists, above all the EAs. But two things drive me nuts:
The assumption that you can’t think rationally about morality (obviously this does not apply to the EAs). This assumption makes many rationalists default to an incredibly narrow contractualism, where only people in your community are morally considerable. It also means that any moral claims are interpreted as ‘boo’-ing.
The HBD debate displays all the usual failures of communication (mottes and baileys everywhere, enough dials to explain away anything, mostly unquantified, no appreciation that a big enough type-M error can become a type-S error, no particularly convincing causal story, unshakable conviction that all criticism is in bad faith). On some level this is to be expected - we don’t know enough about intelligence or genetics to articulate a good HBD story, much less test it. But then it shouldn’t be reshaping our ideas of social phenomena that we understand better, however incompletely.
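For anyone who hasn’t met Gelman and Carlin’s type-M/type-S terminology, here is a minimal simulation (my illustration, with arbitrary numbers) of the dynamic being gestured at: with a small true effect and noisy estimates, the estimates that clear the significance bar exaggerate the effect’s magnitude (type-M error), and a chunk of them even get its sign wrong (type-S error).

```python
import numpy as np

# Minimal sketch: a small true effect estimated with a large standard
# error. Condition on statistical significance and see how badly the
# surviving estimates mislead about magnitude (type-M) and sign (type-S).
rng = np.random.default_rng(0)

true_effect = 0.1
se = 0.5
estimates = rng.normal(true_effect, se, 100_000)

significant = np.abs(estimates) > 1.96 * se   # the usual |z| > 1.96 bar
sig = estimates[significant]

print(f"power: {significant.mean():.3f}")
print(f"type-M exaggeration: {np.abs(sig).mean() / true_effect:.1f}x")
print(f"type-S (wrong sign among significant): {(sig < 0).mean():.2f}")
```

With these numbers the “significant” estimates overstate the true effect roughly tenfold and flip its sign about a quarter of the time - the shape of problem a big enough type-M error turning into a type-S error describes.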
When I was in undergrad, there was a transhumanist prof with a small personality cult, which is where I first heard about it. I started reading LW quite a bit as I was a psych student interested in cognitive biases and GEB at the time. After a while, I started to realize just how much of it was just repackaging basic ideas from cognitive psychology, decision theory, behavioral economics, etc. sprinkled with some original bullshit on top. I had way too many conversations that sounded like the Pigliucci/Big Yud debate.
I discovered LessWrong and it was so palpably stupid on its very face that I immediately hated it so much I couldn’t look away.
I considered myself part of the skeptic community for some years because I liked Skeptoid, SGU, P&T Bullshit, and a handful of YouTube channels on science and religion topics. (One of them was AronRa, who blessedly turned out not to be a human disaster like many other YouTube skeptics.) I was very much a “le STEM master race” guy and I deeply regret it, and I’ve moved away from those kinds of positions over time. My politics also moved a lot further left and that eroded the whole libertarian aspect that pervades a lot of those spaces. The responses to Elevatorgate and the full realization of the conservative underbelly of the Center for Inquiry were the final nails in the coffin for me.
I had brushed up against Big Yud and LessWrong in the same contexts as RationalWiki searches on occasion, but it was very much a surface-level thing and I didn’t realize what was under the surface. I’d also heard about HPMOR and thought it had some cool aspects, but my inherent dislike of fanfic kept me from diving in. “Rationalist fiction” was another thing I’d been exposed to in limited amounts and found really weird.
A couple weeks ago someone darkened my Twitter timeline with a mocking screencap of Yudkowsky’s navel-gazing, about how most people didn’t have deep thoughts and weren’t working to save civilization. I searched some related terms on Reddit, put together some of the pieces, and found you folks.
(a cut’n’paste from my tumblr, hence the capital-free lowercase tumblr poetry)
i started reading because a friend was signing up for cryonics and was an active participant. (since i am, for all my dilettantism, an actual expert on scientology, mutual friends literally deputised me to talk to him and see if he’d joined a weird cult. my verdict was “not really,” which remains my verdict.) my previous opinion of cryonics was neutral-to-positive; i looked into it and went “wtf is this shit.” the rationalwiki article on cryonics was the main result. i joined lesswrong in late 2010 ’cos it looked fun. went to a few of the meets. was put off attending one ever again by the vociferous support for scientific racism. apparently scientific racism is essential to being considered a true rationalist.
it took me years to realise there was no “there” there, that all the dangling references to references that refer to other references never resolve: that yudkowsky has literally never accomplished anything. he has literally no results to his credit, in his claimed field or out of it. he’s a good pop-science writer, and i highly respect that. i’ve read the sequences through a coupla times and there’s good stuff in there. and he’s written literally the most popular harry potter fan-fiction, for what that’s worth. but in his putative field, his actual achievements literally don’t exist.
and the papers! holy shit, these are terrible! TDT, CEV - i am not a scientist, but i have enough experience as a skeptic to tell when someone’s spent 150 pages handwaving away the obvious hard bit and playacting at science. the best thing to say about them is that none of them start “Hello, my name is Kent Hovind.”
i recently looked up my early comments and i’m amazed how optimistic i was that the weirdy bits could be dealt with by sweet reason rather than being the point of the exercise. “taking ideas seriously”, by which they don’t mean “consider this philosophical notion in detail in the abstract”, but “believe our utilitarian calculations and act on them because of our logic.” even scott alexander called this one out.
i went back and read every post in main from 2007-2011. you can see it getting stranger and stranger from 2009 on, as people took in and extrapolated the sequence memes. i would say that peak weird was the basilisk post in july 2010. this i think scared people and the weird seemed to noticeably scale back. they also started the regular meets, for slight interaction with reality.
i mean, i don’t claim a stack of achievements either, i don’t even have a degree, but at least i haven’t started a million-dollar-a-year charity to fund my fanfic and blogging.
Constantly seeing culture war threads wherein the most bizarro rightwing pseudoscience is given its 15 minutes, but day-one leftwing stuff (like the link between poverty and crime) is treated as an impossible conjecture. I like the general idea they try to promote on SSC - taking ideas and challenging them, steelmanning, etc. - but there are so many actors acting in bad faith that it’s impossible to really see stuff get a fair shake. There seems to be a lot of bias blindness on that subreddit.
What’s that about?
When I realised that the average rationalist was a smart, pathetic loser.
They may be smart, but they are weak, and except for a weird quirk of society that only exists right now and will soon cease, they are failures.
The combination of the self-congratulation (“we are better for being pathetic”) and the whining (“women should obey me and suck my dick whenever I want”) is pathetic beyond belief.
Such failure cannot be tolerated.
Quotes above are not real quotes but my impressions of the average rationalist’s attitudes.
The only position they deserve in any society is that of the slave.
By the way, I may be disgusted by them, but I still believe race and IQ are linked, two-parent families are best, sexual liberation is only moral when birth control and STD treatments are available, retirement is an antisocial evil concept, etc.
Reality belongs only to those who take it, and the typical rationalist could not take a sandwich out of a bag if their life depended upon it.