r/SneerClub archives

I’ll start. I used to be pretty into transhumanism, enough that I helped out with the second Singularity Summit. I stopped being friends with the dude who got me into it and I wasn’t into it nearly as much, but I was a dumb freshman at the time and still not really aware of how problematic transhumanism, the Singularity, etc. were. So I’d still glance at Overcoming Bias, Less Wrong, etc. over the years.

When I was a senior I finally started taking more hardcore AI classes, and between that and taking programming classes, I started becoming extremely skeptical that something as brittle as software could capture the essence of a general intelligence. Especially since most AI is the same bag of gradient-descent tricks. Philosophically speaking, I am a functionalist, and am open to intelligence being supported by a nonorganic substrate, but given how crappy people are at writing software, I just don’t think a silicon-based, Turing-machine-based intelligence is in the works.

Embarrassingly, I used to be one of those “what if there’s a link between race and IQ?” people. I remember reading The Mismeasure of Man and thinking that Gould was wrong because paleontologists were bad at math. If they were good at math, clearly they would have become mathematicians. I got over this by reading a lot more about the history of racism in America, the effects of poverty on IQ, what population geneticists said about the actual genetic diversity within different groups, and grew out of it. Funnily enough, even though Jared Diamond is pretty terrible in his own way, he was the one who got the ball rolling on me growing out of it. Somewhere in GGS he mentioned that Papua New Guinea alone contained 80% of all human variation. Having internalized that figure, it was pretty hard to keep believing that race and IQ were tightly linked in any meaningful way.

The final nail in the “rationalists are actually really dumb people” coffin was the effective altruism movement. That was when I realized that the rationalists were fetishizing the act of quantification. You can’t just throw money at a problem, or even throw money at a ‘good’ charity, and expect things to work. So what if you save a kid from catching malaria by sending them a net? Is the net still going to be good a year from now? What if a war breaks out? What if an Ebola outbreak happens? You can’t just sit there and feel smug, like you made a permanent difference. Life is not a static math problem.

edit: I do believe in giving to charity and being thoughtful about it. What I don’t believe is that you can use math to definitively prove that your charitable giving is superior. It’s very hard to quantify the positive effect of a donation, and the charity ratings that rationalists use are often based on much shakier science than they think they are. What I meant to show with my ill-chosen malaria example is that a rationalist will think to themselves, I donated to a malaria net charity, ergo I saved 50 lives. I’m just saying the math does not work like that, due to other factors that you have no idea about, e.g. nearby war, people not even using the nets, the nets contributing to poisoning the water because the insecticide on them gets into a river, corrupt government, drought, etc. You could save a life from X but it’s possible all you’ll accomplish is getting them killed by Y instead. EA as practiced by rationalists is perhaps the first time the reification fallacy made an impression on me. That’s what I’m objecting to, not charity.

The stuff I prefer to give to, contrary to what an EA advocate would do, is not to overseas charities where I don’t know anything about the conditions there. (Though I do donate to a couple overseas charities that I carefully picked.) I prefer to give to worthy causes in my country and in my town so I know what’s actually happening with the donations, and this afaict is not something EA people would ever do.

It was SSC for a consent-loving feminist.

I read all of Eliezer and it didn’t alert me to much. But I was a huge Kahneman fangirl and had nobody to talk about it with, so that was expected. I was also hanging around the Facebook rationalsphere, where I saw the first hints of something off, especially the abundance of PUAs and peculiar framings of sexuality. They also seemed to believe that “being able to say whatever you want without consequence, as long as it’s polite, helps solve complex problems” (No. It works to get an A in college when nobody else speaks out during the classes, but that is NOT how anything remotely complex actually gets done).

And then I stumbled upon some of Hanson’s cuckoldry-vs-rape rhetoric and saw a bit more of SSC, including Untitled and Against Murderism. And that was it for me: the visceral feeling of discomfort (maybe even panic?) was too strong. It took SneerClub to help me realize that no, I do not have to seriously entertain the idea that popularizing the concept of consent is putting too much mental strain on nerd virgins who try so hard, don’t mean to actually hurt anybody and might save the world someday. And feeling queasy reading most of the rationalsphere commentariat does not make me an immoral and irrational person. So thanks <3. I was young, I doubted and believed, and I needed you.

I still mourn a little bit though. I mourn the feeling I had at the beginning: the audacity of thinking that yes, you can do good well and be flamboyant about it, and if you dedicate a week to Internet research and some journaling then you can uncover ways of optimizing for good nobody has thought of before, and you can create an AI and make everything instantly better… It was all very romantic, empowering and energizing. I was drawn to the rationalists’ anti-cynicism and “intellectual DIY” attitude.

But I’ve been detoxed, so. Back to the grind.

Also: SneerClub is much more playful than SSC, and Internet playfulness is important to me.

> It was all very romantic, empowering and energizing.

> Also: SneerClub is much more playful than SSC, and Internet playfulness is important to me.

Very well put there! I'm glad there's someone else who finds the overwhelming seriousness of that community off-putting.
Scott has a cutesy, McSweeney's-style gift for clever vignettes and wordplay. Yudkowsky is almost entirely humorless. (Unless his steaming resentment toward the mediocrities in his midst counts as "punching up.") The entire cult is extremely vulnerable to mockery, which is why finding this place was so gratifying.
/sigh you made me look at SSC again just so I could see if what you were saying was right. I tried to get more than a paragraph into Meditations on Moloch and couldn't.
You're not missing much. My point is that, while I find Scooter to be a clever writer compared to Yud, I've never LOLed or had much fun reading any of these people.

> So what if you save a kid from catching malaria by sending them a net?

You fucking save a kid from catching malaria. That’s an unambiguous good.

There’s a lot to criticize about EA, but we can criticize it without being heartless bastards.

Oh no, that came off badly. The point I was trying to make is that it doesn't do much good to prevent someone from dying of malaria if something else in the environment is lethal. You'd need to fix both things at the same time.

Look, I don't have a problem with giving to charity or trying to give effectively. I've volunteered in Africa, and I'd be willing to bet tons of money I volunteer more in general than most people on this sub. What I don't like is that EA types will research what charities to give to, but they won't consider things like, "maybe I should vote for politicians who don't want to invade the Middle East," or "maybe capitalism is the problem," or, "maybe the 'study' that I based my charitable giving on is flawed." And definitely not, "maybe I should politically organize so we can put politicians into office who will stop the Yemen crisis."

Re: the malaria net thing, the nets have been used to fish instead of being given to children, and now that's causing overfishing in Lake Malawi and elsewhere. The nets are so thin and fine that very young fish will get caught up in them. The nets have additionally been treated with insecticide which is poisoning the lake. The insecticide in the nets is also selecting for insecticide-resistant mosquitoes. The nets are also annoying to sleep under (speaking from personal experience) and people often refuse to use nets. Some 50% of people will not sleep under a net. So even if you send a net, it might not get used and it might end up damaging the local ecosystems.

It's not an unambiguous good, but I wrote it badly in my OP. What I meant is something like, an EA approach wherein you only look at "number of lives saved" ignores the context in which that life is, and that context could easily nullify what your charity just tried to do. Not to mention not fixing any of the structural problems that caused the need for charity in the first place.
I give to charities all the time and try to be thoughtful about it, so in that narrow sense, sure, I believe in effective altruism. I've even donated to malaria net charities in the past. I just don't think the utilitarian method they use is nearly as good as they think it is. So I don't believe in Effective Altruism.
There's definitely a typical privileging of form over content, engineering solutionism over sociology, and decontextualization over embeddedness in the Rationalosphere attitude to social issues (the more liberal, not NRx, wing anyway). They love engineering and hate politics. Very tech liberal stuff.
I am pretty sure a lot of it comes down to them all being the kind of kids who never played with anyone else. I know this sounds like a sneer, but I'm trying to be descriptive. It's really easy to develop a decontextualized worldview when you only consume highly processed representations of the world in the form of computer programs, math, and STEM textbooks. And to be fair, a lot of people, even well-educated ones--or maybe, especially well-educated ones--are prone to this kind of content-free simplification.
> the nets have been used to fish instead of being given to children... EA approach wherein you only look at "number of lives saved" ignores the context in which that life is, and that context could easily nullify what your charity just tried to do.

Usually people try to incorporate unintended consequences, incomplete/inconsistent use, etc. into their cost-effectiveness models, e.g. see [here](https://www.givewell.org/international/technical/programs/insecticide-treated-nets#Possiblenegativeoffsettingimpact). It's a good point to make if something subtle has been missed, and in general those writing these charity recs are *decent* at listening to criticism and incorporating it into future analyses.

I'd agree that second-order-and-beyond effects of various interventions are hard to estimate and really explode the uncertainties, but it's not like the community is deaf to them, either, e.g. see all the discussion on the "poor meat eater" problem, or general overpopulation concerns, or whatever. I think the same predictive difficulties can be applied to most anything we do, but if we anticipate fairly symmetric distributions of uncertainty we can maybe round the expected value off to whatever the first order/direct effects are. Sure, if you ruin your shoes fishing the drowning kid out of the lake you might have just rescued future-Genghis Khan, but probably not, all things being equal.
As I mentioned elsewhere, the rationalist community's problem isn't attempting to quantify uncertainty. It's putting way too much faith into the power of quantification. It's like the HBD stuff...it's not a bad thing that rationalists try to read science papers and draw unbiased conclusions, but it's a bad thing when they overestimate their ability to do so.

I was a prolific poster on the old LW, still consider myself rationalist-adjacent, and still consider the community an interesting source of ideas, but it got increasingly difficult to do so as (1) a group identity coalesced that felt it had to define, and defend, itself against outsiders and (2) this group identity, at least in places, demanded that you regard out-and-out racial supremacists as your friends as long as they spoke at a college level.

Scott’s comments about the Human Variation crowd are a perfect summa of this attitude - they’re precious and admirable merely because they use technical language and don’t immediately jump to culture war inflammation. I’m comfortable with the idea that we should restrict our arguments against ideas to the strongest ones, for our own epistemic health, and consider ideas that we might find uncomfortable, but this wholesale elevation of form over content is just perverse and leads to obvious evaporative cooling issues. And as others have said, at least in the SSC comments section, you can be quite passionate in your hatred of “SJWs” in a way that isn’t tolerated when blasted to your right.

This led to me quitting the SSC comments section and reading SSC itself mostly as hateread, even though I read many openly-much-further-right sources. I guess I couldn’t stand the residual thought that “oh, these are my people” while reading incredibly tedious comments about HR-enforced political correctness or whatever.

Conversely, I also think that the sneerclub crowd is a little overquick to dismiss ideas simply for being weird. I guess part of my alienation from the diaspora (or parts of it) has to do with the community’s original fascination with new and weird ideas getting steadily displaced by old, standard, demographically flattering ones about their own superiority and/or victimhood.

I suppose that's true. I do a bit much nerd-bashing myself on this sub. The older I've gotten though, the less I've come to see weirdness as something original and fascinating, and the more I think, "you're fundamentally disconnected from the rest of humanity." There's still good weirdness out there, but I don't see it in the rationalist community.
I do guarantee that a high percentage of people here are massive nerds themselves, to even know about this stuff. It'd be fun to discuss, if it weren't for all the racism...
Yeah but there's nerds and then there's nerds, in the same way there's scientists and scientism.
I had a spit-take moment when a girl I was briefly sleeping with called me a "massive nerd" despite my self-perception as an incredibly charming promiscuous queer alcoholic who works out and hates sexy men as well as women, and then I realised how much time I spend on websites like reddit, shouting at Toby Young on Twitter, and talking about how boring liberals are
Yeah, I have literally hundreds of graphic novels, entirely too many games on Steam, can talk your ear off about both pro wrestling angles in the 1980's and various early 20th century political conventions, but ya' know, since I don't think the SJW's are scary, I'm not a true nerd.
Do you go so far as to fetishize science and math, or call "normies" "NPCs" though?
Nah, I was always a lame humanities guy caring about things like history. Unlike STEM folk, I've never argued my knowledge of 19th century Republican politics means I should be in charge of a jet propulsion lab, unlike the many STEM folks who seem to think we should outsource education to various Internet start ups.
See? You have humility, and, judging by your interests, a more complex relationship with humanity than "they're all irrational." That is why you are in this club and not that club.
I think so too. I am pretty nerdy by anyone's standards. But I don't really buy into Nerd Identity (which seems just as silly as Gamer Identity), and have virtually zero sympathy with social or political activism based on that identity.
At least Gamer has a distinct activity that you could plausibly argue defines the identity. Nerd doesn't even have that!
> This led to me quitting the SSC comments section and reading SSC itself mostly as hateread, even though I read many openly-much-further-right sources. It should tell us something that the average commenter at *The American Conservative* is significantly to the left of the average commenter at The Other Subreddit.
>Conversely, I also think that the sneerclub crowd is a little overquick to dismiss ideas simply for being weird I think one problem I constantly encounter is that what you consider to be "weird" depends on what you think of history. A lot of the "weird" ideas stanned on SSC and elsewhere are actually very old, very basic ideas that are being rebranded for a new audience.

Many years ago, I was a friendly acquaintance of a guy who went on to become one of the top people in LW/MIRI for a while, and I followed him in.

It seemed like a smart group of people having elevated conversations and trying to improve their thinking. I still am an enormous Kahneman fanboy, and it seemed to be in that spirit, although I always disliked Yudkowsky and never got that into it as anything more than one of a few regular check-in spots on the WWW.

After I’d sort of tuned out from LW, I stumbled on a few SSC pieces (“The Toxoplasma of Rage,” “Meditations on Moloch”) that I thought at the time were pretty sharp and important to share. I came away with a relatively positive impression of SSC, although I noticed the comments section was a sewer. I also noticed that my friends never had much to say about the SSC or The Last Psychiatrist links I was sending them.

At the same time, I was going through a protracted personal crisis and befriended a couple of Social Justice-type people who helped me tremendously and at the same time inspired me to reckon with my own prejudices. I sorted through my unexamined reactionary and fearful beliefs and realized a lot of them no longer suited me.

A lot of stuff online that I had been interested in but uneasy about, I completely lost my tolerance for. It obviously had nothing useful to say about the real world I was living in.

Now it’s 2018, and I’m profoundly disgusted with conservatives in general and Trump apologetics in particular. I read “You Are Still Crying Wolf” and there are just so many holes in it, it’s clearly the work of someone who still desperately wants to believe that green-haired undergrads who are mean to him on Tumblr are what’s wrong with the world. Scott and the IDW are making a cottage industry out of waging the Culture Wars while there are real and desperately relevant problems in the world screaming for our attention.

I got into Chapo Trap House and had a lot of fun for a bit, taking the piss out of all these people I wasn’t sure why I used to respect. From there, I found Current Affairs, and eventually ended up here.

It feels good to laugh again.

Oh yeah, you used to read Ribbonfarm too, right? I wish there was a SneerClub for that.
Oh, yeah. I still don't consider it as toxic as LW or SSC - Venkat has some seemingly genuine humility about him - but it's of a piece. I've certainly irritated a few people by linking to it.
I think you're right about the humility. I think Venkat's other strength is that he doesn't shit on other modes of experience as a means of learning as much as the rationalists do. And while he's not funny, he at least appreciates humor and playfulness, even if he is ultimately still a terrible person. Plus, you can tell he got out of the house more than most people on LW/SSC.
How bad are Ribbonfarm comments? Scooter's are such an abyss that I generally ignore all of these people's now, but they're a decent barometer for who's reading and influencing the blogger.
I used to be in a fb group called Bay Area Refactorings. It was actually really good for commentary on tech stuff and all, but I had to leave after the election. It was all white dudes downplaying the role of racism and as one of the very few POCs in the group, I was getting really sick of being dogpiled by Sam Harris fans who insisted that talking about racism is what causes racism. I think RF groupies are less stuck in their heads than the majority of the rationalist crowd, but they're still really bad from a leftist perspective.
Why not expand SneerClub to cover it? Seems relevant enough.
What I meant is, I wish other people of the Sneer hated RF so I wouldn't have to do the work of hate-reading it and finding things to post. /r/SneerClub is a gift because people here hate read SSC for me, so I don't have to.
> I got into Chapo Trap House and had a lot of fun for a bit, taking the piss out of all these people I wasn't sure why I used to respect. From there, I found Current Affairs, and eventually ended up here

See, I got into Current Affairs because of SSC. And from there into The Michael Brooks Show and Chapo and the Dead Pundits Society and a bunch of other left wing podcasts. So SSC has actually made me more left wing and introduced me to more left wing media. In fact I only found SSC because of Scott's anti-reactionary FAQ. I was arguing with a reactionary at the time, and I was looking for a refutation of reactionary points and there it was. So SSC has been a pretty big net leftward influence on my thinking
I have some gratitude to Scott and to Sam Harris for turning me on to the work of the people they feud with. Engaging directly with their critics has worked out better for me than it has for them.
Chapo used to be a lot more fun but now the crowd there is deadly serious.
It went through a lot of different phases in a short period of time. It was dominated by obnoxious comedy-dork Cum Town fanboys, then ultra-PC kids, then violent tankies, and now I barely recognize it.
Chapo was a different game last summer. I guess a year or two ago the dirtbag left was new and different. But over time it fell into the trap a lot of leftwing politics falls into, which is that after a critical mass the ratio of critics to optimists flips and then the fuckery enters
It was a great sub last summer, when new posters were coming in from different factions of the left, but most were still fans of the show and fluent in its humor. That sub and podcast were a big part of how I stayed sane through the Unite The Right shitshow.
Yeah same here man, Chapo was like one of the only things I had going for me last summer, it really sucks it's all fallen off. I guess that's part of the game though too, we get these little bursts of useful creativity and peace and then we head back into the struggle and use what we learned. Imo the only proper structure for a leftwing cultural organization is something like a dandelion. Anything more permanent and the rot sets in and undermines the whole project.

For me it was something I inherited. Eleven (or thirteen, I forget) generations ago, my patrilineal relative, a captain in Washington’s army, sneered at the British in the Revolutionary War. And this tradition has been passed down to me from my forbears. This is why I also like revolutions. But it doesn’t explain why I like volcanoes.

All traits are, after all, hereditary.
Except passion for volcanoes. That's new.
probably just a freak mutation

I’m a social democrat who regularly reads The National Review, The American Conservative, various other conservative sites, etcetera. It’s probably bad for my mental well being, but at least I’m not surprised by right-wing arguments like some of my more sheltered friends are.

SSC was interesting, as a place with relatively smart centrist to center-right takes in the comments, but those quickly got flooded away by the skull measurers, free speech fanatics, and haters of any feminism beyond ‘OK, maybe they get to have their own checking accounts.’

Have you ever read Corey Robin's The Reactionary Mind? That clinched a lot of half-verbalized thoughts I had about conservatives. The whole mentality makes a lot more sense now, and I'm way less surprised by right-wingers.
I'm in a very similar place, although the reactionary comments don't discredit the space as a whole, for me. I mostly just ignore them, unless I'm looking for an entertaining argument at the moment.
American Conservative has a lot of good anti-war stuff, I read Daniel Larison regularly.
Have you found a replacement?

[deleted]

I agree with most of this, except for the 'sincere dedication' bit. Being in good faith is, and must be, a *defeasible* assumption; and I believe by this point Scott and much of his following have, in fact, defeated it. I didn't start out thinking that though, I concluded it after seeing the process you just described happen over and over.
[deleted]
That's fair, I'll accept that. It's interesting that that was in 2015; I notice a lot of his good stuff is from 2014-2015, and often seems to reflect on what he has become by this point. Rather tragic, really. No doubt a factor in my assessment is also that I share *zero* of Scott's grievances about social justice (whatever that is anyway), which probably biases me unduly against seeing him as at least trying good faith.
> when feminists say they feel like women aren’t being treated as people, I’m tempted to say something like “the worst you’ve ever been able to find is a single-digit pay gap which may or may not exist, and you’re going to turn that into people not thinking you’re human?” Jeeeeeeeeez
Can you believe that this guy's a psychiatrist? I feel so sorry for all his female patients.
> I'm not really a sneerer but I do hang around here and comment. I subscribe to a lot of SSC's popular ideas, up to a point (e.g. regarding the infamous train-wreck debate involving bell curves).

> I think this comment speaks fairly for my views

But this comment was upvoted and discussed seriously. How does that add up with the "r/slatestarcodex isn't charitable towards leftists" narrative that is being pushed here? I've never seen high-effort leftist arguments being downvoted for no reason.
[deleted]
Fair enough, but from my impression the Overton window in SSC is very wide and the standards set for discussion (high effort, avoiding personal attacks) are generally pretty good. Are there any other, more leftist-inclined subreddits that have a similar breadth of Overton window (where you will not be automatically downvoted for HBD or anti-feminism, for example) and similar or better discussion standards? Because it just seems that sneerclub is more like a safe space for leftists who feel they are being outnumbered in the SSC community, while leftist communities usually treat conservatives much worse than how SSC treats left-wing people - so I find the complaints somewhat dishonest.
[deleted]
Thanks for the detailed replies. So basically my takeaway of what you're saying is that Sneerers see right-wing people as evil and don't see any reason to have rational discussions with them: so their response is sneering, mocking and shaming in order to make the other side look low-status instead of really engaging with the ideas. Honestly that's just depressing, and as a centrist it pretty much automatically pushes me to the right. I honestly feel like beyond everything it's bad tactics: when people see one side engaging rationally with ideas, and the other side using [Current Year] arguments and mocking instead of engaging with them, the people who will be convinced by this and move to your side will be the people who respond well to these kinds of tactics - mostly people who are very status anxious or not very serious about how they construct their map. In the long term it's a losing strategy, and the IDW is a straight reaction to these tactics. More and more powerful people are just going to be crypto-right-wing until it explodes in the face of the left. The rationalist community deserves much better criticism than what this sub can provide.
[deleted]
I get the nuance but does it really make that big of a difference? The rationalist community is one of the most open-minded and diverse communities there is, and high-effort leftist posts (like yours) can actually convince people and change their minds. So instead of trying this, the Sneerers choose to create their own echo chamber in which they don't convince anyone but the already convinced, and mock other people who are actually trying to do things they (even misguidedly) think are good, or even having discussions. Hard not to come to the conclusion that, as EY wrote, most of these people are made from the same material as r/milliondollarextreme posters, just from the other side of the political spectrum. SneerClub doesn't add anything valuable but laughs and giggles for leftist sociopaths (in this very post OP mocked EA for saving kids from malaria). From reading your comments it seems you're in the wrong company.
> The rationalistic community is one of the most opened minded and diverse communities there is and high effort posts leftists (like yours) can actually convince people and change their mind. This is simply untrue, though. The longer you hang out here the more you'll hear the same story over and over again. Quoting **pipster818**'s post from higher up this thread: >I used to be pretty into SSC, but I got increasingly uncomfortable with some of the commenters it attracts and encourages. I realized I had three options: >1) Continue tacitly supporting (or at least, not opposing) reprehensible opinions while engaging with the rationality community in apolitical ways. >2) Argue against people more often, and become known as a virtue signaling SJW cuck. I tried this briefly, but I found that all of a sudden their principle of charity didn't really apply to me any more. >3) Officially leave. >I ended up going with option 3, which left me with nowhere to go besides /r/sneerclub. [Here's another good one,](https://www.reddit.com/r/SneerClub/comments/818b04/less_of_a_sneer_and_more_of_a_cathartic_rant/) here's the [top voted post of all time](https://www.reddit.com/r/SneerClub/comments/7gnzdb/is_it_the_people_or_the_philosophy/). They always say the same thing. It's by far the most common "origin story" for sneerers - we're mad because we feel betrayed. We believed you when you said you were charitable and open-minded, then you called us an SJW or a "virtue signaller" or a "conflict theorist". And we each eventually realized that when you talked about how important the "principle of charity" was, you meant that it was important for us to be charitable towards you. >SneerClub doesn't add anything valuable but laughs and giggles for leftist sociopaths Case in point.
There is not a single community that is perfect, but I challenge you to link to a more open-minded and more charitable community than the rationalist community, and I'm not saying it to prove a point - I would love to participate in such a community and maybe I'm just not aware of it. But from my experience most leftist communities just downvote to hell everything that is even somewhat right of center, even if it's level-headed, rational and generally makes sense. The rationalist community does react negatively to very common leftist identitarian arguments a la "You're arguing it as a privileged white man". But that's not due to content but due to form. The same types of right-wing arguments will be downvoted to hell too ("You're saying it just because you're a low-IQ black person"). I might be wrong though, but I didn't see any examples here of high-effort and logical leftist posts getting downvoted or attacked. For me it just seems that people are pissed that their usual ad-hominem leftist rhetoric is not accepted well. But honestly, I'm quite happy that this shit doesn't pass.
> I challenge you to link to a more open-minded and more charitable community

> For me it just seems that people are pissed that their usual ad-hominem leftist rhetoric is not accepted well. But honestly, I'm quite happy that this shit doesn't pass.

It cracks me up that you're posting on SneerClub right now and we're politely engaging in rational discussion with you (you haven't been downvoted or silenced or called racist or anything you complain about), and meanwhile your posts are equal parts 1) boasts at how much better and more open-minded and charitable you rationalists are, and 2) bitter sneers aimed at us dastardly leftist sociopaths who do nothing but throw around ad-hominems and rhetoric. You *perfectly* embody the Rationalist community.
The only reason it's not downvoted to hell is because it's in a side thread on an already buried post, and because I made a point of commenting to someone who actually seemed reasonable and who I thought could give me an honest perspective on this sub. Now you guys claim the rationalist community is pseudo-charitable and open-minded; I disagree, and none of you is willing to back up that claim by pointing to leftist communities that are better.
> Now you guys claim the rationalist community is pseudo-charitable and open-minded; I disagree, and none of you is willing to back up that claim by pointing to leftist communities that are better.

(edit) You know, in retrospect, the post you're responding to works fine as a response to this one as well. So again:

It cracks me up that you're posting on SneerClub right now and we're politely engaging in rational discussion with you (you haven't been downvoted or silenced or called racist or anything you complain about), and meanwhile your posts are equal parts 1) boasts about how much better and more open-minded and charitable you rationalists are, and 2) bitter sneers aimed at us dastardly leftist sociopaths who do nothing but throw around ad hominems and rhetoric.

-----------

(Also, quit trying to pivot the argument to the hypothetical charitableness of left-wing communities just because you're losing this one. I'm not taking the bait.)
There are multiple people reading this thread who could have downvoted you but didn't.
So you show up here to insult everyone, blithely assume that you are the ultimate rational judge of what is "level-headed, rational, and generally makes sense", have no conception that people might disagree on those criteria, and then get whiny when people won't instantly agree with you? You're a shite rationalist.
As someone participating in a subreddit whose sole purpose is to mock and insult members of another community, I don't think you get to hold the moral high ground. As for me: as they say, "When in Rome, do as the Romans do."
Actually that's not all it does; it also provides a place to discuss the failings of the Rationalosphere and how they could be overcome, and a refuge for those who have had enough of a bad crowd.
90% of the posts here are in the form of "look what these ridiculous rationalists are posting now". Calling the majority of the stuff here "discussion" would be an insult to real discussions. It's another r/milliondollarextreme or r/IncelTears or /r/beholdthemasterrace, or like the late r/FatPeopleHate: subreddits intended to mock other communities. That's fine - you're in great company. No need to try to sugarcoat anything.
Like most Rationalists, you seem to lack any sense of proportion.
Never, never in a million years would it occur to you that 'charity' is actually subjective, would it. I find /r/sneerclub pretty congenial, and I wouldn't find /r/ssc so. Different folks like different things. It's unpleasant to be mocked, but if someone being mean to you online is the worst thing that's ever happened to you, you're very lucky.
I like how you extended me the principle of charity and characterized me as 'mocking EA for saving kids from malaria' when I went on to explain that I continue to donate to charities and even research the ones I give to thoroughly, and made it clear that my beef with EA is that formulations of the "I gave $100, therefore I saved a life" kind are ridiculous. And by "like," I mean that I am amused that you whine about leftists not being charitable while failing to show charity yourself. This is even more hilarious because I volunteered in Tanzania just last year.

A brief summary of my time as an SSC reader:

“This is a really interesting article, I wonder if the author has written anything else.”

“Hmm, not sure about this one; seems like the conclusion reflects the biases of the author more than anything.”

“Wow, that argument is incredibly unfair to feminists. Wasn’t he talking about how important charity is just last week?”

“Hold up a fucking second, is he really arguing it’s not racist to believe some races are inferior to others?”

“How on earth did I ever find any of this bullshit convincing?”

1 month in: a lot of these people are real shitty but at least I learn something new half the time

3 months in: a lot of these people are real shitty but at least I learn something new a third of the time

6 months in: a lot of these people are real shitty but at least I learn something new a tenth of the time

1 year in: a lot of these people are real shitty and I keep seeing the same stupid shit posted over and over and I haven’t learned anything new in months

I wrote a [screed](https://www.reddit.com/r/SneerClub/comments/7gnzdb/is_it_the_people_or_the_philosophy/) about my sneerxual awakening and this sounds pretty similar, except the "I learn something new" phase didn't last very long, while the "maybe I'm still having a productive dialogue with people who think differently?" phase lasted much longer than it should have. After the calamity of November 2016 I decided it was time to try kindness and understanding. But it took me way too long to realize (1) dialogue in r/SSC was only decreasing my respect and patience for the other side, which was the opposite of my goal, and (2) their weirdass cult doesn't represent any substantial portion of the American electorate, just the internet alt-right, and I now understand that subculture as well as I care to.
Hahaha, I don't think I've made it through more than one or two SSC posts. He's so verbose and whiny at the same time.
OH sorry this was my journey here from the subreddit. SA annoyed me the first time I saw him talk about the gray tribe.
I don't even want to look at the subreddit. SA's commenters are so much stupider than he is.

I don’t really fit any of the normal rationalist profiles (other than being a white guy). I only know about the Rationalosphere because I have an ex who is really deeply into it and has many opinions I find incomprehensible, so I decided to check out what’s going on with that. I’ve not been pleasantly surprised.

I got linked to HPMOR when it was trending as a ‘wacky new internet thing’ about Harry Potter being more scientific. That took me to lesswrong and I got into it for a few weeks or months until I figured out that EY’s rationalism jargon was his own made-up system and zero people outside his circles used Bayes’ Theorem the way he did and HPMOR was a transparent and clumsy repackaging of his ideas.

I didn’t really think about it for years except as ‘that weird rationalism cult that lured people in with Harry Potter fanfiction’. I’ve run into SSC posts for years and generally thought they made good points, but around 2016 I started reading it regularly and I got more and more irked at the sheer obtuseness and blindness to Scott’s biases (Crying Wolf was probably a big alarm bell).

> zero people outside his circles used Bayes' Theorem the way he did

One of the weird things for me, as an actual scientist IRL (or pseudoscientific LARPer, according to r/SSC and its total lack of irony), is that all these people go around calling themselves Bayesians while their approach to knowledge is "Hmmm. One side says racism still exists and the other says it's all an elaborate Marxist-feminist conspiracy to cuck us with soy products. Let's treat both sides as equally likely until we examine the evidence." In the same way these self-described utilitarians [magically become deontologists](https://thingofthings.wordpress.com/2017/09/23/deontologist-envy/) when the subject is freeze peach, they go frequentist about anything to do with ocialsay usticejay.
What kind of scientist are you, by the way?
I do genomics, so the HBD stuff is kinda surreal for me. It reminds me of the previous culture war, when I occasionally wasted time dealing with creationism.
Can you recommend any articles or books that do a good job of countering the HBD narrative?
[What Went Wrong: Reflections on Science by Observation and The Bell Curve](http://repository.cmu.edu/cgi/viewcontent.cgi?article=1302&context=philosophy) is solid. [Links to critiques of Genome-Wide Association Studies](https://en.wikipedia.org/wiki/Genome-wide_association_study#Limitations). (All the rage for demonstrating the link between genetics and IQ/scholastic achievement/wealth/etc. All those studies are complete garbage IMO, though.)
Seconded. Maybe it's been done here before-- links to someone else's recommendations are good too.
I find it weird the way 'Bayesians' boast about their predictions about random subjects. Do you expect me to be able to assess some online person's self-claimed level of successful predictions well enough to perceive their credibility in general? Even if I assumed they're honest and have a decent sample size of predictions, and could weight them by impressiveness, and evaluated which failures really updated their priors, I still have no way of measuring what predictions they *didn't* make. The field of 'predictions' outside a pre-selected area of expertise is infinite.
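The unreported-predictions problem above is just textbook selection bias, and a toy simulation (entirely made-up numbers, not anyone's actual track record) shows how bad it can get: a "forecaster" who predicts coin flips at pure chance, but only publicizes the hits, ends up with a flawless visible record.

```python
import random

random.seed(0)

def reported_accuracy(n_predictions, publish_only_hits=True):
    """Simulate a forecaster with no skill: each prediction is a 50/50
    coin flip. If publish_only_hits is True, only the predictions that
    happened to come true make it into the public track record."""
    outcomes = [random.random() < 0.5 for _ in range(n_predictions)]  # True = hit
    published = [o for o in outcomes if o] if publish_only_hits else outcomes
    return sum(published) / len(published), len(published)

acc, n = reported_accuracy(1000)
print(f"Published track record: {acc:.0%} over {n} predictions")
# The underlying accuracy is ~50%, but the visible record is 100%.
```

Unless you can observe the denominator (everything they predicted, not just what they chose to show you), the published hit rate tells you nothing about calibration.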

I am actually still pretty rationalist-adjacent and occasionally post on lesswrong. But a lot of rationalist-stuff is very painful bullshit, and sneerclub is perfect for both pointing this out and laughing about it. Politically I’m pretty leftist (for Kraut standards, that’s probably outside the US Overton window).

At its best, the rationalist community produces beautiful analogies that are perfect for me, e.g. “evaporative cooling of group beliefs”. But an analogy like that is only good for people who come from a technical background: if you don’t already have an intuition from physics, then you’re better off listening to people who know what they’re talking about.

But then there is the endless morass of self-help, cult-building and right-wing proselytizing (especially Scooter in the last year or so). And this is before you look at the comment section and begin to weep. I’m not invested enough in the community to make an effort to push back on this, so I sneer on the BS, sometimes join good conversations, and drop out once the signal-to-noise ratio gets below acceptable. Recently, I find myself commenting more on sneerclub than /r/slatestarcodex or lesswrong (which is imo much, much better than /r/slatestarcodex if you ignore 2/3 of the threads, which is easy to filter).

Regarding all this race-IQ stuff: I really don’t get it. Why do people care at all? Why did scooter have to put up this giant honeypot for racist idiots?

I agree actually that LW generally seems to have higher standards than r/SSC, when they're not talking about any Bay Area Cult stuff. Generally I think the order is LW > SSC > r/SSC.
> r/SSC

fucking /ratanon/ is more readable than r/SSC

“hmm, this horrible abusive person I know is sharing links from a site called “less wrong”, probably doesn’t reflect on the site, eliminating biases is good!”

“Wait, what’s this “roko’s basilisk” thing”

“This yudkowsky dude is insanely overconfident and seems to actually be trying to undermine the scientific method to support overblown AI fears”

“This Scott Alexander guy seems pretty reasonable though”

“Wait nvm he’s actually a total prick”

“Welp, now the horrible abusive person is running the local lesswrong group.”

I stumbled across some SSC blog post from hacker news, probably, and found it interesting. So I did what I do with most interesting blogs I stumble across: read a few dozen things from the archives. Scott seemed insistent that I read something called “the sequences”, so I found a PDF copy and did that too over the course of a long vacation.

I remember both really liking the idea of rationalism and immediately connecting its ideas about overcoming bias to my existing anti-racism and anti-sexism skillsets. At the time the connection seemed so obvious to me that I assumed it must be a common interpretation.

Then I started reading the comments, then participating in the subreddit. It didn’t take me long to realize that the version of rationalism in my head bore only a faint resemblance to rationalism as it is practiced in the wild. That led me to vent my frustrations here, and I’ve been sneering ever since.

> immediately connecting its ideas about overcoming bias to my existing anti-racism and anti-sexism skillsets.

ahahahahaaaaa /u/_vec_, you're a good soul.
I honestly thought for a long time, even after being linked to it more than once, that "Overcoming Bias" was a blog about racism/sexism/LGBTphobia/etc. because of the title. (Which of course is a totally appropriate concern for people who want to be rational, what with getting over your instinctive cognitive errors and all that, but being rational seems pretty distinct from being Rationalist.) Lol me.
Intersectional feminism and the like have *loads* of material that's *totally* relevant to working toward their stated objectives. The fact that all of it is mysteriously taboo for them, while Steve Sailer and TakiMag are considered fair and objective sources of useful information, is just maddening.
It's almost like that's the primary goal of gender studies.
You mean it's not about ostracizing socially awkward center-right nerds and ultimately throwing them in supermax prisons because they're bad at flirting? Bummer.
If they believed their own BS about evolution and evolutionary psychology, they should lie down and accept that mother nature has not deemed them worthy of reproduction.

I used to be pretty into SSC, but I got increasingly uncomfortable with some of the commenters it attracts and encourages. I realized I had three options:

  1. Continue tacitly supporting (or at least, not opposing) reprehensible opinions while engaging with the rationality community in apolitical ways.

  2. Argue against people more often, and become known as a virtue signaling SJW cuck. I tried this briefly, but I found that all of a sudden their principle of charity didn’t really apply to me any more.

  3. Officially leave.

I ended up going with option 3, which left me with nowhere to go besides /r/sneerclub.

I don’t really belong here though. I’m still more or less the same type of person as most rationalists, just with more left wing views on most subjects. So I’m not meant to be a sneerer, and I’ll probably be moving on soon.

> I'm still more or less the same type of person as most rationalists, just with more left wing views on most subjects. Isn't that most people here?
I don't even think I'm very left-wing. Despite everything I still want to believe stuff like capitalism and free speech and being "charitable" might actually work, in most situations, if they weren't rigged to benefit the powerful few who disingenuously weaponize those terms to keep the system unfair. My more politically passionate friends call me a fucking centrist, or even a liberal (!). But I guess by white male anglophone millennial programmer standards that puts me just to the left of Rosa Luxemburg, so idk. It's contextual.
I dunno about capitalism. But for the more process-related stuff, it's why I always distinguish rationalism from online Rationalism. Cause what they profess and what the internet community does are really different things, almost unrelated. And I think there is merit in the original process-oriented ideas (I guess they'd call it "meta level"), if reoriented away from the community attitudes and shibboleths, and its powers used for good. Honestly, as a pretty dedicated leftist, I think a bit more rationalism (but not Rationalism) would be quite healthy for the left.
I'm not sure how capitalism would work without capital accumulation. Capitalism creates and maintains a stratum of powerful people, and the powerful in turn structure capitalist systems to benefit them. That's not "rigging" - that's how the system works.
I can't really get a solid grasp on what rationalists think besides the reactionary positions so I don't know how much I overlap with them to be honest. But considering that I'm pretty anti-realist leaning and sympathetic to a lot of STS and Phil Sci I'm guessing I don't overlap with them much at all.
Well, me too. But I meant rationalism with a lower case. I think the awareness of bias, and the striving to get better at overcoming it, is valuable and interesting. Learning stuff about the operations of reasoning and its fallibility is worthwhile. And it can be used to reflect on oneself and on things one thought to be known data. That's what I interpret rationalism as such to be about, aside from the Rationalosphere online.
Ah, I see. I definitely see the value in that then.
My opposition is partly aesthetic. They have a static view of the world that ultimately tolerates no ambiguity and that's just...boring.
I'd like a place for rationalist SJW cucks.
>a left-wing rationalist

Aw man, you're like a unicorn. Any insight on why so many of your fellow rationalists have gone right-wing? What kept you from agreeing with them?
I still can't tell if they were always right-wing and it was just hard to tell, or if the so-called 'gray tribe' has slowly been descending into red.
A little of both, I think. It's always been a bit small-c conservative but it doesn't seem like it got out of hand until the trolls managed to frame believing in racist pseudoscience as a test of your intellectual integrity.
Plus, Trump is crack rock for reflexive contrarians.
I think SSC appeals to a lot of people from left-wing backgrounds who have a contrarian streak and/or a victim mentality. Then it started slowly spiraling into a more extreme version of that. I think there's also a dangerous tendency on the right to see the left as a monolith, and to assume that the most extreme members of the left speak for the majority, which leads to undue paranoia and increased radicalization. (The left certainly does this to the right also, but it's important to realize that it happens in both directions.)

I guess I just get tired of any community that tells me I have to act a certain way or agree with certain opinions. Unfortunately I get that in sneer club too. For example, I actually think that by fanfiction standards, Yudkowsky is quite a good writer, but that's contrary to the purpose of this sub.
>I actually think that by fanfiction standards, Yudkowsky is quite a good writer, but that's contrary to the purpose of this sub.

Yeah, credit where it's due; both he and Scott are actually not bad at fiction writing. Scott struggles with character and he'd be constantly told to "show, don't tell" in a Creative Writing course (don't have your character say "I'm thinking about killing you", have his eyes linger on the knife), and Yudkowsky struggles with dialogue (either that or everyone is supposed to be a Vulcan), but I've read worse. Scott's really good at structuring plots so the ending has a satisfying payoff, which is a skill plenty of published authors lack. In both cases their writing quality plummets when they get to the intended "message" of their work, though - the story stops to make room for the Ayn Rand style monologues about the importance of whatever bullshit they're trying to push. And both are far better at the short story than they are at the novel (Scott's lack of ability to write believable characters makes Unsong an impossible slog).
Scott has written fiction?
[Yup, quite a bit.](http://slatestarcodex.com/tag/fiction/) He's even written a [novel](http://unsongbook.com/). The shorter ones are OK, but the longer stories tend to drag due to flat characterization. Also; you need a high tolerance for 1) shitty puns, and 2) an author who is really, really impressed with himself for coming up with shitty puns.
I skimmed the first few paragraphs of the novel. It wouldn't be bad if Scott just stuck to writing world-building guides for tabletop RPGs.
Well, by fanfiction standards, sure, Yudkowsky's not bad, but that's a low bar. Anyway, the unfortunate thing about any community is that it's going to develop standards. I got banned from /r/latestagecapitalism for saying that maybe if the world had been less anti-semitic and actually accepted Jewish refugees before and during WWII, Israel wouldn't exist now. It's all infuriating, but I guess the only thing you can do is find the community that irritates you the least.
I read about 23 or so chapters of HPMoR. I actually found it way better written than I expected, if still incredibly boring and obnoxious in content. I think the Sequences are garbage though, even rhetorically. I'll admit though that I also strongly dislike Yudkowsky at a personal level so keep that in mind.
Let me know where you moved to - I'm at the stage where I just left /ssc and just discovered /sneerclub, and I can see myself leaving here eventually as well.
I mainly hang out at /r/pipreddit lately, but I don't think that's a common trajectory. The thing is, I think subreddits and communities like this tend to develop either around or against notable people. Yudkowsky went out and wrote the Sequences and HPMOR, and depending on how much people like his writing, they either end up at subs like /r/lesswrong or /r/rational or /r/slatestarcodex (if they like it) or subs like /r/sneerclub (if they don't). People like me don't really fit into these categories. And because so far no one in the left-rationalist or post-rationalist category has attained the stature of Scott or Eliezer, I don't think there's much basis for a strong community to form yet. You've also got to wonder whether it's even a good idea to spend your time in a community built around a certain person or certain concept. It's possible that it's best to just do the hard work of being an individual.

I donate to EA stuff, but wasn’t rationalist. After reading a few of the better LW/SSC (mostly SSC) posts - Weirdtopia, Untitled, the Anti-FAQs, Toxoplasma, some of his stuff about basic income -

I thought maybe even if I wasn’t rationalist, heading to /r/ssc would be interesting, because at least it would mean talking to people who read some of the same material I did.

And, it turns out, it was a little stranger than I expected. More culture war from the right, less examination of structural/social biases. Arguments just got circlejerked into oblivion. But sure, optimistic younger me thought, of course these people are just genuinely looking to get their viewpoints challenged.

That optimism faded with the onset of screeching about liberals/scientific racism, but I figured at least Scott was still a writer worth following.

Until I sort of realized how far right he was starting to go. He went from someone who explained left-wing things to right-wingers in a way they’d understand, and who’d explain right-wing things to left-wingers in a way that they’d understand, to someone who sort of just kept talking about how utterly terrible SJWs were and how Peterson was a misunderstood genius and Chesterton’s fence was the reason we shouldn’t change things and maybe race was tied to IQ -

I don’t really get why he went off the deep end recently. Did he empathize with the types of young nerdy white guys who found Peterson as their new prophet? Has he surrounded himself with loony alt-right types online, so that he doesn’t really understand what’s objectionable about them / what’s good about the left, and needs to pull back?

I don’t get it. I miss old Scott.

I think if you're the kind of person who dismisses structural/social issues, when you see #metoo/BLM getting attention, it _has_ to be because they are fundamentally wrong. And ever since the election, civil rights issues have come up in a way that they haven't before. What did they say, in the most recent SW movie...darkness rises, and light rises to meet it.
Nerd politics is bad politics.
Scott was always bad, he's just more confident in his badness now.

I was a teenage math nerd and reddit atheist who took a hard left turn into the humanities and gradually became convinced over the course of my PhD that the Marxists and the post-modernists were of much more value than I’d realised. I used to be much more willing to give rationalists the benefit of the doubt, since I’d started off as one of them, but Scott’s first post about Ashkenazi IQ tipped me over the edge. I didn’t mind them so long as they played by their own rules, but it gradually became clear that the whole thing was just a cheap excuse to bash leftists, women and ethnic minorities, with actual philosophy a secondary consideration.

I disagree with you about EA, I think it’s pretty good (with some caveats obviously). I think the idea of doing the most good possible makes sense and saving people from malaria does more good than donating to a museum.

I agree that being mindful of how you do good is important and that some ways of doing it are better than others. I even agree that numbers can be useful in helping to tell which forms of charity are better than others. I disagree very strongly that the EA formula of optimizing for lives saved per dollar spent is the best way to do it.

It wouldn't occur to an EA practitioner to consider whether capitalism is the problem, or even whether American imperialist ambition is a problem. They completely ignore the political and structural dimensions. Like, what would the EA approach to helping Puerto Rico be? Finding the most effective charities there and donating. Whereas, in the long term, helping them achieve statehood would be better, so they're never neglected by the feds like this again.

And they don't consider secondary effects of the charities themselves. Like with the malaria nets - people use them to fish instead of hanging them up to protect themselves from mosquitoes. This does two things: 1. since the nets are often treated with insecticide, drinking water now gets contaminated; 2. free nets == overfishing == eventual economic distress.

One of the other darlings of the EA folks is the de-worming of children. The idea is that parasitical worms in the gut will retard child development, so giving children medicine to rid them of worms should improve their growth and school performance. The original study that came to this conclusion has not been replicated, but in their typical fake-scientist way, EA people will keep donating to the de-worming charities.

Doing the max good just isn't easy, and EA is offensively glib about it. Reality is hard to quantify and pin down, and they're bad at recognizing how that applies to charity. It's hard to figure out _what_ the maximally good action is, and it's hard to make sure that it won't have negative tradeoffs.
So, to be completely clear--people should try to give as well as possible, but people should also recognize that improving the world is not as simple as applying a mathematical formula.
Eh, the world is very uncertain but that's not a good reason to not even *try* to quantify. The sort of analysis that Givewell does should be a major consideration, even if it's not the only consideration. They are also well-informed enough consider crowding-out effects (e.g. if we donate bednets, will the government/Gates cut back on bednets) which is totally beyond a normal individual. It's possible that EA as usually practiced doesn't scale very well. Sheldon Adelson possibly could get statehood for Puerto Rico. But we couldn't, even collectively. For us, donating to the most effective-seeming currently existing charity probably is the best move.
Like I say elsewhere, I'm not opposed to trying to quantify the effects of charitable giving. I agree that it's good to take a look at GiveWell and CharityNavigator. I've done that and will continue to do that. What I'm opposed to is the overwhelming confidence EA'ers have in the power of quantification. For the fiftieth goddamn time, it's the lack of humility about their numbers and their ability to understand a situation that I find terrible about EA lovers. My preferred solution is to only donate to causes that I am well-informed about. I do not believe that it is within the grasp of most people in one country to be sufficiently well-informed about the affairs of a different country to donate effectively.

> But we couldn't, even collectively.

Boy, it sure is a good thing Susan B. Anthony, Margaret Sanger, Toussaint L'Ouverture, MLK and Malcolm X, and Nelson Mandela never listened to you. I'm going to tell everyone I've ever canvassed with and phonebanked for that we should stop trying to get sensible politicians elected and just donate to some water charity in Tanzania.

> crowding-out effects (e.g. if we donate bednets, will the government/Gates cut back on bednets)

> But we couldn't, even collectively.

So which is it: are we capable of collective action or not?
It reminds me of individualist environmentalism.
Yes exactly!
eh, the ideal is fine but they have a bad track record of donating millions of dollars to incestuous meta-charities and "AI ethics" groups
What a coinkydink that their utilitarian calculations lead them to supporting their computer hobby.
All charities are like that. I am not an EA person. I give money to one charity and that is the one which gave me a scholarship to college. Wholly irrational, but a well run organization. Plus I get invited to outings at fancy golf courses.
I think EA makes sense for things that are commensurable. I don't think you can meaningfully compare donating to malaria nets with donating to culture. I do think you can meaningfully compare the added value of X amount of further resources put into Y type of healthcare compared to Z type of healthcare.
The problem is that nothing is as commensurable/quantifiable as rationalists seem to think it is. I do agree with you otherwise.
When (if?) I finally start my left-oriented meta-rationalist site (which I will promptly abandon out of my extreme laziness), I definitely intend to write something on the way a lot of engineering/tech 'solutionism' does not understand what commensurability is and how this makes them constantly propose simple and irrelevant solutions and false utilitarian calculations.
I'll be there whenever (if?) you start it. In the meantime, are there any books or blogs that you recommend? My distaste for the rationalists has gradually grown to encompass STEM people in general, and has even started to encompass analytical people in general. It wants feeding.
Well, it's one reason I was thinking of making one, hopefully something collaborative/multi-author. I think there's a bit of a gap in the internet discourse 'market' (how I hate that analogy!) for it.
Well, let me know if you start it. I'd be happy to contribute something about EA, obvs.

I never even knew they existed until I got on reddit like 2-3 years ago, and the first rationalist-adjacent group I saw was the ratheists. By that point I’d been through almost all of my undergrad philosophy curriculum and some upper-level and graduate Phil Sci courses, and couldn’t believe how uninformed some of the stuff I was hearing was. I had also done quite a bit of genetics/genomics/evolutionary biology research at that point and was equally unimpressed by most of the rationalist perspectives on those subjects too.

Are you in academia right now? It's great when actual scientists post critiques of scientist BS.
Yeah, I'm a PhD student in evolutionary biology studying plant genome evolution. I've dabbled in population and quantitative genetics a fair bit too. If you like that, you'll really enjoy [this recent Science interview](http://www.sciencemag.org/news/2018/05/it-s-toxic-place-how-online-world-white-nationalists-distorts-population-genetics) with a PhD student at UMichigan about alt-right misuses of population genetics data to support stuff like HBD. There's a similar story that should come out soon in the NYT science section that I've been talking with a reporter about too.
Amazing! you'll have to post that NYT story once it comes out.
I wish that article were a little deeper, going more into why they're wrong. As it stands it's mostly praising how much they know about it :/
Yeah, the Atlantic article was better. I have high hopes for the NYT article too but it's been taking a bit to finally publish
Hark, fellow biologist!
This is close to my experience, where I discovered reddit, badphilosophy, and sneerclub through an initial antipathy for Sam Harris. The interesting difference between me and many of my fellow sneerers is that I never saw any reason to take the whole thing seriously in the first place.

I’ve never really been a regular reader of SSC/LW/the rest, but I encountered them via doing a CS degree, reading Hacker News, bumping into a rationalist or two, etc. I’m demographically similar to a lot of the people there, and had broadly similar interests as a teen etc. but was solidly left wing by the time I encountered those spaces. Having also read GGS as a teen, it may be bunk but as far as oversimplified models of human history go it’s probably one of the less toxic ones to absorb early on…

The first SSC post I solidly remember reading in full was the Meditations on Moloch one and I was made pretty uncomfortable by Scott’s willingness to entertain fascist neoreaction as a reasonable set of ideas (strange that he’d want to edit that out…). At that point I was also aware of the whole AI apocalypse cult thing which didn’t exactly endear them to me.

Then at some point around the beginning of the year I bumped into this sub via the link in the badphil sidebar and I like it here. I even occasionally sneak into SSC to try and give the less horrible people there some clues of what’s wrong.

Embarrassingly, I used to be one of those “what if there’s a link between race and IQ?” people.

I don’t see anything to be embarrassed about for asking the question, or even being taken in by the affirmative claims of Murray, etc. It just turns out to be a question we aren’t scientifically equipped to investigate sensibly, and embedded in a field which systematically over-represents its capacity to interrogate our interior lives.

It's an unusually dangerous question, though. Not in the "forbidden knowledge" or "social stigma" sense, but in the "your brain may process this in a maladaptive way" sense. People have been wrestling with essentialist explanations for structural inequalities for centuries. Most of those people probably thought of themselves as smart, careful, honest intellectuals pursuing truth and understanding. They nevertheless consistently found ways to confirm their own biases. Most of their mistakes are obvious to modern eyes, but not because we are any wiser than they were. We merely have the blessing of hindsight to aid us. Our intuition here seems to be very misleading. Our subconscious biases seem unusually pernicious. Our personal experience is rarely representative. Our naive mental models are worse than usual. I'm not an expert in genetics, but I do recognize how woefully unprepared I am to approach this topic with even a fraction of the rigor that it obviously demands. Based on the experiences of people far better equipped, I'd be much more likely to end up confused and mistaken than to learn anything about the world. Rationally, the smartest course of action is to accept my own limitations, back away slowly, and let the experts sort it out.
Even the best geneticists in the world cannot yet approach this topic with a fraction of the rigor that it obviously demands. It's beyond our scientific capabilities for now, and will be for some time. It's probably more productive to point that out to people than to suggest that as ignorant, cryptic racists they're at risk of coming to the wrong conclusion if they dare to consider the question.

I had been sneering at creeptocoiners for months when I saw D. Gerard, author of Attack of the 50 Foot Blockchain, post here. Now I’m here too, sneering at the cousins of the cryptoids.

Actually I’ve been a fully formed communist since birth.

We know.

I wasn’t aware of the Rationalist movement until I started seeing Rationalist blogs pop up in my area of Tumblr a year or so ago. Not only did they consistently write in the most exquisitely condescending tone, they also constantly locked lips with reactionary Christian traditionalist and various “identitarian”/Nazi blogs, and I quickly realized there was something bad about them.

Yud said, by denouncing his critics, that he was beyond criticism. I disagreed.

I have a very high opinion of the rationalists, above all the EAs. But two things drive me nuts:

  1. The assumption that you can’t think rationally about morality (obviously this does not apply to the EAs). This assumption makes many rationalists default to an incredibly narrow contractualism, where only people in your community are morally considerable. It also means that any moral claims are interpreted as ‘boo’-ing.

  2. The HBD debate displays all the usual failures of communication (mottes and baileys everywhere, enough dials to explain away anything, mostly unquantified, no appreciation that a big enough type-M error can become a type-S error, no particularly convincing causal story, unshakable conviction that all criticism is in bad faith). On some level this is to be expected - we don’t know enough about intelligence or genetics to articulate a good HBD story, much less test it. But then it shouldn’t be reshaping our ideas of social phenomena that we understand better, however incompletely.

When I was in undergrad, there was a transhumanist prof with a small personality cult, which is where I first heard about it. I started reading LW quite a bit as I was a psych student interested in cognitive biases and GEB at the time. After a while, I started to realize just how much of it was just repackaging basic ideas from cognitive psychology, decision theory, behavioral economics, etc. sprinkled with some original bullshit on top. I had way too many conversations that sounded like the Pigliucci/Big Yud debate.

I discovered LessWrong and it was so palpably stupid on its very face that I immediately hated it so much I couldn’t look away.

I considered myself part of the skeptic community for some years because I liked Skeptoid, SGU, P&T Bullshit, and a handful of YouTube channels on science and religion topics. (One of them was AronRa, who blessedly turned out not to be a human disaster like many other YouTube skeptics.) I was very much a “le STEM master race” guy and I deeply regret it, and I’ve moved away from those kinds of positions over time. My politics also moved a lot further left and that eroded the whole libertarian aspect that pervades a lot of those spaces. The responses to Elevatorgate and the full realization of the conservative underbelly of the Center for Inquiry were the final nails in the coffin for me.

I had brushed up against Big Yud and LessWrong in the same contexts as RationalWiki searches on occasion, but it was very much a surface level thing and I didn’t realize what was under the surface. I’d also heard about HPMOR and thought it had some cool aspects, but my inherent dislike of fanfic kept me from diving in. “Rationalist fiction” was another thing I’d been exposed to in limited amounts and found really weird.

A couple weeks ago someone darkened my Twitter timeline with a mocking screencap of Yudkowsky’s navel-gazing, about how most people didn’t have deep thoughts and weren’t working to save civilization. I searched some related terms on Reddit, put together some of the pieces, and found you folks.

I used to be a STEMLord too, though unwillingly, mostly due to parents. But I think I was always going to turn into a leftist because my dad has a massive grudge against white colonialists, which I have inherited with a vengeance.
RationalWiki is still largely going "what the fuck happened to the rest of the skeptics". When the Great Atheist Movement Schism happened, I can say we're entirely happy with the branch we took.

(a cut’n’paste from my tumblr, hence the capital-free lowercase tumblr poetry)

i started reading because a friend was signing up for cryonics and was an active participant. (since i am for my dilettantism an actual expert on scientology, mutual friends literally deputised me to talk to him and see if he’d joined a weird cult. my verdict was “not really,” which remains my verdict.) my previous opinion of cryonics was neutral-to-positive, i looked into it and went “wtf is this shit.” the rationalwiki article on cryonics was the main result. i joined lesswrong in late 2010 ’cos it looked fun. went to a few of the meets. was put off attending one ever again by the vociferous support for scientific racism. apparently scientific racism is essential to being considered a true rationalist.

it took me years to realise there was no “there” there, that all the dangling references to references that refer to other references never resolve: that yudkowsky has literally never accomplished anything. he has literally no results to his credit, in his claimed field or out of it. he’s a good pop-science writer, and i highly respect that. i’ve read the sequences through a coupla times and there’s good stuff in there. and he’s written literally the most popular harry potter fan-fiction, for what that’s worth. but in his putative field, his actual achievements literally don’t exist.

and the papers! holy shit, these are terrible! TDT, CEV - i am not a scientist, but i have enough experience as a skeptic to tell when someone’s spent 150 pages handwaving away the obvious hard bit and playacting at science. the best thing to say about them is that none of them start “Hello, my name is Kent Hovind.”

i recently looked up my early comments and i’m amazed how optimistic i was that the weirdy bits could be dealt with by sweet reason rather than being the point of the exercise. “taking ideas seriously”, by which they don’t mean “consider this philosophical notion in detail in the abstract”, but “believe our utilitarian calculations and act on them because of our logic.” even scott alexander called this one out.

i went back and read every post in main from 2007-2011. you can see it getting stranger and stranger from 2009 on, as people took in and extrapolated the sequence memes. i would say that peak weird was the basilisk post in july 2010. this i think scared people and the weird seemed to noticeably scale back. they also started the regular meets, for slight interaction with reality.

i mean, i don’t claim a stack of achievements either, i don’t even have a degree, but at least i haven’t started a million-dollar-a-year charity to fund my fanfic and blogging.

I've also noticed how the phrase "interested in ideas" (or similar) seems to have come to mean "accepting various arbitrary meta-ethical constraints and our particular community shibboleths"...
Isn't that every community? Or do you find the hypocrisy more jarring when it's from purportedly rational people?
As a norm, sure. But I meant the specific phrase "interested in ideas" coming to mean something much more specific.
are you serious that the basilisk thing is what scared people straight? this is like watching kittens jump at their own shadow, but not cute
sir has correctly, etc.

Constantly seeing culture war threads wherein the most bizarro rightwing pseudoscience is given its 15 minutes but day-one leftwing stuff (like the link between poverty and crime) is treated as an impossible conjecture. I like the general idea they try to promote on SSC, taking ideas and challenging them, steelmanning etc, but there are so many actors acting in bad faith that it’s impossible to really see stuff get a fair shake. There seems to be a lot of bias blindness on that subreddit.

Jared Diamond is pretty terrible in his own way

What’s that about?

As I understand it, he way oversimplifies the role geography played in human history and flat-out misrepresents a lot of things. For example, in GGS, he talks about this one battle where Cortes and 40 Spaniards massacred like a thousand Aztec soldiers, because Cortes had horses and guns. This just isn't true. Cortes had help from other indigenous people. Part of the reason he was led astray is that he read Cortes's own accounts, and failed to make allowances for how Cortes would be naturally inclined to puff himself up and fail to give credit where it was due. The other big gaffe I recall is that he wrote this long article about how before the invention of states, pre-industrial people would get caught in endless cycles of blood feuds. He based this article on interviewing his indigenous friends in Papua New Guinea. The article came out in the New Yorker and his friends got really mad at their depiction and sued him. Turned out that men like to exaggerate their accomplishments, and the tribe was in reality less aggressive than Diamond thought. A real anthropologist would've taken male braggadocio into account before publicly declaring that so-and-so group of people were incredibly bloodthirsty. edit: link: https://www.forbes.com/2009/04/21/new-yorker-jared-diamond-business-media-new-yorker.html#7458a7ae6e71 Oh it's even worse than I thought. He didn't even use fake names in his New Yorker article. What an ass.
For further info, here's [a post from /r/badhistory about GG&S](https://www.reddit.com/r/badhistory/comments/6pm3he/whats_the_issue_with_jared_diamond/) that has various people more knowledgeable on these subjects weigh in on issues with the book.

[deleted]

When I realised that the average rationalist was a smart, pathetic loser.

They may be smart, but they are weak, and except for a weird quirk of society that only exists right now and will soon cease, they are failures.

The combination of the self-congratulatory (“we are better for being pathetic”) and the whining (“women should obey me and suck my dick whenever I want”) is pathetic beyond belief.

Such failure cannot be tolerated

Quotes above are not real quotes but my impressions of the average rationalist's attitudes.

The only position they deserve in any society is that of the slave.

By the way, I may be disgusted by them, but I still believe race and IQ are linked, that two-parent families are best, that sexual liberation is only moral when birth control and STD treatments are available, that retirement is an antisocial, evil concept, etc.

Reality belongs only to those who take it, and the typical rationalist could not take a sandwich out of a bag if their life depended upon it.

I...can't say that I agree with you, but this is so mean it's funny.
Yes, I understand that. I can be, as the SSCiers would say, very uncharitable. I think I might be an asshole. But, I am trying to change against my biases. Hence why I am reading sneerclub. My instinctive inclination is extremely right wing, and very close to outright fascism, with all its usual attendant beliefs. Tbh, at first take, things like neoreaction and traditional fascism sound utterly amazing and the ideas just feel so right, like lego bricks fitting into place. On the other hand left wing ideas, egalitarianism, and things like social justice feel wrong wrong wrong - my immediate reaction to them is absolute and utter visceral disgust. But as I said before, I cannot tolerate failure. Failure is worse than any disgust. Traditional, and neo, reactionary and fascist belief systems have failed. Every. Single. Time. So I am trying to alter my biases. To continue to make the same mistakes of the past is unforgivable, and utterly pathetic. Only a child has any excuse for such behaviour, as for them it is not the past but completely new. Hence why I am reading sneerclub. Sneerers also usually come across as a complete and utter bunch of cunts too though. But this can, sometimes, make it clear why something that feels natural and right to me is actually wrong.
I mean, I don't disagree with you that rationalists are really sheltered in ways that they don't even understand, and the resulting social incompetence is really cringeworthy, but I hate rationalists and I wouldn't go to quite this level of contempt. That's all. Might seem like splitting hairs, but this is an online discussion after all. > On the other hand left wing ideas, egalitarianism, and things like social justice feel wrong wrong wrong - my immediate reaction to them is absolute and utter visceral disgust. I have a book rec for you. It's called Hierarchy in the Forest, by Christopher Boehm. It's about how small-scale, egalitarian societies use social shaming and insane, incredible violence to maintain egalitarian social orders. I feel like you'd really like it. Maybe it would help with your disgust towards egalitarianism. You sound like you really hate weakness. I think you should read more about muscular leftist sentiments that are about bashing oppressors, and less about being inclusive of people's feelings.
Yeah, I probably am being a little too contemptuous. I take to extremist positions, and flip-flop between them, very easily. This presents itself in my opinions and judgements of, and attitudes towards, other people/groups. I may have been overly harsh on them, and maybe on you sneerers too. I will look into Hierarchy in the Forest; the goodreads blurb and your description sell it well. I detest weakness, I think, because if left unchecked, and encouraged, it will get us, and everything else, killed. Dead things have no values; a dead universe/world is meaningless. I see capitalist pollution to make pointless shiny baubles and most other pointless luxury consumption, encouraging transgenderism and egalitarianism, inclusivity, and the logically inconsistent masturbation of libertarians all as weakness that makes death come closer with every moment it is left to spread. Thanks for the book prompt.
No problem, I think you'll like it. It's very readable even though it's by an academic, and I'm pretty sure it'll appeal to your views of human nature. I do think you should go a bit easier on weakness, though. Human babies are born weak and the reason we have the cognitive abilities that we do is because we allow them to grow up very slowly, and we protect them during this long period of weakness. We got to where we are by being really good at cooperating, like bees and ants.
You might be right. I'm not sure what to believe anymore really. Everyone just looks like greedy selfish hedonistic parasites to me now, left wing and right. All I hear, on all issues from all sides, is blah blah blah I am too weak to get what I want in a fair way so I will [band together with other weak people and use implicit threats of social coercion, exclusion and the implied poverty and death that results/ use parasitic ownership enforced by leveraging the implicit threat of violence of the state and the implied poverty and death that results] to make other people enable my lifestyle so I can have all the pleasure and status I think I deserve. It disgusts and terrifies me. As for the weakness, babies being weak is entirely fine because they grow stronger as they mature. But if you do not grow stronger, do not learn from your mistakes, demand that others clean up after you, that others with less than you give you more, when you do not even deserve what you already have, because you think you should have it.... Encouraging that kind of existence is not self-sustaining. The key thing I think is to recognise that everyone is weak to some extent, but that you should strive to overcome it and improve, not use that weakness as the foundation of a group identity to band together and target and bring down those who tried to make themselves better in some way. Cooperating may make us strong, but enslaving others to work for you is not strength to me, it is weakness - you should try to do it yourself, not to do so is pathetic weakness - only a weak man needs slaves. And so is cooperating in a coalition of the weak to pull down the strong out of jealousy and spite. Pathetic - only a weak man needs to destroy those better than himself. Although I think it is the same type of person in both cases.
Leftist sentiment looks like people without capital trying to use gender and race to leverage the violence monopoly of the state in their favour instead of property rights (as is done by those on the right). It all looks like weakness covered up only by the overwhelming mass of normal people. ahh, I apologize for the rant.
I think you need to read more leftists. Leftists generally want an equal society, and quite a few of them do not recognize the state, and do not want to live in a society with the state. You should also realize that the people that leftists want to pull down have cooperated to hold down the marginalized. There's cooperation on both sides. There has never been any such thing as a single individual managing to dominate a whole race or gender on their own. Men cooperate with men to hold down women; whites cooperate with whites to hold down blacks. I don't even know what you think "weakness" is at this point. It looks an awful lot like simply being a human who wants things is a weakness to you. I think you perhaps shouldn't even bother reading more thoughts and opinions about politics at this point. You sound like you have trouble relating to normal humans period.