This is not a bad comment, all things considered. The quoted article seems broadly reasonable, and aside from the questionable scare quotes the framing seems like a nice relatively neutral discussion starter. It’s also, I think, the comment that has finally snuffed my hope that there’s a germ of a reasonable worldview underneath the weird pseudoscientific racism and gender essentialism.
Chalk it up to “inferential distance”, I suppose, but I have trouble imagining anyone finding this insightful or novel, regardless of their political leanings. It’s not even PoliSci 101. Separation of Powers as a check against tyranny was an idea I picked up from a junior high civics lesson. We literally teach this to children.
There’s an old joke where I’m from about people who like both kinds of music: country and western. I get the impression that SSC appreciates both fields of human knowledge: mathematics and the hard sciences. And even then only up to about a sophomore level.
I want to like the rationalist project, I really do. Learning to update my beliefs when the real world disagrees? Sounds good. Recognizing and working around (some of) my cognitive biases? Sign me up. Setting aside rhetorical skill and focusing on the underlying facts? Always a useful exercise. I’ll even climb on board the consequentialism train. But the more I actually interact with the community, the more I notice that the gulf between theory and practice is just staggering.
I can produce a cogent libertarian argument if called upon. Classical liberal too. Hell, at this point I could probably manage a passable impression of a white supremacist position. I may disagree (often vehemently) with the premises, but I can follow arguments made from those premises and largely guess where people who do accept a set of premises will end up on any given issue. The apparent fact that no one there can do the same for my fairly milquetoast progressivism is telling. The fact that no one sees this as a giant red flag about which direction the inferential gap might flow doubly so.
It’s frustrating that nobody makes the connection between “epistemic humility” and privilege.
It’s frustrating how few people seem willing to subject their stereotypes to the same level of critical analysis they apply to other well-understood cognitive biases.
It’s frustrating that nobody seems willing to wrestle with the sheer variety of previous “race realist” arguments which were both superficially convincing and demonstrably incorrect.
It’s frustrating that nobody makes the connection between misleading emotionally resonant anecdotal arguments and holding up James Damore as a martyr at every available opportunity.
It’s just frustrating, start to finish. So WTF went wrong?
[deleted]
I can’t stop thinking about a question I saw on History Stack Exchange recently: “Why was the Cold War carried out over the whole world instead of between Siberia and Alaska?” Everyone thinks the USA and USSR are far away from each other, but upon careful study, it turns out that they’re actually right next to each other. Why wouldn’t they just launch nukes at each other over their nearest border?
It’s ‘logical’ for someone to ask this after looking at a globe, yet the question is filled with incorrect assumptions, making it so nonsensical it can’t be answered in the questioner’s terms. At best you can put some work into explaining how it doesn’t work the way they think it does, but you can’t address their original line of inquiry, because they are still staring at that globe and seeing Siberia nearly touching Alaska. And they’re not even wrong.
This is what the rationalists are constantly doing. They find data from one axis that seems to be indisputably correct but they don’t have the perspective or experience to avoid their unacknowledged assumptions when drawing a conclusion. When confronted with an existing consensus idea, they place the burden of proof on the current consensus to justify itself in accordance with their newly evidence-backed assumptions about the way things should work; if they haven’t heard of it, it must be reevaluated from scratch. They are prepared to adjust their priors if their data is wrong and think that’s sufficient humility, but this data is irrefutable and got them a conclusion that registered properly with their thought processes, so it’s already past that stage. They can’t accept that the data can be true but produce a false conclusion.
[deleted]
This is a great post, I applaud you for writing it.
That said, my answer to the questions you raise puts me in a radically different position than you.
“WTF went wrong” for me, is, well, actually “WTF ever was right?”
Yes, I agree with you, the complete inability of Rationalists to behave according to their own self-proclaimed principles is gob-smackingly stunning. And it’s frustrating in the same way that hypocrisy everywhere is frustrating, especially when the hypocrites are those who just love correcting the behavior of those around them (and no one loves correcting the Wrongs more than a LessWrong.) Your post captures several of the most glaring of these quite perfectly, but we both know you could’ve Alexandered that list of frustrations into another 10000 words or so.
But consider an alternative question: What if the very foundational principles of rationalism are just bad principles?
It would sound good, if anyone, anywhere, could agree on what constitutes the real world.
This one continues to confound me. It’s easy to recognize biases in others. But in myself? Surely any bias worth its salt would prevent me from recognizing itself?
Maybe this one’s just personal but I consider rhetorical skill a supreme virtue. (And as previously alluded, I don’t really believe in facts.)
I just can’t. Even if I believed in an objective material reality, I surely don’t believe in my ability to do a cost benefit analysis of an infinite number of actions or non-actions on that reality.
Here’s where I’m coming from. I’m no solipsist – I believe that there is some world outside myself, mainly because I keep hearing these voices coming from it. And I already have a voice, so they’re probably not mine.
But these voices are contradictory, and often incorrect, and even frequently incoherent and dumb. Which leads me to believe that my voice probably sounds incoherent and dumb to some of them, as horrifying as that sounds.
So how do we adjudicate between the voices? How can we determine, finally, which statements are true? We can’t. And for the most part, even attempting to try is a fool’s errand, as likely to drive us mad or cause someone harm as it is to help anyone. The quest for objective knowledge, properly understood, is a barely-concealed quest for power among and over the voices.
All we can do is try to hear the voices and help the voices. The voices seem to matter, at least to themselves. Whatever they are right or wrong about, it doesn’t matter. Let it go. Being correct is unimportant.
I guess what I’m saying is that sometimes the best thing you can do for the universe is to… listen and believe.
The closed ecosystem of Less Wrong was vulnerable to a virulent, highly evolved memeplex.
To be honest the real problem with the Rationalosphere, in my view, is that they’re just not very good rationalists. I still think the ideas of rationalism as self-improvement, overcoming bias, etc, are worthwhile and interesting. But Rationalism with a capital R as an internet commentariat subculture is pretty remote from that. It’s frustrating and a real shame, but the sooner you recognize that, the better.
To be fair, I would wager a lot of the people posting are probably aged 15-25. The average citizen functionally stops learning around late high school (including college grads).
This is the killer, imo.