r/SneerClub archives

The RationalWiki article on LessWrong ends its opening paragraph with this quote:

> the community’s focused demographic and narrow interests have also produced an insular culture that is heavy with its own peculiar jargon and established ideas that often conflict with science and reality.

We all know rationalists love to play around with dubious ideas and slap some “science” on them, but what are some of the biggest misunderstandings you can think of? I think it’s important to point them out; gullible people can fall into their clutches and believe anything they say.

[deleted]

> Yudkowsky’s grasp of quantum mechanics is even worse than mine, and his musings on personal identity and simulation are cast-off science fiction.

Yudkowsky's attempts to explain quantum mechanics on his blog are so absurdly contrived that even with decent background knowledge in the framework of quantum mechanics it's almost impossible to make out what the hell he is actually trying to say. [Just look at his attempt to explain the quantum interpretation of the Mach-Zehnder interferometer experiment in this blog post](https://www.lesswrong.com/posts/5vZD32EynD9n94dhr/configurations-and-amplitude).
You should know by now I can’t resist reading this shit, how dare you do this to me
[deleted]
My eyes just glazed over when I got about four paragraphs into your link and realised he was just going to spend the rest of the piece drawing out his dumb conclusions from the most basic shit you could glean better and better written from Wikipedia - which is presumably all he has to go on - and it was infuriating to realise he was going to draw out all these physics and metaphysical conclusions from shit I picked up in *GCSE Physics*. Which, for the viewers at home who don’t know what a “GCSE” is, means I picked it up from classroom demonstration experiments of QM principles when I was 16 years old in fucking *high school*. God, even speaking as a writer, I fucking hate this guy.
those "layman" explanations of quantum mechanics are significantly more complicated than the tiny bit of matrix algebra required by the original problem. next time someone tells me scientists obfuscate on purpose, I'll send them this. the whole point of a formal mathematical system is that we don't need six pages of prose for every new physical system.
Also note that Yud never mentioned the word "phase" once in that blog post.
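
As a minimal sketch of that "tiny bit of matrix algebra" (not from the thread; it assumes the usual symmetric beam-splitter convention, and the helper name below is purely illustrative): a Mach-Zehnder interferometer is just two 50/50 beam splitters with a phase shift in one arm, each a 2x2 matrix acting on the two path amplitudes.

```python
# A 50/50 beam splitter and a phase shifter are each 2x2 unitaries acting on the
# (upper path, lower path) amplitude vector; the whole interferometer is their product.
import numpy as np

BS = np.array([[1, 1j],
               [1j, 1]]) / np.sqrt(2)          # symmetric 50/50 beam splitter

def detector_probs(phi):
    """Photon enters the upper port; returns (P at detector 1, P at detector 2)."""
    phase = np.diag([np.exp(1j * phi), 1])     # phase shift phi in the upper arm
    psi_out = BS @ phase @ BS @ np.array([1, 0], dtype=complex)
    return np.abs(psi_out) ** 2                # Born rule: |amplitude|^2

print(detector_probs(0.0))     # ~[0, 1]: everything exits one port (interference)
print(detector_probs(np.pi))   # ~[1, 0]: a pi phase shift flips which detector fires
```

The detection probabilities come out as sin²(φ/2) and cos²(φ/2), i.e. the whole experiment hangs on exactly the phase the post never mentions.
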
The main issue with gwern is that he takes psychologist frauds seriously: Shockley, Burt, Lynn, Rushton, Jensen, the whole lot. Given his brash propensity to consider entire fields like nutrition BS, and his acknowledgement of the replication crisis, you would expect him to be among the first to dismiss a dude who takes the averages of neighbouring countries as an "inference" of a country's "national IQ", but w/e
the main issue with gwern is that he is a dumbass without critical thinking, just like most rationalists. it's so puzzling to me how many ppl here think gwern is smart or something
it's because he has a very nice-looking website. hot people privilege but for webpages. hot page privilege or something
fonts > quants
A lot of confidence and a big vocabulary will convince a depressing number of people that you’re smart.
> Scott Siskind’s stuff about politics and political science/theory doesn’t realise it’s almost exclusively about his own personal insecurities

Isn’t he a psychiatrist? Shouldn’t he have a pretty good grasp on his own insecurities?
Psychiatrist. And there’s a canard about psychiatrists that they’re always their own worst enemy when it comes to psychiatric matters.
lmao noactuallyitspoptart still at it he knows quantum mechanics, statistics, politics, philosophy better than all of these people but he's here literally every day bitching about these people. why let that talent go to waste bro? man i'm having a real fucking laugh reading your profile history this is what you get up to??? hahahhaa
> he knows quantum mechanics, statistics, politics, philosophy better than all of these people

That's not a high bar to clear
It’s weird that this is even the rhetorical point in play: I explicitly said that I don’t know enough QM, but that Yudkowsky knows less than me. I’m explicitly being self-deprecating about the Yud. For everything else whatever, this is some weirdo who’s clearly been following me around trying to troll me, or something creepier
The reason I let that talent go to waste is that I’m a hedonist and recovering alcoholic, you can read my latest work in the next issue of YONQ magazine
i see that masters degree you got went to good use lmao... thousands of posts in reddits dedicated to laughing at other people failing at philosophy when you failed harder than all of them. jesus christ i wish these people knew who's behind this pathetic shit
Making an alt account to shit on me for things I am deliberately open about seems less noble
guy whose day job is trawling through blogs and twitters of people he doesn't like, to find things he and his echo chamber buddies can sneer at, is talking about nobility. rich buddy... i mean you know you're grasping at straws when you start whining about someone's layman quantum mechanics explanation because it's... too simplified? really? fucking lol i guess after so many thousand posts you run out of shit to whine about
I don’t have a day job, anyway you’re banned, make a new weird fucking alt if you want to stalk me more.
Lol the new alt got auto-spammed by reddit
I have a pretty good idea who’s behind the alt, but that’s still fucking funny

Human biodiversity. Pretty much anything about sex and gender. Those are the big ones.

Overestimation of the practicality of interplanetary (let alone interstellar) colonization, maybe? They seem to overlap with the space cadet crowd somewhat. Also a general knee-jerk technophilia.

what I find particularly funny is that they imagine that interplanetary colonization will be possible in a free market system. looking at history, you need *massive* amounts of state funding and planning to pull off any sort of space shit. it's never gonna turn a profit, even on a timeframe of decades or centuries. (Note: SpaceX is not a counterargument, they're mostly government funded and still way less effective than Nasa.)
> (Note: SpaceX is not a counterargument, they're mostly government funded and still way less effective than Nasa.)

There's probably a joke here about the contributions of Nazi rocket scientists to NASA and rationalist "race realist" computer touchers to SpaceX, but I have to admit I'm not clever enough to make it work.
What I find funny is that they imagine interplanetary colonization will be possible at all.
> looking at history, you need massive amounts of state funding and planning to pull off any sort of space shit. it's never gonna turn a profit, even on a timeframe of decades or centuries.

In centuries (quite possibly before the end of this one) we're likely to have automated factories that are self-replicating in the sense of being able to take in raw materials and energy and manufacture copies of every machine in the factory. Combine that with automated mining to extract raw materials from bodies in the solar system, and that would change the economic calculations by quite a lot (and possibly also usher in the end of capitalism).
we have that sort of factory now. it is called a "eukaryote" or a "bacterium", and it turns out that when you put them in space they tend to die immediately
From this fact do you generalize that *any* possible self-replicating system would break down in space, including a non-biological one composed primarily of robots and 3D printers more advanced than what we have today, and designed to be able to function in space? If not, how is this an argument against my point?
how are you going to get a massive physical plant requiring careful maintenance and long-term planning into space in the first place? (under capitalism) (if you're saying, "we can develop and get into space self-replicating mechanical systems under neoliberal capitalism", i'd say you're absolutely wrong. but under communism... maybe in another 3 or 4 centuries.)
Any self-replicating factories would probably be developed on Earth before they were put into space, and I think that would already be likely to lead to a breakdown of capitalism. Possibly it could also lead to some weird hybrid form where capitalists continued to make money off intellectual property rights while mostly outsourcing the actual production of goods to publicly owned automated factories--I would hope that even if such a hybrid form develops, it would be unstable in that workers would increasingly tend to favor worker-owned businesses, given that setting up such businesses would no longer require such large amounts of capital to buy expensive means of production. But even if one assumes capitalism doesn't break down, fully automated production would tend to make launch vehicles much cheaper, and it's also hard to say how far one could go in minimizing the mass of a self-replicating set of machines by say 2100 or 2200. So I wouldn't completely rule out such a factory being set up in space under capitalism, given the huge potential profits once it was able to start self-replicating and mining resources in increasingly large amounts, without any further investment from whoever sent the first one up there.

They will wax on and on about biases yet they—as a group as well as individually—seem particularly incapable of introspection. Feelings are sinful and despicable, except their own feelings, which are at worst righteous, secondary, and certainly always under their masterful control and never at risk of influencing their superior rationality.

 

They have a distinct taste for cheap evolutionary psychology and will latch onto every just-so story and bad metaphor it provides them. The more gender-conforming and misogynistic the story the better—unless they can use it to reinforce their polyamorous and very open-minded nice male stud perspective. If the story is rhetorically dangerous and too transparent they have a propensity to resort to equivocation as a means to restore respectability.

 

Mathy is good. Bedazzlingly mathy is unquestionable. Everyone sucks at statistics except themselves. If someone can’t math it’s risible. If they are caught incapable of mathing it’s because their napkin was too small and they were in a rush or on their way to some important function. They are at the center of a pandemic of multiple degenerative engineer’s and physicist’s diseases. Everything has a magical technological solution, we just haven’t found the right objective function to optimize yet.

 

Everything wet or metaphysical can be very smartly distilled into a stupid dualistic metaphor involving hardware and software.

 

Thought experiments always weigh more than empirical evidence because crusty old big brain physicists used them all the time or something. Hearing of anything remotely related to gender-nonconformity, feminism, historicity, or social justice will whip them up into a jubilant frenzy, at which point they will beg the question and collectively converge on a mediocre thought experiment that will serve forevermore as an irrefutable falsification of the aforementioned. They will then begin parroting the latter at the drop of a hat.

 

Bitcoin is big brain coin and therefore always good no matter what.

 

The cult of the solitary promethean genius runs deep.

 

IQ and racism and all that dung; they love debating metrics of happiness, utility, productivity, safety and whatever objective function they’re currently jerking themselves raw over, but any metric that, once again, reinforces their distaste for anything too brown and very scary and not suburbia and not North Shore Chicago is definitively very very empirical and truthy and never contentious indeed.

 

I’ll stop here because writing this made me sad.

*chef's kiss*.

Rationalists' biggest misunderstanding of science is thinking that every phenomenon known to mankind can be explained by stuff that is taught in the first week of a statistics class.

https://xkcd.com/793/
I think there's something pretty deep about this observation. Someone will come along and articulate it better than me (or already has, here or on this sub), but I think that's pretty typical of the movement: they come upon issues/schools of thought, dismiss the experts or the consensus, and attempt to rebuild it with some mishmash of 'noting' biases, basic stats, pop-ego-psych, and a strong and narrow version of utilitarianism, all from their armchairs. I think Dominic Cummings is a great example. Yes, he helped orchestrate Brexit, credit for that, but with the virus he, as I understand it, pushed the herd-immunity idea in the face of strong pushback from the relevant experts, and we know how that went. That is their approach, writ large: dismiss the experts as cloistered, small-minded, and narrowly focused thinkers who can't see the big picture, and write a blog post about their 'insights', which, as people have already said here, are either overly complex and inarticulate versions of mundane insights already known to many fields, or downright ignorant of how their views have already been roundly criticized and the flaws enumerated. Scale that up to their concerns about political science, philosophy, and the organization of society, and I think it's possible to see just how blind they are to actual material constraints on their 'visions' (moldbug), but also their embrace of a variety of normative claims, like effective altruism (not that there isn't some merit to that thinking).

I bring this quote out every time this comes up, but it’s still just absolutely shockingly bad and emblematic of Yud’s blindspots:

> Riemann invented his geometries before Einstein had a use for them; the physics of our universe is not that complicated in an absolute sense. A Bayesian superintelligence, hooked up to a webcam, would invent General Relativity as a hypothesis—perhaps not the dominant hypothesis, compared to Newtonian mechanics, but still a hypothesis under direct consideration—by the time it had seen the third frame of a falling apple. It might guess it from the first frame, if it saw the statics of a bent blade of grass.

I did a whole post dissecting just how bad this quote is, and it’s actually a key part of his AI safety argument! He thinks that intelligence, on its own, gives you superpowers, without the need for evidence, or experimentation, or resources, which is a fundamentally anti-scientific point of view.

So what I'm getting is that Yud thinks intelligence works like [this](https://youtu.be/0JI9LmB1FZY)
The man *did* write HPMOR, after all.
I am just a simple man who has read Hume, but isn’t why this is wrong literally exactly Hume’s whole shtick?
I think this is one of rationalism's greatest mistakes, and you stated it well. Thousands of years of human thought demonstrate that the natural world resists explanation reasoned from first principles, but Yud and his crowd think that if you just had more compute cycles this problem would go away. I think anyone who researches or even works in any complex system should realize this is just wild optimism.

Maybe the central and fundamental flaws of the Rationalist community are the ways its adherents fail to practice their own methods.

Narrowly define Rationalism as an attempt to overcome human cognitive biases. Okay, then one of the worst families of cognitive biases involves ingroups vs. outgroups: racism, the -phobias, political tribalism, addressing the imaginary category instead of the real individual. Where are all the parables and exercises about those biases? Well, if anything that’s the project of the antiracism movement that’s gained so much momentum: raise awareness of and teach resistance to mental blocks like privilege, white fragility, systemic racism, the present-day legacies of slavery and colonialism, etc. Those are influences that cloud our judgment and affect all of us in varying degrees. (When I first came across the blog Overcoming Bias I thought that’s what it’s about because of the name. lol) But while Rationalists have elaborate games for overcoming the availability heuristic or the framing effect, on these issues they just throw in with the Gamergate lost-causers and blame it all on SJWs instead of System 1. They even bring tribalism into places where it wasn’t already, like inventing “conflict theorists vs. mistake theorists” to turn two situational behaviors into two irreconcilable categories of people.

In a sense Rationalism is almost as much a self-help exercise as an intellectual discourse. Yet instead of praising each other’s mastery of bias-denial, they jerk each other off about innate ability. A community of rationality-practicers fetishizing IQ is like a community of bodybuilders fetishizing height: yeah, maybe it gives you more leverage on the weights, and it’s another factor in an imposing or attractive physical presence, but it’s not something you worked for or can help others attain for themselves. They have a philosophy built on the growth mindset and a community focused on a fixed mindset.

And then there’s the way they selectively choke themselves off from the fresh air of outside ideas. Scott Siskind might have the most extreme examples. On one hand, he “steelmans” at great length and in detail the neoreactionary philosophy, which is the political equivalent of flat-earthism, discussed only by weirdo assholes nobody in the mainstream has heard of (partly because they don’t want their real names used). On the other hand, he can spill thousands of words complaining that “you are still crying wolf” without even specifying who “you” is. That’s his worst essay not just because of the conclusion (Trump isn’t a racist and you’re a bad person for saying so), but because of how he gets there: by not citing or responding to any specific arguments from any specific people, by ignoring the most obvious and discussed-to-death counterexamples (e.g. the housing discrimination case, the Central Park Five, birtherism), by playing down other counterexamples with the dumbest possible context-free literalism that couldn’t possibly stand up to the slightest dialogue or editing, by pointing out that previous candidates like Clinton and McCain used similar language as if everyone everywhere agrees they’re not racist, by using David Duke as a reference point without even taking ten seconds for an internet search that would show Duke already proved he’s totally not a racist in exactly the same way Siskind is proving Trump is totally not a racist. It’s an extraordinary work of intellectual introversion, written as if he not just failed to anticipate counterarguments but actively covered his ears to avoid hearing the ones that were already ubiquitous. Fringe nutjobs with no influence get the steelman treatment, while the most talked-about issues involving the most powerful man in the world for four years are only worthy of shouting angrily at some very naughty strawmen. I’m not sure someone could have written such a weak argument without first being convinced of their rationality, beyond any self-doubt, by Rationalism.

I'm a leftist Rationalist and I agree: for whatever reason they use the same tools I use and come to completely status-quo-obsessed conclusions, when in reality we know the status quo is the EXACT OPPOSITE of where we need to be as a society. Most systems humans have figured out do not work in the long term and currently cause more suffering than theoretical new systems, yet many in the Rationalist community are pushing dumb conservative-centric policy ideas.

They disregard many fields of study as too ambiguous to be taken seriously, yet they can’t stop talking about those subjects, so it becomes an excuse to only listen to rationalist blog posts talking about those things, but in different lingo. Stuff like:

“Experts disagree on ethical theories and its way too broad a subject to be a true expert on, but obviously consequentialism is correct”

“Psychology papers don’t replicate, but here’s a theory about what motivates people to do X”

“This type of analysis is just motivated speculation. Anyways, here is a fable of a merchant trying to ford the Nile as a metaphor of my feelings on the matter”

So they often use the excuse of having high standards for evidence and rigor to …. lower the standards for evidence and rigor.

Nailed it.

i’m trying to think of one thing they’ve gotten right. they’re wrong about ai - just ask any reputable ai researcher you can find. they’re wrong about women - just ask any woman you can find. they’re wrong about epidemiology- just ask everyone who has ever laid eyes on dominic cummings. they’re wrong about bayes - just ask any reputable mathematician. they’re wrong about logic - just ask any reputable logician. they’re wrong about socialism - just ask any reputable socialist. they’re wrong about history - just ask any high schooler who’s read a history book not written by a white supremacist. they’re wrong about race - just ask any scholar of race. i literally can’t think of anything they’re right about. this is why you should never argue with them and just sneer and make fun of them.

> just ask any high schooler who's read a history book not written by a white supremacist.

This is exceedingly rare in the US.
Yup. "Lies My Teacher Told Me" is a fantastic book to read on this topic.
wincing grimacing meme man dot png
more like MoreWrong, amarite?
Their takes on transness are, on occasion and as of late, not *entirely* terrible? As a trans person, I actually kind of liked SSC's blue whale and hairdryer metaphors. They're imperfect, but together they kind of approach the idea that gender and disability are both social constructs and you just want to scream "yes!!! now apply these ideas to race, and also patriarchy!" but they never do.

I wouldn’t be able to speak to their perspective on hard science, but I think they chronically underestimate the degree to which “human nature” (and so on) is a metastable hyperobject and not a solely intrinsic property of a human being.

Or in less jargon: even when they’re not blatantly misusing statistics in order to prop up scientific racism, they still love to make sweeping proclamations about human nature and culture as if they’re measuring the properties of a chemical, when in reality they’re just speculating on a shifting system that their own speculation will also play a part in influencing.

Their biggest and most foundational misunderstanding: they conduct science like it’s a philosophy, and philosophy like it’s a science.

Did you mean hard science?
But those two are the same thing
only in a pedantic, "gotcha" way. In modern society science and philosophy are conducted in vastly different ways, and they do things the wrong way around. You can't "deduce" general relativity by thinking hard about a blade of grass bending.
Chill, it was a one-liner

The Bayes thing stands out to me. Here’s a fun game: ask a rationalist why they haven’t graduated to using Kalman updating yet.
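
For reference, here's a minimal sketch of what that would even mean (not from the thread; scalar case only, with made-up numbers): a Kalman measurement update is just a Bayesian update with a Gaussian prior and a Gaussian likelihood, carried out in closed form with the uncertainty tracked alongside the estimate.

```python
# One scalar Kalman measurement update: combine a Gaussian prior with a Gaussian
# likelihood; the posterior mean is a precision-weighted average of the two.

def kalman_update(prior_mean, prior_var, measurement, meas_var):
    gain = prior_var / (prior_var + meas_var)      # how much to trust the new data
    post_mean = prior_mean + gain * (measurement - prior_mean)
    post_var = (1 - gain) * prior_var
    return post_mean, post_var

# Prior belief: value ~ N(0, 4); noisy observation of 2 with noise variance 1.
print(kalman_update(0.0, 4.0, 2.0, 1.0))   # (1.6, 0.8): pulled strongly toward the data
```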

The entire community is chock-full of “scientific racism”. I had Richard Lynn papers quoted at me as though they were real sources.

[deleted]

This is a good point that reflects others in this thread: a huge amount of the rules for “rationality” they espouse is cribbed pop-psychology, which then leads into the uncritical advocacy of stuff like HBD, because they manage to convince themselves and each other that they know more than they do. In particular I would point to my own comment in this thread (as well as elsewhere) that it’s just a cult. Institutionally, the foundations of “rationalism” specifically disbar its adherents from noticing they’ve embraced a very specific worldview which should otherwise be worth questioning, because their ways of thinking are *rational* as opposed to *irrational*. The pop-psychology is really the basic ground on which the cult is built; it motivates and structures the way its members think about themselves. Just look at the way they at least *say* they’re trying to piece together a mirror of reality from this handful of anecdotes that the cult leaders are at least smart enough to hand down.
Now that I think about it, there's a certain similarity with some Very Serious People in that, just like said VSPs, they've convinced themselves that their ideology is the lack of ideology.
> their ideology is the lack of ideology

Spot-on. They always talk about how 'politics is the mind killer' or claim to be 'radical centrists' when to anyone with a basic understanding of politics it's beyond obvious they are right-wing conservatives.
Quite
Agreed. Rationalists presume their own rationality way too much.
This is the most succinct and accurate way to answer OP.
Imagine basing the entire western world's approach to law enforcement on the crackpot theories of a handful of right-wing grifters.

ATC (formerly SSC):
1. Unjustified certainty that the brain implements Bayesian statistics in processing sensory inputs (e.g. vision).
2. Overinflating the importance of certain GWAS results (though as a group ATCers are hardly alone here).

There is no kind of smart that makes you uniformly good at everything. Rationalists place far too much weight on general intelligence as opposed to domain specific knowledge.

Too much weight on general intelligence, and not enough on specific knowledge and skills.

Aumann’s theorem doesn’t work in the simplified form “rationalists should agree”.

Bayes doesn’t work as a sole epistemology. It can’t generate hypotheses, or backtrack. Rationalists make very little use of it in practice.
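
As a minimal sketch of the “can’t generate hypotheses” point (the toy coin example below is purely illustrative): Bayesian conditioning only redistributes probability among hypotheses that were written down in advance.

```python
# Bayesian updating over an explicit hypothesis set: it can only reweight
# the hypotheses already present in the prior.

def bayes_update(prior, likelihood, datum):
    """prior: {hypothesis: probability}; likelihood(datum, h) -> P(datum | h)."""
    unnorm = {h: p * likelihood(datum, h) for h, p in prior.items()}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Two candidate coin biases; a hypothesis not listed here (say, "the coin
# alternates deterministically") can never appear in the posterior.
prior = {0.3: 0.5, 0.7: 0.5}
likelihood = lambda heads, bias: bias if heads else 1 - bias
print(bayes_update(prior, likelihood, True))   # {0.3: 0.3, 0.7: 0.7}
```

Getting a genuinely new hypothesis into the set is the step this machinery has nothing to say about.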

SI (Solomonoff induction) doesn’t work as a generator of realistic theories/truth.

There’s nothing about AI that allows you to transcend ordinary computational complexity. Yet rationalists keep obsessing about uncomputable things like AIXI, which are of no relevance to AI.

Utilitarianism works much better as a theory of charity or voluntary donation than as a general theory of morality.

Yudkowsky’s writings on QM are confused about what MWI and Copenhagen even are.

MWI is not obviously simpler, particularly the decoherence-based form.

could you please expand on the MWI thing? I have the impression that Yud got MWI mixed up with Modal Realism. Is this true?
There's a couple of things going on. In [Against Modal Logics](https://www.greaterwrong.com/posts/vzLrQaGPa9DNCpuZz/against-modal-logics) he rails against a thing he calls "modal logic". "Modal logic" is the name philosophers have for a formal language for talking about possible worlds that doesn't involve any commitment to the realism of non-actual worlds. The thing he objects to content-wise is the thing called "modal realism", the claim that possible worlds exist, somehow, somewhere. The further confusion is that this is essentially the same claim as the many worlds interpretation of quantum mechanics, which he, of course, strongly *supports*.

it's perfectly logical that people who misunderstand AI must also misunderstand statistics.

Use of neuroscientific evidence (often, the abstract of an fMRI paper) to “prove” some other point. As if this magic sciency method automatically and majestically reveals exactly what the brain is doing.

Goes double if they invoke St Friston

everything that is justified with “evolutionary psychology”

[deleted]

> mathematics (and its sub-fields Physics, Engineering, Chemistry, etc.)

Meseemeth you talk out your ass just as much as rationalists do.
My man just deleted his account after posting this. RIP.
Nah, the account is still there, just the comment was deleted. But now I feel bad I didn't quote the rest of the dumb shit in the comment :(