We always talk about the ridiculous beliefs that give shape to the whole LW ideology. But, in your opinion, what's the most ridiculous one? Is it the AI gods? The simulation stuff? The almost religious faith in the MWI and the multiverse? Or something else?
honestly for me it's just the idea that Eliezer Yudkowsky is a once-in-a-generation Great Mind who is going to not just change the world but almost unilaterally save it. like the way he writes about himself reminds me of myself when i was in middle school, "oh woe is me i'm so smart, with great power comes great responsibility", except he actually has a whole community that enables him. i just cannot comprehend how you can look at yud's work, see him talking like a typical middle school "gifted child", and believe him, especially when most of his audience are themselves probably middle school-esque "gifted children" and believing him kinda involves placing yourself below him.
related, that whole thing where he set out to prove that a godlike ai with superhuman intelligence could convince a person to release it into the wild through its incredibly superior intellect, by roleplaying as the ai himself. like that takes some fuckin cojones
How about the fact that LessWrong's "About" page unironically recommends a Harry Potter fanfiction as a good entry point into their philosophy, placing it on equal footing with "the Sequences"?
The most ridiculous(ly fascinating) part of their ideology is that it seems like the natural progression of religion (i.e., gods are replaced with robots and AI).
I think technologically culty/religious groups like LW are inevitable and as time goes on we’ll encounter more and more of them.
Obviously it's also ironic as fuck considering they're all hardcore atheists.
Word count.
It’s the simulation stuff. It’s definitely the simulation stuff.
Being essentially a non-theistic religion in denial, while lacking what religions normally preach, like difficult social values.
"On a sociological level, perhaps the most important function of a healthy belief system is to reinforce precisely the difficult social values, those that don't quite come naturally. Religions don't have to urge men to look at pretty girls, or to eat chocolate. But we do find medieval Christianity urging barbarians turned kings not to murder, to put away their concubines, to respect the life of contemplation. We find Hinduism counseling the poor peasant to worship his cow: to most Westerners, irrational advice, but in fact eminently sensible; if during bad times the peasant gave in to the urge to sacrifice his only animal, he is ruined." - Mark Rosenfelder
The rationalists want to build a god aligned with their pre-existing values, one that enforces optimality without demanding anything from them. Even from a secular sociological standpoint, Rationalism makes no social demands of its followers: no command to honor thy parents, to practice brahmacharya (celibacy) and ahimsa (non-violence), etc.
Bay Area group houses. Not a belief, but the scariest & ickiest thing I’ve ever learned about.
That they think they've internalized Bayes' theorem so well that they're essentially Bayes calculators, continuously making predictions and vocally updating their priors.
Of course, the invocation of Bayes, for them, just serves as justification for holding on to and defending their pre-existing beliefs.
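For anyone unfamiliar with what all this "updating your priors" talk refers to, a single Bayes'-theorem update is just one line of arithmetic. A minimal Python sketch (the numbers here are invented purely for illustration, not taken from anything in the thread):

```python
def bayes_update(prior, likelihood, evidence_prob):
    """Posterior via Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence_prob

# Made-up example: prior P(H) = 0.30, likelihood P(E|H) = 0.80,
# overall evidence probability P(E) = 0.50.
posterior = bayes_update(0.30, 0.80, 0.50)
print(posterior)  # posterior is approximately 0.48
```

The joke, of course, is that doing this honestly requires numbers you rarely have and a willingness to let the posterior move, which is exactly what the next comment is getting at.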
It’s the techno-optimism and the liberalism/libertarianism.
This: https://mobile.twitter.com/vgr/status/1172166598330740736
The MWI faith isn’t uniquely absurd, in the way that reinventing “Sinners in the Hands of an Angry God” like Deep Thought deducing rice pudding and income tax is quintessential for Extremely Online Rationalists. You can find the same lazy habits elsewhere, too: not learning different mathematical formulations of the theory that might make different interpretations seem intuitive, relying on third-hand gossip and caricatures of the early quantum physicists instead of what they actually wrote, not paying attention to the varieties your favorite interpretation comes in and how the advocates for it have disagreed with each other, etc. It’s bad, but it’s unremarkably bad. It only becomes ridiculous when the ego comes into play, and they argue that science is broken because it’s clearly not rAtIonAL like they are, when the best they’ve done is reproduce the same old arguments in their most broken form.
The most ridiculous part is that I bother paying attention to any of it!
Humans are deeply irrational in a way we can never fully avoid no matter how hard we try, especially regarding emotional and identity-laden issues like politics. Also, an unfettered free market of ideas will always lead to the best outcomes.
Too many to pick from but no one’s mentioned the “1 and 0 are not probabilities” thing.
“If you don’t want to fund cryonics with all your capacity you may as well commit suicide, since that’s what you are doing the slow way.”
Their atheism.
The whole thing about how science is dangerous and must be kept away from the unwashed masses, because the only people wise enough to use any given scientific principle safely are those who rediscover it independently. This is especially stupid because most of humanity’s greatest discoveries are the result of people doing literally the opposite of that. Like, as brilliant as Albert Einstein was, does anyone really think he’d have gotten as far as relativity if he’d had to spend several decades reinventing the wheel because Newton and Maxwell had decided to keep all their work secret?
Will Wilkinson nailed it: they prize a certain kind of contrarian thinking while being only nominally Bayesian, which means they never actually update their priors.