This is a pattern I have noticed about rationalists in general: they take certain scientific concepts, mix them with unproven assumptions, and follow everything to its most terrifying conclusion. I mean, between “the many-worlds interpretation is true” and “there are agents in other branches making copies of you, and you are probably a copy, and you are going to simulated hell,” there’s a lot of thinking going on. Why do you think this is the case? Let’s do some good old armchair psychology here.
I don’t have what we would call a “good take” on this but uh
In a conversation the other day, I couldn’t help but remark on how a self-proclaimed rationalist community had invented (a) God, (b) prayer, (c) Satan, and most impressively (d) indulgences, all from first principles.
(Sorry, I meant “Benevolent AI,” “Acausal bargaining,” “Roko’s basilisk,” and “give some money to MIRI,” respectively. Sometimes my autocorrect acts up)
the answer lies in their hatred for modern art. we shouldn’t take this just as an aesthetic revulsion. it shows a total loathing and terror for every element of the modern world, with its complexities like “women that don’t want to be talked down to” or “workers who want a fair wage” or “black people who don’t want to be killed by the police”. they could no more imagine a world without fear than they could imagine that star wars is actually extremely popular and liking it doesn’t make them special
Scaring people is a way to get people to take you seriously and give you money and attention? 🤷‍♂️
I call it fright-selling.
it’s been scary campfire stories for amateur philosophers for a while
Making fun of this phenomenon goes back to Jonathan Swift:

> Most of them, and especially those who deal in the astronomical part, have great faith in judicial astrology, although they are ashamed to own it publickly. But what I chiefly admired, and thought altogether unaccountable, was the strong disposition I observed in them, towards news and politicks, perpetually inquiring into publick affairs, giving their judgments in matters of state, and passionately disputing every inch of a party opinion. I have indeed observed the same disposition, among most of the mathematicians I have known in Europe, although I could never discover the least analogy between the two sciences; unless those people suppose, that because the smallest circle has as many degrees as the largest, therefore the regulation and management of the world, require no more abilities, than the handling and turning of a globe: but I rather take this quality to spring from a very common infirmity of human nature, inclining us to be most curious and conceited, in matters where we have least concern, and for which we are least adapted by study or nature.
My impression is that one of the pillars of rationalism is to think “logically” through everything and reason out conclusions even (or more accurately, especially) when they are counter-intuitive. I think it can be hard to demonstrate that you are doing this because few real problems are simple enough to see the logical solution outside of thought experiments and other contrived scenarios. The thought leaders do plenty of the latter, so how can the average user demonstrate how rational they are? One way is to express your opinion, but it’s an insular community so plenty of others are doing the same thing on similar topics since that’s usually what drew people in in the first place.
Enter your scenarios. Someone thinks up a scary conclusion based on the shared group assumptions. Expressing that you find these scenarios worrisome or frightening demonstrates that you have internalized the lessons and are applying rationality. It is essentially a declaration of faith. I think this works because most of these scenarios are only scary to followers. If you told somebody on the street that they should worry about a far-future simulation of themselves as much as they do about their own person, they’d tell you that was stupid and move on. I think everybody’s gut tells them that this doesn’t make sense, so it demonstrates adherence to the tenets to ignore your gut and say, “yes, even though I don’t feel like this is true, I think it is true; therefore it concerns me.”
Also,
is an accurate general description of the singularity/AI as X-threat sphere IMO
Oh, you want armchair psychology? As someone who’s been rat-adjacent, I’ve been preparing for this my whole life.
afaict rationalists systematize a lot (to use the rationalist terminology for it), which means they generally enjoy and do best with things that have simple rules and complex properties, which they can safely explore without interfacing with stuff outside that system. Think math, agents, and quantum mechanics* (“the many-worlds interpretation is true” is ostensibly an application of Occam’s razor). From these things, rationalists then extrapolate really hard, and if you extrapolate really hard, things inevitably become scary.
The things rationalists fear resemble the rationalists themselves. The rationalist extrapolates really hard to get radical conclusions and, in turn, fears an AGI that optimizes really hard to get terrifying results. People often invent gods in their own image. The rationalists simply did the same thing and invented a god in the rationalist image.
But what do I know? ^The above is just assuming some basic properties about rationalists and drawing arbitrarily detailed conclusions about them. Reality isn’t simple rules with complex properties; it’s complex rules with complex properties. I’m gonna stop talking now before I get shoved in a locker.
*Kind of. Physics is actually a mess, but it’s not as much of a mess as lots of other things in reality.
I can relate to this impulse, although it is only particularly prominent when I am feeling isolated and alone.
The user Mydradek posted a great excerpt here that highlights the sorts of things that I get worked up about. I am actually really embarrassed by the things I obsess over, including many of the discussions that rationalists gravitate towards. I don’t think much of it is profound or fulfilling; rather, it’s an impediment to thinking about more important things. A part of me thinks it is just a form of escapism from actual existential fears, like worrying about climate change and feeling helpless to stop it. The brain ruminates on the dread long enough that it is no longer immediately aware of the source of the anxiety, and instead just focuses on solving any problem it can to get a release. This is the only way I can explain my tendency to indulge in this thinking, because such thinking only emerges when I don’t have meaningful connections or feelings of security, and I become lost in that anxiety and dissociate. Everything feels like a dream and I project my ruminations onto the real world.
I have a weak background in philosophy and logical reasoning, but I do see similarities in how some creationists and rationalists find ways to utilize probabilities to come to confusing conclusions about reality. I wish I had a better grasp on how to analyze and deconstruct the arguments.
Edit: I’m not even trying to hate on anyone. I am just venting.
The basic reason seems more or less to be that so much media - from headlines to blogs to Twitter - is driven by paranoia, because paranoia sells, and there isn’t much that people in the rationalist cult know that the average reader doesn’t.
Reading a lot of Wikipedia is a poor substitute for knowing anything, in spite of the pretences of rationalism
Blame the Ur-Rationalist: H.P. Lovecraft.
E: no I have no idea.
I don’t have a fully thought out answer here, but I suspect that being super self absorbed is a huge portion of it.
Because they’re cowards who are driven by fear, and one of their greatest fears is being rightly recognized for how dimwitted and mediocre they are.
If people that they think are smarter than themselves are saying how scary AI is/can be, well then they’re going to think themselves into being scared about it, too.
I don’t think the fear and cowardice are unique to rationalists; theirs is just another brand of the conservative white suburbanite fear that is endemic to much of the US.
I think accepting LessWrong and SSC texts requires a lot of sloppiness in one’s critical thinking. When one cannot filter good ideas from bad ones, but wants to be a profound meta-philosopher, their stream of thought can lead to interesting places.
I know the “wholesome” award doesn’t make any sense but it was the free award that I had, and I need to congratulate you for the hilarious title and the great points in the following discussion