r/SneerClub archives
Why rationalists always end up scaring the shit out of themselves? (https://www.reddit.com/r/SneerClub/comments/lnnn5f/why_rationalists_always_end_up_scaring_the_shit/)

This is a pattern I have noticed about rationalists in general: they take certain scientific concepts and mix them with unproven assumptions to take everything to its most terrifying conclusion. I mean, between “many worlds interpretation is true” and “there are agents in other branches making copies of you and you probably are a copy and you are going to simulated hell” there’s a lot of thinking going on. Why do you think this is the case? Let’s do some good old armchair psychology here.

I don’t have what we would call a “good take” on this but uh

In a conversation the other day, I couldn’t help but remark on how a self-proclaimed rationalist community had invented (a) God, (b) prayer, (c) Satan, and most impressively (d) indulgences, all from first principles.

(Sorry, I meant “Benevolent AI,” “Acausal bargaining,” “Roko’s basilisk,” and “give some money to MIRI,” respectively. Sometimes my autocorrect acts up)

"Pascal's Wager, rephrased in a mere 30,000 words"
I apologize for such a brief letter; I didn’t have time to write a longer one.
Could probably work something in there about being reborn as a simulation. I'm not sure the Basilisk is Satan, seems more like a Demiurge to me.
Sure, we could get technical, I minored in religious studies, but the Basilisk clearly fits into the Satan role in the Rationalist AI theology's duplication of essentially Protestant norms. Part of what's hilarious about it is that if the Rationalists knew enough about religion to understand what a demiurge is, they might have deviated even a little from reinventing Christianity.
I'm actually curious whether it more closely matches Protestantism or Catholicism...indulgences are more of a Catholic thing - Luther's "95 Theses" were titled "Disputation on the Power and Efficacy of Indulgences". I was going by the conception of Satan as the "The Great Deceiver" which is influenced by Milton's (Puritan) portrayal, which I don't think matches the Basilisk's role, but ultimately I'm not familiar enough with theology to have an informed opinion.
The indulgences are definitely Catholic, you're right, but Catholicism is a lot more complicated than the average Protestant denomination. There's just too much other stuff to fit into the AI-Basilisk model. I mean, the Catholics have a large and devout cult of Mary. I can't imagine Rationalists having that kind of widespread respect for a woman.
It matches Calvinist predestination afaik
I dunno, it's pretty explicit that the basilisk is the *good* AI, who is unfortunately constrained to send you to Hell for your sins, to maximise the positive outcomes of the universal wave function. Though also the demon torturing you in Hell.
'tis the duality of AI
Ran into a transhumanist on Twitter the other day who believed that Christian Dominionists are on track to invent a virtual hell for the eternal torture of simulations, à la Iain Banks' *Surface Detail*. Since this would represent a nigh-infinite amount of suffering, no countermeasure is too extreme for consideration. The transhumanist favored compulsory eugenics, which would hopefully eliminate the wanting-to-invent-virtual-hell gene before the Singularity came to pass.
> hopefully eliminate the wanting-to-invent-virtual-hell gene

lol...that's my least favorite gene! Finally something I agree with them on.
At that point, why not try to eliminate the being-bad-enough-in-some-sense-to-warrant-eugenics gene? That should solve the issue by definition, right? Just gotta figure out which one that is, but I can't imagine that could take long, surely
Uh, that's actually a good plot for a short story
~~The rapture~~ The discovery of General AI

On a more serious note, I am what we like to call a "godless heathen" (I'm told it's a technical term) and therefore would have to look up a Demiurge, but I will accept your correction gladly!
As a Christian and logician, I have long been tickled by this.
Very tangential: While it's not my field, I've always enjoyed talking to logicians & set theorists!
Don't forget the apocalypse/"singularity" and the afterlife/"cryonics"... They're so religious it's kind of hilarious.

the answer lies in their hatred for modern art. we shouldn’t take this just as an aesthetic revulsion. it shows a total loathing and terror for every element of the modern world, with its complexities like “women that don’t want to be talked down to” or “workers who want a fair wage” or “black people who don’t want to be killed by the police”. they could no more imagine a world without fear than they could imagine that star wars is actually extremely popular and liking it doesn’t make them special

They don't hate all aspects of the modern world; they're big fans of technology, for example. Fascism has been characterized as ["reactionary modernism"](https://en.m.wikipedia.org/wiki/Reactionary_modernism), and I think a similar category applies here, as has been theorized by [Richard Barbrook](http://www.imaginaryfutures.net/2007/04/17/cyber-communism-how-the-americans-are-superseding-capitalism-in-cyberspace/).
that's fair - they want the products of modernism but not the reality that produces them - a child's view of the world where ideology is as invisible as santa claus, bringing them flashy new web apps to rook the poor every day

Scaring people is a way to get people to take you seriously and give you money and attention? 🤷‍♂️

I call it fright-selling.

it’s been scary campfire stories for amateur philosophers for a while

Infohazard memes for terrified teens

Making fun of this phenomenon goes back to Jonathan Swift:

> Most of them, and especially those who deal in the astronomical part, have great faith in judicial astrology, although they are ashamed to own it publickly. But what I chiefly admired, and thought altogether unaccountable, was the strong disposition I observed in them, towards news and politicks, perpetually inquiring into publick affairs, giving their judgments in matters of state, and passionately disputing every inch of a party opinion. I have indeed observed the same disposition, among most of the mathematicians I have known in Europe, although I could never discover the least analogy between the two sciences; unless those people suppose, that because the smallest circle has as many degrees as the largest, therefore the regulation and management of the world, require no more abilities, than the handling and turning of a globe: but I rather take this quality to spring from a very common infirmity of human nature, inclining us to be most curious and conceited, in matters where we have least concern, and for which we are least adapted by study or nature.

> These people are under continual disquietudes, never enjoying a minute’s peace of mind; and their disturbances proceed from causes, which very little affect the rest of mortals. Their apprehensions arise from several changes they dread in the celestial bodies. For instance, that the earth, by the continual approaches of the sun towards it, must, in course of time, be absorbed, or swallowed up. That the face of the sun, will, by degrees, be encrusted with its own effluvia, and give no more light to the world. That the earth, very narrowly escaped a brush from the tail of the last comet, which would have infallibly reduced it to ashes; and that the next, which they have calculated for one and thirty years hence, will probably destroy us. For, if in its perihelion, it should approach within a certain degree of the sun, (as by their calculations they have reason to dread) it will receive a degree of heat ten thousand times more intense, than that of red hot glowing iron; and, in its absence from the sun, carry a blazing tail ten hundred thousand and fourteen miles long; through which if the earth should pass at the distance of one hundred thousand miles from the nucleus, or main body of the comet, it must in its passage be set on fire, and reduced to ashes. That the sun, daily spending its rays without any nutriment to supply them, will at last be wholly consumed and annihilated; which must be attended with the destruction of this earth, and of all the planets that receive their light from it.

> They are so perpetually alarmed with the apprehensions of these, and the like impending dangers, that they can neither sleep quietly in their beds, nor have any relish for the common pleasures and amusements of life. When they meet an acquaintance in the morning, the first question is about the sun’s health, how he looked at his setting and rising, and what hopes they have to avoid the stroke of the approaching comet. This conversation they are apt to run into with the same temper, that boys discover, in delighting to hear terrible stories of spirits and hobgoblins, which they greedily listen to, and dare not go to bed for fear.

My impression is that one of the pillars of rationalism is to think “logically” through everything and reason out conclusions even (or more accurately, especially) when they are counter-intuitive. I think it can be hard to demonstrate that you are doing this because few real problems are simple enough to see the logical solution outside of thought experiments and other contrived scenarios. The thought leaders do plenty of the latter, so how can the average user demonstrate how rational they are? One way is to express your opinion, but it’s an insular community so plenty of others are doing the same thing on similar topics since that’s usually what drew people in in the first place.

Enter your scenarios. Someone thinks up a scary conclusion based on the shared group assumptions. Expressing that you find these scenarios worrisome or frightening demonstrates that you have internalized the lessons and are applying rationality. It is essentially a declaration of faith. I think this works because most of these scenarios are only scary to followers. If you told somebody on the street that they should worry about a far-future simulation of themselves as much as they do about their own person, they'd tell you that was stupid and move on. I think everybody's gut tells them that this doesn't make sense, and so it demonstrates adherence to the tenets to ignore your gut and say "yes, even though I don't feel like this is true, I think it is true, therefore it concerns me."

Also,

> they take certain scientific concepts and mix them with unproven assumptions

is an accurate general description of the singularity/AI as X-threat sphere IMO

Oh, you want armchair psychology? As someone who’s been rat-adj, I’ve been preparing for this my whole life.

afaict rationalists systematize a lot (to use the rationalist terminology for it), which means they generally enjoy and do best with things that have simple rules and complex properties that they can safely explore without interfacing with stuff outside that system: math, agents, quantum mechanics* (“many worlds interpretation is true” is ostensibly an application of Occam’s Razor). From these things, rationalists then extrapolate really hard and, if you extrapolate really hard, things inevitably become scary.

The things the rationalists fear resemble the rationalists themselves. The rationalist extrapolates really hard to get radical conclusions and, in turn, fears an AGI that optimizes really hard to get terrifying results. People often invent gods in their own image. The rationalists simply did the same thing and invented a god in the rationalist image.

But what do I know? ^Above is just assuming some basic properties about rationalists and drawing arbitrarily detailed conclusions about them. Reality isn’t simple rules with complex properties; it’s complex rules with complex properties. I’m gonna stop talking now before I get shoved in a locker

*kind of; physics is actually a mess, but it’s not as much of a mess as lots of other things in reality

>The things the rationalists fear resemble the rationalists themselves. The rationalist extrapolates really hard to get radical conclusions and, in turn, fears an AGI that optimizes really hard to get terrifying results. People often invent gods in their own image. The rationalists simply did the same thing and invented a god in the rationalist image.

This reminded me of the comments /u/metachor wrote [in 2019](https://www.reddit.com/r/SneerClub/comments/c23b6e/weve_been_saying/eripups/):

>I have a theory that what is missing is empathy, and particularly cognitive empathy (as opposed to affective empathy). Like rationality, empathy is a higher-order cognitive skill that needs to be actively developed and practiced; but the basic techniques and skills to do so are not widely understood and practiced, at least in the community of people signified in this post.
>
>\[...\]
>
>The skill and practice of recognizing that another person has their own cognitions (instead of simply projecting your own on to them), and actively using perspective-taking in trying to understand them.
>
>Going even further, I would define empathy as fundamentally recognizing another person as being a human being (and accepting everything that logically entails, knowing you yourself to be human).

By the way, thank you /u/metachor – this was the first time I learned this distinction, and soon I realized that I had horrible cognitive empathy too. This awareness freed me from a lot of frustration caused by projecting my thinking onto other people.
You are welcome! I’m glad that distinction helped you make a connection with yourself regarding your feelings about your experiences with other people. Empathy is a superpower for getting along with other people, and for not treating them like a means to your ends, or like a black box that only exists to get in your way. Many times when I bring up empathy in more rationalist-aligned spaces, I get told that it’s actually a problem, because it prevents people from “seeing the greater good”! Someone in the effective altruism space even wrote a book to that effect: *Against Empathy: The Case for Rational Compassion*. I worry that seeing the greater good at the expense of seeing other individuals as actual living human beings leads to many dark conclusions getting rationalized into noble acts.
Totally legit worry not only because of the recorded history of atrocities done in the name of the "greater good", but also because that "greater good" is *always* hypothetical while the immediate suffering of others is always real. So, it is not just using others as the means, but actually gambling their well-being or even lives.
This is very spot-on. It's curious how they always talk about their "mental models" of other people, yet fail so miserably at understanding the complexity and diversity of human thinking.
I think the concept of having a “mental model” of another person is anti-empathic because you are explicitly shadowing any real observations and experiences of that person as a living human being with your preconceived analysis of what you think their cognitions and drives are. It’s an exercise in willed premature thought termination that absolves the “modeler” of doing any of the work of actually acknowledging other human beings as such.
i always think of it as the deus ex machina fallacy, mostly because i went to school with film bros and they have similar tendencies to weave narrower and narrower scopes for themselves because they desperately want to be clever but are so scared of writing themselves into a corner. so they either have to confront some parts of humanity/themselves that make them very uncomfortable, or christopher nolan them away

I can relate to this impulse, although it is only particularly prominent when I am feeling isolated and alone.

The user Mydradek posted a great excerpt here that highlights the sorts of things that I get worked up about. I am actually really embarrassed by the things I obsess over, including many of the discussions that rationalists gravitate towards. I don’t think much of it is profound or fulfilling; rather, it is an impediment to thinking about more important things. A part of me thinks it is just a form of escapism from actual existential fears, like worrying about climate change and feeling helpless to stop it. The brain ruminates on the dread long enough that it is no longer immediately aware of the source of the anxiety, just focused on solving any problem it can to get a release. This is the only way I can explain my tendency to indulge in this thinking, because the only time such thinking emerges is when I don’t have meaningful connections or feelings of security, and I become lost in that anxiety and dissociate. Everything feels like a dream and I project my ruminations onto the real world.

I have a weak background in philosophy and logical reasoning, but I do see similarities in how some creationists and rationalists find ways to utilize probabilities to come to confusing conclusions about reality. I wish I had a better grasp on how to analyze and deconstruct the arguments.

Edit: I’m not even trying to hate on anyone. I am just venting.

I mean, if it's any consolation, I also get worked up about abstruse things.
Not on here, bubs.
With LessWrong, a lot of it is very smart and idealistic young fellows who are somewhat on the spectrum and may have slight OCD. This makes them susceptible to basilisk-like ideas, or to generally just burning out on scrupulosity - averting the unfriendly AI is, after all, THE MOST IMPORTANT THING IN THE WORLD. There were a number of examples of this on LessWrong around 2010-2012, whom I won't name here.
To be fair, it requires an exceptional IQ to understand rationalist thought

The basic reason seems more or less to be that so much media - from headlines to blogs to Twitter - is driven by paranoia, because paranoia sells, and people in the rationalist cult don’t know much more than the average reader

Reading a lot of Wikipedia is a poor substitute for knowing anything, in spite of the pretences of rationalism

Blame the Ur-Rationalist: H.P. Lovecraft.

E: no I have no idea.

I don’t have a fully thought out answer here, but I suspect that being super self absorbed is a huge portion of it.

Because they’re cowards who are driven by fear, and one of their greatest fears is rightly being recognized for how dimwitted and mediocre they are.

If people that they think are smarter than themselves are saying how scary AI is/can be, well then they’re going to think themselves into being scared about it, too.

I don’t think the fear and cowardice are unique to rationalists; theirs is just another brand of the conservative white suburbanite fear that is endemic to much of the US.

I think accepting LessWrong and SSC texts requires a lot of sloppiness in one’s critical thinking. When one cannot filter out good ideas from bad ones, but wants to be a profound meta-philosopher, their stream of thought can lead to interesting places.

I know the “wholesome” award doesn’t make any sense, but it was the free award that I had, and I need to congratulate you for the hilarious title and the great points in the following discussion