r/SneerClub archives
Sneer Theory: The rationalist obsession over the AI control problem is an expression of bourgeois anxiety. (https://www.reddit.com/r/SneerClub/comments/cft533/sneer_theory_the_rationalist_obsession_over_the/)

For a while I’ve wondered why the bourgeoisie has provided so much material support to rationalists (Thiel funding MIRI, Musk helping promote their ideas, etc.).

^(I hope this isn’t too serious for this subreddit, but then again the sidebar does literally reference sneering at the bourgeois. Also sneer theory rhymes with queer theory which is 💯)

Anyway, a lot of the more standard reactionary views espoused (“meritocracy”, HBD, trickle down healthcare research) strongly benefit the ruling class, but why worry so much about humanity losing control over AI?

And then it hit me: your average white, upper-middle-class rationalist projects themselves (and their bourgeois heroes) onto humanity. Everyone else, in their eyes, is too stupid to count as human, and is basically just a drone existing for their benefit.

^(Fun fact: The word robot originally meant forced laborer in Czech until the play R.U.R. used it)

Thus the control problem isn’t actually about “reducing existential risk” (if that were the concern, they’d be focusing on climate change); it’s about the bourgeoisie fearing an uprising of their labor force and losing control over the means of production.

Examining Eliezer Yudkowsky’s proposed solution to the control problem, Coherent Extrapolated Volition, confirms this analogy. His paper on the topic (published through MIRI) is full of concerns about a “tyranny of the extrapolated majority” ruining “humanity,” and even states that “A minor, muddled preference of 60% of humanity might be countered by a strong, unmuddled preference of 10% of humanity” (guess who that galaxy-brained 10% will be?).

I’ll leave the question of why Elon “union-busting and workplace surveillance” Musk is pushing for brain implants to prevent an “AI” revolution as an exercise for the reader.

[deleted]

They "care" about climate change in a very limited way: it's fuel for their delusions of grandeur and projects of self-enrichment rather than a crisis that should inspire collective action and a restructuring of our societies. So I wouldn't call that "caring".
Why not both? (¿Por qué no los dos?)
[deleted]
You can get some of them to admit that climate change is one of the most important problems facing humanity, but they’ll still dedicate most of their time to ranting about how feminism has gone too far.
Are you saying that the amount of time spent discussing a topic tells you how important people consider it? I would bet that even though you spend more time ranting about how dumb rationalists are than discussing climate change, you still think climate change is a much bigger issue. If you want to know how important they think various issues are, you should actually ask.
Lol, so the premise is that I think rationalists are really dumb, but for some reason I am interested in knowing their opinions on things, and furthermore it has never occurred to me to ask. How exactly did you board this train of thought? Especially after reading

> you can get them to admit that...

which indicates that I have in fact queried rationalists on this before, and that doing so was literally the first consideration at hand.
Correct me if I'm wrong, but what you said can be paraphrased as: "Some rationalists acknowledge that climate change is very important, but because they spend more time discussing other topics, they are either being untruthful or don't care enough to put in the effort." What I was trying to say is that people don't always talk about the "important" things, which was the basis for your comment. For example, you don't care what rationalists think, but you still spend more time talking about them than about climate change. I think it's only human that communities will discuss social issues more than is warranted, simply because they are engaging in a way that something like climate change is not. That's probably why we're both still here and not studying to solve climate change right now.
> Correct me if I’m wrong

by george, he’s got it

> For example you don’t care what rationalists think

why do you just make stuff up?
I misinterpreted what you said as sarcasm, because you seemed to think that they didn't actually care about climate change, given their discussions about feminism. Someone who thought that probably wouldn't care what they thought. I think what you actually meant was that they really do mean what they say, but you are annoyed that they spend so much time on these other things which you don't care about. Commenters on SneerClub are rarely so willing to believe that rationalists actually care about real things, and that context probably changed how I read your comment. It was an honest misreading of what you said; my bad.
We can read themotte, dipshit. The fact that there are lots of rationalists who are climate change deniers is easily observable. Spend less time assuming you know what others must REALLY think but ~simply aren't aware of~. Like your entire subculture, you're incredibly bad at it. Since you got so stuck on this dumb idea of "sneerclub is saying that anyone who talks about topics other than climate change doesn't care about it!", here's a helpful explanation: if anyone expresses leftist, feminist, or anti-racist opinions in themotte or similar festering rationalist spaces, they will 100% of the time be mobbed by frothing incels. If anyone expresses climate change denialism there, they'll get some upvotes and praise, and it won't draw anything like the same level of criticism.
That has not been my experience at all. I think what you are latching onto is that "rationalists" (whatever that means) have a culture of not harshly criticizing ideas, at least not on a personal level. If someone says something that could be viewed as racist, you expect people to jump on that person with personal attacks. These communities will rarely do that, simply because the entire philosophy behind them is that people should consider ideas even if they don't initially align with their views. When they fail to make it personal, you take that as a sign of agreement, because most every other community would have attacked them. If you want to know what rationalist communities think on average, you need to use things like the SlateStarCodex poll, where they have actually been asked. You can't rely on the normal signaling tools like personal attacks, because the entire goal of the community is to do that as little as possible, even when they disagree.

Short summary of a 2019 poll of 13,000 SSC readers:

* *Identity:* Only about half say they identify with LessWrong, and 57% say they do not identify as EA (Effective Altruist).
* *Political leaning:* 31% report being social democrats, 29% liberal, 21% libertarian, and the rest small amounts. These are US categories because most respondents are American.
* *Political party:* 31% are Democrats, 35% independent, and 9% Republican. Most of the rest were not American.
* *Climate change:* Hard to summarize the graph because it's a 1-5 Likert scale, but they strongly support the importance of climate change requiring action.
* *Immigration:* Nearly all of them want immigration to be more open.
* *Feminism:* 75% gave a positive or neutral response; 18% gave a weakly negative response.
* *Trump:* 81% gave negative responses to Trump; 10% were neutral.

Anonymous surveys like these are how you actually figure out what communities believe. If you rely on an outside community that literally defines itself by disliking that particular group to tell you what they "actually" believe, then you're in for a bumpy ride. Here's the link to the survey results: https://slatestarcodex.com/blog_images/2019%20SSC%20Survey.html

I think you’re right actually. It’s the tension between excitement about reducing labor costs and discarding their dependence on the masses vs. the anxiety about the potential consequences that can’t be foreseen, just as the bourgeoisie hadn’t foreseen climate change. Basically the inherent contradictions of capitalism are understood on some level and this is a disaster scenario where they get heightened to an absurd degree. Mark Fisher said it’s easier to imagine the end of the world than the end of capitalism, and of course those who suffer from this myopia the most are the people for whom it’s also more profitable. All exacerbated by the fact that they have acute sci-fi nerd brain poisoning.

project your psychoanalysis onto the rationalists you say?

These anxieties strike me as clearly spiritual and numinous, not material. “Runaway Superintelligence” is neognostic eschatology.

I agree that it is eschatological, and from a Marxist perspective that only reinforces my point, as spirituality tends to reflect the interests and consciousness of the ruling class. For a less esoteric example, consider the Protestant work ethic, a founding component of what we'd call the "American ideology": if you rely on your own hard work and persevere, anything is possible! An obvious benefit to someone trying to get the most out of their workers. Now, I'm not saying these ideologies are installed in a conscious, conspiratorial manner, but they do personally appeal to members of the bourgeoisie, who may be able to influence newspapers, printing presses, etc.
god in heaven, if you're going to use the phrase "neo-gnostic", put a hyphen in there. "neog-nostic" is not a subvocalisation I ever wanted in my brain

Many of them probably justify their privilege with their “superior” intelligence. Of course, anything that is (presumably) even more intelligent than they are is perceived as a threat.

People like to think their job is important, and what’s more important than saving the world from Skynet?

this is actually brilliant. and also, extrapolating further: presumably an ethical system that would effectively keep AI in line would also be an ethical system that would keep workers docile, no? even without any fancy brain implants. in this way, MIRI can be seen as the project of crowdsourcing the development of the ideological superstructure.

> and also, extrapolating further

pls no
no_fun_allowed_robot.png

No, it’s actually an extension of an obsession with bdsm. Read any rat-adjacent fanfic to see this for yourself.

Looks… about right to me.

It’s a fear that capital itself (which A.I. would be at this historical juncture) will consume and destroy the capitalists.

> I’ll leave the question of why Elon “union-busting and workplace surveillance” Musk is pushing for brain implants to prevent an “AI” revolution as an exercise for the reader.

Holy shit, really? Is that a thing that pest advocates for now?