Yeah, it'd be awesome if fallible human beings with imperfect knowledge were tortured for eternity because of temporary wrongs. BTW my name is Jehovah.
What would the alignment be of an AI that did everything Greater Yud does? Pretending to work on humanity-saving research while actually solely motivated to accrue and maintain a legion of sycophants and disciples to boost them online?
We’re already in AI takeoff. The “AI” is just running on human minds right now.
So yes, finally they get how corporations (with their goals of ‘profit uber alles’) are an inhuman monster. Right?
What kind of thing is wokism? Or Communism? What kind of thing was Nazism in WWII? Or the flat Earth conspiracy movement? QAnon?
oh no.
E: also, look at all these people possessed by hyperreal ideas.
r/selfawarewolves
Most folk can’t orient to AI risk without going nuts or numb or splitting out gibberish platitudes.
Come on… projection!
(Like holy shit, I think the point they might be trying to make isn’t even that bad (if you assume all forms of organization are likely to develop anti-human goals but pro-survival goals of their own, and you worry about unfriendly AGI, then you should focus on more than just building friendly AGI, because you need an organization to build an AGI (their fix is to work on Rationality then, but imho this is just as flawed, since Rationality is just another organization)) but oh god it is so badly written (The whole post also reads as a parody, so this is you focusing on rationality, aye, how do you think it is going?)).
First part of it I was thinking "yes, this is what everyone else noticed immediately and why we make fun of the AI-risk nuts," but he never quite makes it there.
And *of course* the first example of an x-risk he gives is "wokeism".
Not just weird jargon, but also weird mystical ideas about sentient thought-forms. (Which, yes, can be a funny metaphor to use, but it doesn't make things clearer, and I worry they forgot the 'it's a metaphor' part (and it can also be used to sneak in unstated assumptions).)
They have gone so far into Rationalism the 'rationality' counter overflowed. At least the Sandman comics were well written.
(My remark about how organizations want to survive also isn't totally correct: there are also organizations that would (due to their setup or other reasons) end if nothing happened, but survive because people are invested in the idea the organization represents (or rather, that investment causes people to do things which make it survive). But all this has nothing to do with egregores, and this weird talk just makes an already complex subject even harder to grasp.)
The comments are full of some of the dumbest takes as well:
At some point (maybe from the beginning?), humans forgot the raison d’etre of capitalism — encourage people to work towards the greater good in a scalable way. It’s a huge system that has fallen prey to Goodhart’s Law, where a bunch of Powergamers have switched from “I should produce the best product in order to sell the most” to “I should alter the customer‘s mindset so that they want my (maybe inferior) product”. And the tragedy of the commons has forced everyone to follow suit.
Mind boggling that anyone could say this with a straight face.
I came here with the exact quote in my clipboard.
This take is so incredibly naive that it's amazing. I would have thought it was an unfair caricature of a libertarian if I hadn't seen the comment myself.
Could you elaborate on why you find the take dumb? Is it the part that assigns raison d'etre to capitalism, or the part about what went wrong with it?
I apologize. I am not presently as anti-capitalist as other people on this sub, so I genuinely wish to understand.
Seeing capitalism as something that was corrupted, rather than as something that was always flawed, is pretty weird.
Especially considering stuff like the robber barons.
a) I’m not particularly anti-capitalist, but I still think it’s dumb. Markets exist to provide profit to suppliers while providing goods to consumers. That’s it.
b) To the extent the “better mousetrap” argument applies, it applies not specifically to capitalism, but to market based economies in general.
c) Even theoretically pure market economies don’t produce the highest quality goods. They produce a price for a good where the supply and demand curves meet. If there is more demand for cheap goods than high-quality goods, more of the cheap goods will be sold at a lower marginal profit.
d) Any consumer good that is essentially a commodity can only increase profit by lowering the cost of production or increasing demand.
e) A consumer good that is actually indistinguishable from others of the same kind can only compete with the others based on marketing.
f) You can’t make Budweiser much cheaper, so you better market the hell out of it.
g) In a world where the market presents a relatively low barrier to entry for a large number of competitors selling what are essentially commodified consumer goods, of course the vast majority of competition is through marketing.
h) So the “better mousetrap” argument really only applies when you have figured out how to make a higher quality good, which must also be perceivable as higher quality, for the same price as lower quality goods, at scale.
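Point (c) can be made concrete with a toy worked example. The linear supply and demand curves and all the numbers below are entirely made up for illustration; this is just a sketch of "the market clears where the curves meet, not where quality is highest":

```python
# Toy sketch of point (c): a market clears where supply meets demand,
# not at maximum quality. All curves and numbers here are hypothetical.

def equilibrium(demand_intercept, demand_slope, supply_intercept, supply_slope):
    """Solve a - b*p = c + d*p for the clearing price p and quantity q."""
    price = (demand_intercept - supply_intercept) / (demand_slope + supply_slope)
    quantity = demand_intercept - demand_slope * price
    return price, quantity

# Cheap commodity good: lots of demand, easy to supply.
cheap_price, cheap_qty = equilibrium(100, 2.0, 10, 4.0)

# Higher-quality good: less demand, costlier to supply.
quality_price, quality_qty = equilibrium(40, 1.0, 0, 1.0)

print(cheap_price, cheap_qty)      # 15.0 70.0
print(quality_price, quality_qty)  # 20.0 20.0
```

More units of the cheap good move at a lower price; nothing in the clearing condition itself rewards quality.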
The part that assigns a _raison d'être_ to capitalism is sufficient on its own. "Capitalism" is an after-the-fact label applied to a set of closely related social and political systems. It has tendencies, but not a literal reason for existing. You don't even have to be anti-capitalist to recognize that (but it helps!).
If you read a music review that started with "When Nirvana invented grunge ...", you'd stop reading.
But capitalism does exist for a reason: it's yet another social/legal system for transferring wealth from creators to owners. The ridiculous bit of the quoted comment is not that they think capitalism has a purpose, but that they think its purpose is, or ever was, to "encourage people to work towards the greater good".
Sure, that bit's also ridiculous.
Maybe I'm splitting hairs here, but it's Reddit, so hey: I think there's a useful distinction between "capitalism arose because of these historical / social / material conditions" and "capitalism has a reason for existing." In the quoted post's context, it's clear that the author thinks capitalism was invented in a lab or an academy or an elite forum and then distributed to the masses. That is empirically not the case.
> it's clear that the author thinks capitalism was invented in a lab or an academy or an elite forum and then distributed to the masses
I love this sub. Maybe we can improve capitalism's DNA via iterative gain-of-function if we do take it to the lab, so never say never!
But you can always say that the reason something widespread exists is that no alternative emerged to replace it. Then you can speculate as to what are the reasons that no alternative emerged, and even narrow it down to one reason that seems overwhelmingly important. Or debunk those theories if they are wrong or too simplistic. But to dismiss any reason assignment as laughable seems odd.
It seems you're not attacking "Nirvana invented grunge" but rather "Grunge emerged for a reason", and the latter seems much harder to attack.
Let me continue with the grunge analogy for a bit.
What we call "grunge" is a label that was applied by critics and publicists *after the fact* to an existing body of work: the Sub Pop sound, the Seattle college radio scene of the late 80s, etc. None of the acts we associate most closely with grunge - Nirvana, Pearl Jam, Alice in Chains, etc - set out to create a new sub-category of rock 'n roll. They just wanted to sound like Sonic Youth's late period or Gang of Four at their best. They wanted the power chords of stadium rock and the dissonance of post-punk hardcore.
So while one could say "grunge emerged for a reason" - as a contrast to the glam rock of the mid-80s - that's different from saying it has a "reason for being" (a *raison d'être*). It's not like Krist Novoselic, Eddie Vedder, and Chris Cornell got together over coffee and said, "yes, let's revolutionize rock 'n roll."
Calling the social and material factors that caused a cultural change a "reason" is - for me, and I may be hair-splitting - a step too far. It implies to me either that there's a mastermind behind it, or that social and material factors have an intelligence to them.
To bring it back to the original item: for me, there's a difference between saying something like:
"Capitalism exists because the middle class of Europe amassed enough wealth to exert political power"
and
"Capitalism exists to encourage people to work toward the greater good in a scalable way"
The former is a debatable claim based on social and empirical forces; the latter is post hoc rationalization, *even if it were true*.
Without diving headlong into the history of capitalism, I think a reasonable baseline definition is a system focused on free exchange of goods and services; from that comes the Adam Smith definition of a marketplace with many buyers and many sellers whereby inefficiencies are weeded out.
Nothing about this definition, or any practical application of it, has ever focused on scalability or the greater good as explicit goals, though they are generally included in the list of theoretical side effects (while in practice we’ve seen repeatedly that capitalism builds towards monopoly and extraction).
It’s wild to me to see someone claim that it’s just “bad actors” who fall prey to a “tragedy of the commons” when it’s built into the very system that one would want people to buy your shit instead of someone else’s regardless of the quality of either product.
Yeah, that's definitely a book that I am glad I read in a structured setting. No shame if you can't get through it. I was just pointing out one of my personal peeves with the rationalists: the insistence that they are experts in various fields despite clearly having consumed only a very biased selection of the literature.
One of the most disappointing things about the rationalsphere is that, for all their supposed brilliance, none of them can imagine a way of getting people to change their behavior except to punish them in a Christian-like Hell.
- The writing at LessWrong is starting to sound like Scientology – so larded with made-up pseudotechnical terms that only people in the cult can understand it.
- The idea that existing self-aggrandizing institutions or dynamics might be the real risks and AI-risk talk is missing the real danger – that’s an old one, except usually people recognize that capitalism is the real paperclip maximizer. I collected a bunch of writings on this http://hyperphor.com/ammdi/AI-risk-%E2%89%A1-capitalism
- The idea that “wokism” is going to eat the world is a product of the hysterical white-guy resentment that is the default politics of the rationalist sphere, spearheaded by Scott Alexander. It’s also where rationalism intersects with very vanilla right-wing Fox News propaganda efforts. There is no reason at all to take such people seriously; they are themselves cogs in a political machine, and must be defeated politically through organization.
> - The writing at LessWrong is starting to sound like Scientology – so larded with made-up pseudotechnical terms that only people in the cult can understand it.
the above has held since it was still on Overcoming Bias
> usually people recognize that capitalism is the real paperclip maximizer
except Bostrom himself, because of course not
> usually people recognize that capitalism is the real paperclip maximizer
>
> except Bostrom himself, because of course not
*A haiku appears in the wild.*
Hard agree, today’s transhumanism reads so much like late-antique Neoplatonism.
Iamblichus’ theurgy is imo just the same; the guy is arguably even a proto-dialectical materialist.
Shoutout to the podcast "The Secret History of Western Esotericism".
This seemed particularly egregious. I never know whether I should feel stupid for not understanding what the shit they’re even talking about half the time, or smart for understanding that they are kind of bad at communicating it.
the whole anti-racism woke wave, etc. This is people getting possessed.
like, I don’t have to even translate that “woke” literally means “aware of systemic racism,” he knows this and that’s the precise thing he hates about it, and is sure only memetic possession could make someone not be racist.
I also highly recommend the comments for the doubling and tripling down on the lesswrongism.
I for one, welcome the anti-racist superintelligence
The rest of my argument is left as an exercise for the reader.
The basilisk is here and it grew up in SRS.
What is wokism?
The belief that an ordinary skillet is just insufficient.
Sounds to me like this guy’s talking about an egregore. Successfully reasoned so hard he reinvented a concept from occultism.
‘everything that’s scary and I dislike is artificial intelligence’
Anyone I disagree with is an NPC
Everything I don’t like is x-risk.
“The sleep of reason produces monsters.”
“Hold my beer.”
I just think it’s objectively funny I’m seeing Tim Morton references on LessWrong
It’s just Dawkins recycled with contemporary targets.
Could have saved a lot of time by just explaining the original definition of “meme” to this guy