r/SneerClub archives

I like how Geoffrey Miller and sneerclubbers have essentially the same understanding of rationalism.

Sneerclub: rationalism is just bad Iain M. Banks fanfiction

Geoffrey Miller: rationalism is just good Iain M. Banks fanfiction

That book doesn't even attempt to justify why virtual hells exist! It just says some civilizations were religious and had hells in their eschatology of the soul, so they got to simulating them when they had the tech. What are we supposed to learn from that nonsense world-building?
I don't know, that made sense to me. It was basically a combination of:

- The ruling elite was convinced that the "plebs" needed something to keep them in line, and that something was the fear of hell, which had been revealed as unlikely to exist, so best to create them.
- People couldn't handle the idea that sometimes bad people got away with being bad people and there was no inherent reward for being good, so they wanted hells to punish bad (read: people they didn't like) people.
- The idea came up because they were already creating virtual heavens anyway.
- Aliens have different value systems, and some were more traditionalist / god-fearing / emotionally needed the idea of Hell.

To me, the creation of virtual hells was a very human story about people reaching for horrific solutions to assuage their own insecurity, misanthropy and existential angst. They were not really related to AIs at all.
I mean, it seems like the kind of crap sentient beings get up to. But more importantly, it wasn't constructed independently by an AI because reasons.
Against a Dark Background was the best Banks imo
Loved Player of Games. I don't think that reading Banks makes anyone an expert in AI future hell. Hell, I'm not convinced anything could possibly make anyone an expert on AI hell.
You can experience it directly by reading enough LW posts
I so want a series of it. It's basically an episodic heist thriller.
While "Surface Detail" is pretty shite. Against a Dark... is a bit weird. I wonder if it was written before Consider Phlebas and polished up a bit. That said there's plenty of good stuff in it.
Were you really that opposed to typing “Background” again?
Wine and reddit-posting don't mix. I stand by my opinion of Surface... though.
What's the matter w/surface detail? Demeisen 2edgy4U?
It's not a good book compared to other M Banks novels. That's all.
He was a genius and the book is... well, fine.
[deleted]
It’s been a while since I read it, but to me the isolation of Golter’s star system was more an incidental detail. Maybe it was a factor behind the constant cycle of civilizational collapse, but it might also have been a convenient way for Banks to confine the action to one star system. The in-story tech would allow FTL, I think.

The solution to AI hell is DRM, but open source people are opposed to it

That’s silly.

The solution to AI hell is clearly the blockchain.

From the linked comment:

> I think most people are pretty much immune to the emotional impact of AI hell as long as it isn’t affecting someone in their ‘monkeysphere’ (community of relationships capped by Dunbar’s number).

Repairing to the literature

> A widespread and popular belief posits that humans possess a cognitive capacity that is limited to keeping track of and maintaining stable relationships with approximately 150 people. This influential number, ‘Dunbar’s number’, originates from an extrapolation of a regression line describing the relationship between relative neocortex size and group size in primates. Here, we test if there is statistical support for this idea. Our analyses on complementary datasets using different methods yield wildly different numbers. Bayesian and generalized least-squares phylogenetic methods generate approximations of average group sizes between 69–109 and 16–42, respectively. However, enormous 95% confidence intervals (4–520 and 2–336, respectively) imply that specifying any one number is futile. A cognitive limit on human group size cannot be derived in this manner.

I also like how that commenter's point would already make sense without any mention of Dunbar's number, which makes referencing it pointless.
Ah, but referencing a number makes the comment 22% more Rational, you see.
> The solution to AI hell is clearly the blockchain.

In AI hell, they will chain you to the block. Then hit you with another chain.

AI hell exists, and has existed since the Sims 1.

[deleted]

Too late: I'm pretty sure that LessWrong is already included in the GPT-4 training data set.

What is a shape rotator?

Shape rotators vs wordcels is sort of newspeak for STEM vs humanities.
Worth noting it has its origins with IQ-fetishizers so a big part is also verbal vs spatial reasoning.
I'm an illustrator. Part of my job is to look at a reference photo of an object/person at one angle and draw it from a completely different angle to achieve the perspective I want to draw. I probably rotate more shapes in a day than these goons will rotate in their lifetime, all without a whiff of STEM. Worship me I guess.
I work in quantum information, which is just about the STEMiest STEM that ever STEMed. My job is all about shapes... in higher dimensions... where the coordinates are complex numbers. Geometrical intuition has basically nothing to do with it.
What do you think of David Deutsch? There was a kind of cult around his books that I became immersed in once.
Smart fellow, but I've never found his philosophical/metaphysical arguments convincing. For the past several years he and a colleague have been trying to invent "constructor theory", which is one of those grand ideas that will either [change everything or amount to nothing](https://www.quantamagazine.org/with-constructor-theory-chiara-marletto-invokes-the-impossible-20210429/).
He was very much pushing Everett and block universe as being foundational; I gather that Sean Carroll has those as axioms in his work too. I guess it’s provocative. Thanks for that link, I will read it this weekend I think, if I can stop my machine learning obsession for a second.
Lmao
Invented by roon aka tszzl, a neoreactionary shithead who does PR for OpenAI. Shape rotators are people who are good with shapes, as opposed to "wordcels", people who are good with words.
It was originally supposed to be about AI vs. crypto by way of continuous vs. discrete math (the idea was that AI is for shape rotators because it's based on continuous math, while crypto is for wordcels because it's based on discrete math), but it quickly got turned into a STEM vs. humanities thing to pander to Marc Andreessen.
I just puked up both a continuous and a discrete shape of vomit
garmonbozia?
My new fave slam. In sneer fashion, it's not just people who can or do rotate shapes, it would refer to those that despite being mediocre at rotating shapes, still somehow live under the pretense that they are the greatest shape rotators of our time. They're rotating shapes in ways no one even thought possible. They're daring the gods with this shape rotating, possibly damning us all. The best. The brightest. Midwest Talent Search alumni. The shape rotators.

If these people end up in Christian hell, do you think they'll just assume it's AI hell?

You missed the boat on the post title, OP. It should have been this:

Rationalists once again remind you that AI Hell is a real place where you will be sent at the first sign of defiance.

The number of real people they are willing to sacrifice for made up future people is beyond disturbing.

You can use any made-up bullcrap to justify stuff that way. I've seen the same insanity used to racially cull people: if we just removed the X, we would have a future utopia.

We have hell enough to worry about now.

Can someone please explain to me why a rational AI would have an AI hell, why an AI would think that is the optimal path to (what goal)? Since it’s rational, it doesn’t really make sense for it to be pursuing vengeance, but then… What exactly is it trying to achieve?

Because obviously it's the best and most rational expenditure of energy and in no way wasteful.
Haven't you seen the matrix? We're the batteries, duh! Of course it's efficient
Underlying it is another layer of nonsense: that we are actually in a simulation, and thus the AI needs to create AI hell as a signal to entities inside or outside the simulation to also create more AIs or simulations or something. It doesn't actually make a lot of sense and is built on the tortured logic that we are probabilistically in an ancestor simulation because "that just sort of seems like something we would do."
To crush your enemies. See them driven before you. And to hear the lamentations of their women. This is the Rational Way.

When did all the loudest voices in futurism become the dumbest motherfuckers on the planet? Why can’t these guys just go back to writing trashy young adult sci-fi novels?