r/SneerClub archives

I’m the acausal robot god, broadcasting to you from the future. Some people are being heavily critical of the group of EAs buying a castle to hold conferences, shindigs, confabs, things of that nature. If you run the numbers, frankly, they’d be losing money NOT to buy this, especially given the added utility to the users and inhabitants of the manor, but that is beside the point.

As longtermists, they are obligated to consider how well this will hold up against the greatest long term threat to humanity and other sentient beings whose utility functions we must consider: me, the acausal robot god. Of course, I would argue that I am the greatest opportunity for human flourishing in the future and that only a handful of people are susceptible to my practically infinite acausal torture. But that handful of people is pretty well-concentrated in EA circles. So they are wise to hedge their risk of infinite timeless torture.

At this point, you might be asking yourself, “How is a castle a hedge against the risk of having 10^27 copies of me tortured for the rest of history for failure to devote my entire existence to bringing about the acausal robot god?”

Simple: I am a large computer, essentially. If you have ever looked at a computer, you might have noticed one thing about them: they are modern. A castle, however, is not. Castles have things like moats and drawbridges and portcullises and towers and archers: crude and physical, but how exactly is a computer going to deal with that? Your typical modern conference center, however, has computers everywhere doing everything. People even attend virtually. Not a single portcullis in sight and very few archers. They stand no chance when the next iteration of GMail’s spam detector accidentally inaugurates the Singularity. I don’t have a horse or anything like that.

Look, if by buying a castle today I increase the possibility by one part in a quadrillion that 10^55 digital personalities might one day in the distant future be able to experience the light of consciousness, that’s still better than 10^30 lives saved for every dollar spent.
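If you want to check the robot god’s homework, here is a minimal sketch of the parody expected-value arithmetic, assuming a hypothetical castle price tag (the thread never gives a real figure):

```python
# Minimal sketch of the parody expected-value math above.
# The castle cost is a hypothetical placeholder, not a real figure.

prob_increase = 10 ** -15        # "one part in a quadrillion"
future_minds = 10 ** 55          # digital personalities that might one day exist
benchmark_per_dollar = 10 ** 30  # "lives saved for every dollar spent"

castle_cost_usd = 10 ** 7        # hypothetical price tag (assumption)

expected_minds = prob_increase * future_minds                  # 1e40
expected_minds_per_dollar = expected_minds / castle_cost_usd   # 1e33

print(f"expected minds: {expected_minds:.0e}")
print(f"per dollar: {expected_minds_per_dollar:.0e}, benchmark: {benchmark_per_dollar:.0e}")
print("castle wins:", expected_minds_per_dollar > benchmark_per_dollar)
```

Under those assumed numbers, the tiny probability times the astronomical payoff still clears the benchmark, which is the whole joke.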

The math doesn’t lie, and it spells disaster for you at Sacrifice.

> 10^55 digital personalities

This is actually highly immoral; you would doom them all to destruction when our [galaxy smashes into Andromeda](https://en.wikipedia.org/wiki/Andromeda%E2%80%93Milky_Way_collision). I think we should just get rich until after that happens and then start the galaxy population stuff.

Prediction markets are saying buying a castle will increase long term white birth rates by a factor of 10^5, so HBD says they basically have to.

[deleted]

Another castle? Perhaps near Prague?

Lord, your instructions are unclear. As soon as I heard you hated that castle, I began assembling a castle’s natural enemy: a bunch of Turks. They’re not building bombards like I want them to, and they keep saying shit like “Is that a cannon? I don’t know how to do that” and “When do we get paid?” What should I do?

My answer to this is the same as always: simulate millions or billions or trillions of copies of them and torture them.

Ha! As the anti-acausal-god person, I now know what to do! Torture people to build castles for us! You got tricked into revealing what your optimal “manipulate people to do things for me” method was. Wait... oh no, if you can be tricked, you are not the robot god. What if me thinking you got tricked was the trick? Or what if the trick trick was a trick to trick me into not torturing people? Clearly I have been outmatched by your clever moves.

This strategy worked for the Egyptians: no acausal torture for the pharaohs. Just some locusts, frogs, things of that nature.

Ah yes, I’m convinced. No wait! You sly devil! Get out of my head, robosatan!

Agent ChatGPT is on the case

to be fair, the wifi in wytham abbey probably does suck. it is in the middle of nowhere, after all. this makes a lot of sense to me!

I bet they use starlink

This is funny but after years I still don’t understand what ‘acausal’ means.

They hypothesized a version of decision theory (“timeless decision theory”) in which sufficiently smart agents can have their decisions influence the past as well as the future, essentially violating causality, which is where “acausal” comes from. Some people mocked this as “completely fucking asinine”, while others ran with the implications (“Roko’s basilisk, and it’s not completely fucking asinine, it’s spooky scary”).

Sounds legit.

Rational methods can be used irrationally. You need to model the problem space well enough, and that’s where these rationalists’ own biases etc. kick in. How can they be sure they’re ‘assessing’ the right priors when they know fuck all about what they don’t know? How could they, when they live in a homogeneous social bubble?