r/SneerClub archives

Do people have some theory around continuity of personal identity that makes them fear being uploaded against their will? Like it seems to reduce to absurdity to consider space- and time-separated events being tied to my selfhood, since there could be multiple copies made at once. Rather than make me afraid, it makes me think the relation between spacetime events and identity is not something you can fuck with as a physical entity. Sorry to tempt the basilisk.

I’m not an expert on rationalist lore, but I think the idea is that you can’t be sure that you aren’t already the simulated one.

Yeah, but if I'm a simulation, that implies that there are probably already copies of me. And I know precisely how much their experiences impact me: zero. Whether the other ones are instantiated in the 9th circle of ultimate pleasure or subject to unspeakable tortures makes no difference to my life whatsoever. So why would I care whether there's 10 of them or 10 million? Or whether they're being simulated now, or a thousand years ago, or whenever?
Or I guess if we're supposed to get utilitarian about those copies, then the suffering of conscious entities is supposed to matter equally whether or not an observer is close to them in space or time. But in that case, why does it matter that it's "me"? From this pov I should care the same if a million copies were made of me vs. a million fresh minds conjured explicitly to be tortured.
Or say, a million poverty stricken humans living right now on this planet... Actually never mind.
I think it goes like this: I have a choice to do what the AI wants or not. If I'm real, then of course the AI has no way of punishing me. But if I'm simulated, then choosing wrong sends me to virtual hell. Since I can't tell beforehand which of the situations I'm in, this incentivizes me to help the AI.
So this is the AI version of Pascal's wager?
Precisely, except, somehow, with less empirical evidence.
At the very least it suffers from the same sort of rebuttal: what if I'm simulated by an AI Satan who didn't want the AI God to be created? In any case, it's probably just that I don't have a high enough IQ to understand Yud's decision theory, but for me:

* If I'm a simulated being in a world controlled by an AI, then I'm kind of boned anyway.
* After all is done, why would an AI follow up on anything? Seems like a waste of resources.
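For what it's worth, the wager structure being sneered at here can be written out as a toy payoff matrix (my own framing; the payoffs are arbitrary placeholders, not anything from the thread), which also makes the "AI Satan" rebuttal easy to state:

$$
\begin{array}{l|cc}
 & \text{I'm real} & \text{I'm simulated} \\
\hline
\text{Help build the AI} & -c & -c \\
\text{Refuse} & 0 & -\infty \ (\text{virtual hell})
\end{array}
$$

As with Pascal's wager, the single unbounded penalty does all the work: for any finite cost $c$ of helping, "help" wins in expectation as long as you assign nonzero probability to being simulated. And as with Pascal's wager, positing an AI Satan that punishes helpers instead just mirrors the matrix and cancels the argument.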
I suppose the idea is that for every simulated copy of you the odds of being real go down
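A rough sketch of the arithmetic behind that claim, assuming (as the argument needs) a uniform self-location prior over copies that are subjectively indistinguishable from you; the framing is mine, not anything stated in the thread:

$$
P(\text{I am the original}) = \frac{1}{N+1} \;\longrightarrow\; 0 \quad \text{as } N \to \infty,
$$

where $N$ is the number of perfect simulations running alongside the one physical original. Whether that number should move you at all is, of course, exactly what the comments above are disputing.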

[deleted]

Is he suggesting it will be grafted onto my current experience or that I should care in the same way I would care about being tortured by my carer when I have advanced dementia?
[deleted]
It's carefully obfuscated nonsense. Provided all of your points of reference believe it isn't nonsense, then any inability to make sense of it is because you're missing something and probably dumb and unlikable and ugly too, meaning that you can't point out that it's nonsense without being subject to social ostracism. It's why most cults will actively try to cut you off from outside influences. One of the things unique about rationalism is the way they manage to do this primarily through obfuscation and a perception of trustworthiness rather than through directly forbidding or preventing contact with outsiders. Rationalism develops its own vocabulary very quickly in the indoctrination process, and trying to talk about any adjacent subject with someone who hasn't learned the same vocab (e.g. most non-Rationalists) is incredibly difficult.
[deleted]
In addition, the quantum no-cloning theorem is a concise rebuttal of Yudkowsky's theory of the case as described here. It is impossible, even in principle, for the AI god to know what the quantum state of your body was (at any point in time), and so your body cannot be duplicated in a simulation with perfect quantum accuracy. Yudkowsky probably rejects the no-cloning theorem, though, because he incorrectly believes that computation is magic and that everything is knowable with sufficient effort. And also he doesn't know math.
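For reference, a compressed sketch of the standard no-cloning argument (my summary of the textbook proof, not something from the comment): suppose a single unitary $U$ could copy arbitrary unknown states onto a blank register $|0\rangle$,

$$
U\,(|\psi\rangle \otimes |0\rangle) = |\psi\rangle \otimes |\psi\rangle,
\qquad
U\,(|\phi\rangle \otimes |0\rangle) = |\phi\rangle \otimes |\phi\rangle .
$$

Taking the inner product of the two equations and using the fact that $U$ preserves inner products gives $\langle\phi|\psi\rangle = \langle\phi|\psi\rangle^{2}$, so $\langle\phi|\psi\rangle$ must be $0$ or $1$: only identical or orthogonal states can be copied, and no device can clone an arbitrary unknown quantum state.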
I don't see what the no-cloning theorem has to do with Yudkowsky's argument. Brains aren't quantum computers. That's not to say that quantum effects don't play *some* role in how the brain works, but given how desperately quantum states want to decohere, my suspicion is that 99.99% of the "state" of the brain is classical. I don't buy that if you only copied the classically observable features of the brain you'd wind up with a radically different brain as a result.

I'm willing to concede that brains are cloneable in theory, or even simulatable in silicon, again in theory. This whole "continuity" thing also doesn't bother me much. I'm already a Ship of Theseus; I have no particular attachment to which molecules at the moment make me, me.

Where I get off the train is at "cloned / simulated copies of you are you, and you should care about them just as much as your meatself". I think u/backgammon_no pretty succinctly explained [why that's not the case](https://www.reddit.com/r/SneerClub/comments/138ykte/comment/jj09m9k/?utm_source=reddit&utm_medium=web2x&context=3). Care about them in some sense? Sure. I think torturing cloned or simulated minds is deeply unethical. But cloned / simulated minds *obviously* have a unique identity from the moment they are cloned onwards. What happens to one does not affect the other. There's no psychic link that somehow passes the experiences of one to the other. They are no more "the same person" than two identical twins are "the same person".
My (incomplete, I now know) understanding of Yudkowsky's thesis was that it went something like "the AI god will make a version of you that is identical to you at the quantum level and so it IS you". There are many good reasons that this logic is wrong (some of which you have summarized), but the most concise one is the no cloning theorem. Like, if you need to choose *just* *one* *thing* to say in order to point out that Yudkowsky is full of shit in my retelling of his thesis, it's that what he's describing is physically impossible. No need to go into philosophy or the thermodynamics of computation or whatever; what he's saying is plainly contradicted by math and so one's bullshit detector should immediately be tripped. I think this is valuable to point out because his thesis is so stupid that it really doesn't merit the kind of thoughtful rebuttal that you have in mind. Better to dispense with it quickly.
This conversation reminds me of when I was 16 and read Frank Tipler’s The Physics of Immortality, and was very excited about his idea that we could build heaven by creating a massive computer powered by a collapsing black hole at the end of the universe to simulate all people so well that they are the people, and could give them experiences so amazing our physical nervous systems couldn’t handle it.

It has been decades since I thought the idea made sense, though.

The whole rationalist/singularitarian/etc philosophy, like all(?) religious beliefs, is at its core an elaborate defense mechanism against having to consider one’s own mortality. Ray Kurzweil’s beliefs are at least partially motivated by not having been able to come to terms with his father’s death, for example, and there’s also Yudkowsky’s whole “deathist” thing / obsession with cryonics.

In actuality, of course, even if you do somehow upload your brain into a computer, you will never be able to escape living inside your own brain. You will still have to experience death. But of course this is a terrifying thing to realize (it freaks me out too! I’m not claiming to be immune here!) so it helps to have a belief that somehow things will be able to continue. In this sense the whole belief in ‘continuity of personal identity’ through brain uploads is just a sci-fi-flavored version of the normal belief in life after death (afterlife, reincarnation, etc).

IIRC they do have some particular dogma that justifies this and forms part of the basis for the Basilisk (something like “you should consider simulations of yourself to be just as much you as you yourself are”) but I forget the specifics.

> even if you do somehow upload your brain into a computer

I always picture walking into the upload center with a bit of a hangover. They put a helmet on me, there's some beeps and boops, and then my face appears on the screen. It says "thanks! I'm uploaded and now you're immortal". I'm like, that's great, I still have a hangover and in half an hour I'll be back in my shitty apartment. As the years drag by I'm supposed to be comforted that there's "another me" frolicking in the matrix? I'm still here in my aging body.
When my copy appears on the screen, the technician asks her what they should do with the meat husk now that my mind has transcended such base material. I'm like, hey, I'm sitting right here! I can hear you, you know.
I cannot remember how the fuck it was supposed to make sense, because the only thing I can remember is Yud saying "atoms do not work like that" over and over again, but I recall the conclusion being that when you are accurately simulated in the matrix you experience some kind of bizarre macro-superposition, in which you simultaneously experience being in your body and being in the computer, a superposition that won't end until the atoms of your body vibrate into pure binary data.
That is so fucking stupid, I find myself almost unable to sneer. Almost.
San Bernardino is beautiful tho
San Junipero?
Never been
U should go, it’s heaven on earth.
> The whole rationalist/singularitarian/etc philosophy, like all(?) religious beliefs, is at its core an elaborate defense mechanism against having to consider one's own mortality

I'm not a rationalist or a singularitarian but I do think it's plausible there would be continuity of consciousness in a mind uploading scenario, for reasons I would describe as philosophical rather than religious--David Chalmers has some good arguments for this involving thought-experiments about *gradual* replacement of neurons by functionally identical artificial substitutes, and asking what would happen to your subjective experience as this was happening.

I don't think there's really any way to answer questions about consciousness or identity over time without appealing to non-empirical philosophical assumptions--the view that continuity of consciousness depends on material continuity is just as much of a philosophical view (and it also may be hard to square with the fact that the vast majority of molecules in the brain get replaced by new ones every few years). If Yudkowsky espouses some kind of eliminative materialism, then to be consistent he should say there is no "real truth" about continuity of identity, it's simply a matter of preferred definitions, which don't correspond to [natural kinds](https://iep.utm.edu/nat-kind/) so there isn't really an objectively right or wrong answer. But I haven't read the sequences so I don't know if he ever says anything along these lines or anything that clearly contradicts it.
I think in a brain scan/upload scenario, it's probably true that the person who "wakes up inside the computer" experiences continuity of consciousness, but what then do we make of the physical person who's left over? They definitely also have continuity of consciousness! This is different from gradually replacing your mind because the physical you is still left over having a subjective experience as well. Unless we're assuming the original person is destroyed somehow by the upload process, I suppose, but in that case the physical you still does die (though the distinction becomes more philosophical in a star-trek-teleporter-problem type way) (I'm also not a rationalist and have not read the sequences, so I'm not sure of the specifics of their argument either lmao)
> I think in a brain scan/upload scenario, it's probably true that the person who "wakes up inside the computer" experiences continuity of consciousness, but what then do we make of the physical person who's left over? They definitely also have continuity of consciousness!

To me what seems most plausible is that there can be a branching of the subjective flow of experience, where both branches are continuations of what was previously a single sequence of experiences. I suspect that people often reject this option because they connect questions about subjective flows of experience to the notion of "personal identity", and the word "identity" ordinarily is understood to imply numerical identity, but to me this seems like either a linguistic convention or perhaps a legacy of [substance metaphysics](https://plato.stanford.edu/entries/substance/)--if you think of the flow of experience as a [process](https://plato.stanford.edu/entries/process-philosophy/) rather than properties of a single "being", then I don't see a problem with a process that branches in two in this way. (Chalmers has a [paper](https://consc.net/papers/singularity.pdf) on various ideas associated with 'the singularity' which includes a discussion of 'uploading and personal identity' on p. 40; he discusses the possibility that there can be 'fission' in identity starting on p. 43, and he at least doesn't seem to find the idea incoherent even though this must be a notion of identity different from numerical identity, noting Derek Parfit's earlier thought-experiment about splitting the two hemispheres of the brain and transplanting them into different bodies.) I also tend to favor some version of the many-worlds interpretation of quantum mechanics, so that also helps make the notion of branching subjective experience seem more natural to me.
I think you're arguing against a point that I'm not making
Ah, so when you said the upload "experiences continuity of consciousness", did you mean its experience was a genuine continuation of the biological brain before the scan, not just that it would have a continuous stream of experiences starting from the point it "woke up" in the computer? A lot of people use the persistent existence of the original to argue that the upload is a new being that just has false memories of things it didn't really experience, so I may have jumped to the wrong conclusion that you were making that type of argument.

In the original formulation, no brain scans are required. The Singularity is so smart it can simulate you, down to the neuron, just based off (*waves vaguely at everything*) the historical record of you. No need for frozen heads in a jar or anything; I guess the Singularity can just trace back through thousands of years every molecule that used to be you and figure you out that way. It’s downright bonkers even by extropian standards.

I suppose a powerful enough program creating different simulations of everyone as fast as it can would eventually simulate everybody who ever existed. But also, presumably, an infinite number of plausible-looking people who never actually existed. But how would the singularity tell the difference between me and the guy with the same name and birthdate as me who once lived four blocks away from me?

There are three sorts of angles on it. The first is if you’re a perfectly benevolent utilitarian. Then any simulations being tortured is unacceptable; it doesn’t matter if they’re of you.

Then there’s a weird kind of ingroup utilitarianism some of them seem to do, where for some reason (which they claim isn’t racism) you should care more about people who’re similar to yourself. In that case it’s really bad if it’s someone essentially identical to you being tortured.

The third group are those who believe that the simulation of you is in fact you, just as you’re suggesting. That one is weird but not the weirdest take on consciousness that I’ve ever heard.

Everything is irreducibly weird relative to our cognitive capacity I think, and it’s easier to pretend we have a grand solution than to accept that
Yud and gang seem determined to put themselves forward as proof that honest-to-goodness p-zombies exist and are living among us.

The only way it could work is if somehow a copy of you would bring your consciousness from the past into the new copy. Which is some wild metaphysical speculation that’s essentially spiritual. If someone tortures an exact copy of me in the future, it probably won’t be my problem, because that person’s pain won’t affect me any more than deleting a second copy of a file would automatically delete the first. It’s bugshit.
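The file analogy is easy to make literal; a minimal Python sketch (my own illustration, with made-up file names), showing that destroying a copy does nothing to the original:

```python
import shutil
import tempfile
from pathlib import Path

# Throwaway directory so the demo touches nothing real.
workdir = Path(tempfile.mkdtemp())

original = workdir / "me.txt"              # hypothetical "original"
original.write_text("the original's state")

copy = workdir / "me_copy.txt"             # hypothetical "copy"
shutil.copy(original, copy)                # exact duplicate at copy time

copy.unlink()                              # delete (or "torture") the copy

# The original is untouched by anything done to the copy.
assert original.exists()
assert original.read_text() == "the original's state"
```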

I would very much object to a Basilisk torturing a copy of me (or indeed, pretty much anyone) for a trillion years.

What I don’t think is that a Basilisk is remotely likely to occur.

What if it only tortures, say, Ron Weasley? Like fifty of them?
Is the logical conclusion of the cult that Yud will agree to offer up an infinite number of copies of himself to the basilisk to save our digital copies from having to experience torture?
Depends on how *literally* Yud's self-insert meant that Ron is an NPC. If he means that Ron will never amount to much, torturing 50 Rons would be immoral. But if Ron is literally an NPC, he isn't a person, just a construct, so the AGI can go nuts torturing it.

I don’t care about the suffering of mere copies of me and would not support the development of AGI regardless.

If the magical robot god knows everything written on the Internet, it will of course know that I wrote this comment. The basilisk won’t torture copies of me as blackmail, because it would presumably know that it wouldn’t have any effect on me.

In fact, if a Basilisk AI gets created, it wouldn’t need to. The people who feared and revered the then-hypothetical Basilisk have already chipped in to help create it, so it doesn’t need to follow through on its threats to torture simulations. Or even issue threats: Roko and other Rationalists have already invented the concept of the Basilisk and issued threats for it.