Can’t you interpret any sufficiently complex system as being a
simulation of any other simpler system, just by arbitrarily mapping the
states of one to the other? I mean, it’s a purely human interpretation
to say the currents going through a few billion transistors actually
represent a simulated world that they don’t physically resemble in any
way. It’s kind of required if you want to allow the whole mind uploading
thing, but I still think it’s a really weird idea.
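To make the trick concrete, here's a minimal sketch (pure illustration; the counter and the "target computation" are made up): any process that steps through distinct states can be declared, via an arbitrary lookup table chosen after the fact, to be "simulating" some other computation.

```python
# Sketch of the arbitrary-mapping argument (illustrative only).
# Any system that reliably steps through distinct states can be
# "interpreted" as running any other computation of the same length,
# just by choosing the right lookup table after the fact.

# A trivial physical process: a counter stepping 0, 1, 2, 3.
physical_states = [0, 1, 2, 3]

# The target computation we want it to "simulate":
# successive states of a machine adding 1 + 2.
target_states = ["load 1", "load 2", "add", "result = 3"]

# The "interpretation" is just an arbitrary bijection between states.
interpretation = dict(zip(physical_states, target_states))

for s in physical_states:
    print(f"physical state {s} -> interpreted as: {interpretation[s]}")

# Nothing about the counter physically resembles addition; the whole
# "simulation" lives in the mapping we chose. The usual philosophical
# reply is that a real simulation must also get the counterfactual
# state transitions right, not just the one run that actually happened.
```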
Edit: Actually, I remember Yudkowsky wrote a mega-crossover
fanfiction based on this exact argument. Something about universes being
Turing machines that are all mutually simulating each other, and you can
travel into a work of fiction by uploading yourself into a simulation of
it and pulling the plug because the simulation just gets continued
somewhere else.
It’s pretty weird how LessWrong manages to take hard-nosed scientific
reductionist concepts and ends up arguing for crazy ideas like AI gods
and universe simulations and modal realism.
What I always find ridiculous about this argument is that we don't yet know whether the universe or human consciousness is even a computable function.
If you go read Turing's original proofs or Gödel's incompleteness theorems, you learn that there are well-defined functions that can't be computed on any Turing machine. The Halting Problem is the classic example.
I've yet to see a LWer address this. What if human consciousness cannot be computed? We don't know the answer, but it's entirely possible that you simply can't simulate human consciousness, or the universe, on a Turing machine.
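For anyone who hasn't seen it, here's a minimal sketch of Turing's diagonalization. The `halts()` oracle is hypothetical; the whole point of the argument is that it cannot exist:

```python
# Sketch of the halting-problem argument (the oracle is hypothetical).
# Suppose, for contradiction, that halts(f, x) could decide whether
# calling f(x) eventually terminates.

def halts(f, x):
    """Hypothetical oracle: returns True iff f(x) halts. Cannot exist."""
    raise NotImplementedError("no Turing machine can implement this")

def paradox(f):
    # Loop forever exactly when the oracle says f(f) halts.
    if halts(f, f):
        while True:
            pass
    return  # halt exactly when the oracle says f(f) loops

# Now ask: does paradox(paradox) halt?
# - If halts(paradox, paradox) is True, paradox(paradox) loops forever.
# - If it's False, paradox(paradox) halts immediately.
# Either answer contradicts the oracle, so halts() is uncomputable.
```

Of course, this only shows that some precisely defined functions are uncomputable; whether consciousness is among them is exactly the open question.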