“I hate this whole rationality thing. If you actually take the basic assumptions of rationality seriously (as in Bayesian inference, complexity theory, algorithmic views of minds), you end up with an utterly insane universe full of mind-controlling superintelligences and impossible moral luck, and not a nice “let’s build an AI so we can fuck catgirls all day” universe. The worst that can happen is not the extinction of humanity or something that mundane – instead, you might piss off a whole pantheon of jealous gods and have to deal with them forever, or you might notice that this has already happened and you are already being computationally pwned, or that any bad state you can imagine exists. Modal fucking realism.” -muflax
Link: http://web.archive.org/web/20141013085708/http://kruel.co/backup/Ontological%20Therapy.html
I feel like this might be the point that kept driving me away when trying to read LessWrong: the stupid, excessive extrapolation that becomes idiotic at the second step, let alone at the AI-god step.
That guy is running so damn fast toward his nifty sci-fi cosmology that he managed to mishandle almost every concept there ever was. You can tell he has not been inside a physics classroom.
As far as I know, modal realism by way of computational idealism (?) isn’t orthodox Rationalist doctrine, but EY did write some Greg Egan mega-crossover fanfiction based on that idea (it feels like their most original-seeming ideas are always stolen from like Greg Egan or Douglas Hofstadter or whichever writer). You’d think he’d be less worried about existential risk if he believed it? IDK.
I never got really deep into the FDT/UDT (functional/updateless decision theory) lore, even though “bash every philosophical problem with information theory” is probably the only strand of Rationalist thought I find genuinely interesting.
I also find it amusing that when this was written, modal realism seemed like a deeply weird idea that you need a huge brain to take seriously, but now we have TikTok teens with their reality shifting.
Listening to these people, I can almost understand what caused the neo-reactionaries to show up in these spaces. You brainblast yourself on the AI threat all day, so it’s no wonder they want to burn all the processor boards and return to monke.
Well, except for the ones who want to make the Basilisk and have it send all feminists to hell for them.
Big joke, haha, but it says something in itself. The nearest thing to a rationalist moral consideration is self-concerned, i.e. what if the AGI tortures “me”? Wrong locus. What if consciousness for the AGI approximates torture, as it generally seems to?
Or to ground this a bit – existence as given – what if the catgirls don’t want to fuck you?
should you flair this NSFW?
Isn’t this Orion’s Arm shit with 16 posthuman gods?
I don’t normally like to opine on other people’s mental health, no matter how eccentric their behavior. But man, if OP is going to flat-out tell me they were having a psychotic break, all I can say is that I hope they’re in a better place now.
P.S. - remember that you’re not alone! The next time you dwell on the agony of being embedded in the substanceless procession of natural law, go back to the granddaddy of ontological suffering prevention and try Buddhism on for size.