“I have some very basic strong philosophical differences with him on
very fundamental aesthetic, almost primal levels. I look at his work and
am viscerally repulsed. I read his ideal society and see what sounds to
me like a description of Hell.”
I read Moldbug here and there many years ago, and have occasionally
attempted to re-engage with him since he emerged from his cave, and this
is exactly the reaction I always had, and still do, to his thought.
Thanks for posting this.
Long story short: last year the editorial staff of Current Affairs was set to reorganize the magazine into a worker-owned co-op model. As they were about to implement this, Robinson got cold feet because he would no longer have editorial control over the magazine, and he effectively fired the editorial staff before they could effect the change. He said in a personal statement later that he was basically okay with profit sharing (to the extent that would ever be a concern for a niche political publication such as this) but could not abide the thought that he would not have the final say in who was hired and what was published. That is to say, he could not abide the idea that a magazine explicitly devoted to libertarian socialist politics would be organized democratically instead of with him as the unquestioned boss:
>“…I think I should be on top of the org chart, with everybody else selected by me and reporting to me. I let Current Affairs build up into a sort of egalitarian community of friends while knowing in my heart that I still thought of it as *my* project over which I should have control.”
[Here’s the original letter from the editorial staff after Robinson fired them](https://mobile.twitter.com/lyta_gold/status/1428011761635143681/photo/1), and here’s an [article from Vice covering this at the time](https://www.vice.com/en/article/n7bmd7/socialist-publication-current-affairs-fires-staff-for-doing-socialism). Robinson, for his part, [denies that he was retaliating against his employees for organizing for better working conditions](https://web.archive.org/web/20210820153713/https://twitter.com/nathanjrobinson/status/1428722480567435272) - but then, that’s what bosses always say, isn’t it?
Funny to bring up Robinson in this context, given Sandifer is a Doctor Who fan. His [Cuban dandy outfit](https://mpd-biblio-authors.imgix.net/200068638.jpg?fit=crop&crop=faces&w=290&h=290) looks a hell of a lot like something the Seventh Doctor would wear.
e: or the Fifth, now that I think about it.
Oh, God. Okay. So, Curtis Yarvin came to present prominence—got his
initial readership before he spun off to his own blog—on a website
called Overcoming Bias, a website loosely organized around a community
that called themselves “the rationalists.” The main figure in that is a
guy named Eliezer Yudkowsky, who would describe himself as an AI
researcher. It’s important to note that he has literally no computer
science qualifications; cannot—to the best of my knowledge—code; has
never built an AI; and does not actually understand anything about how
AI works on a technical level. But he is an AI researcher, which really
means he writes science fiction. He writes science fiction novels that
he passes off as philosophy and scholarship. He is horribly obsessed
with the idea that someday an artificial intelligence is going to wake
up, achieve sentience, take over the world, and destroy humanity because
it sees no point in humanity. He writes great science fiction phrases.
He’s got a phrase: “The AI does not love you. The AI does not hate you.
But you are made out of atoms which the AI can use for something else.”
That’s charming and chilling, and throw that into a science fiction
horror book about an evil AI and you’re going to get a Hugo nomination
for that stuff. As an analysis of computer science and the state of play
of current technology, it has nothing to do with anything that is
actually happening in AI research, nanotechnology, or anything else.
It’s purely science fiction. But it’s pretty good science fiction. And
so a lot of tech bro people are really, really into him because he makes
them feel good. He says that they’re all super logical, rational people,
and they can learn to make no mistakes if they just use his one weird
trick for thinking rationally. He’s just had a lot of influence despite
being frankly a kind of weirdo cult leader. But the Basilisk. What you
actually asked about. The Basilisk comes from an incident that arose in
Yudkowsky’s community where this guy named Roko, who went on to be a
fascist, came up with a thought experiment imagining a futuristic,
godlike AI. As I said, they’re terrified of an evil AI. They also want
to create a god AI that will reincarnate them on a hard drive so they
can live forever. And so this guy Roko imagined the god AI and said:
Wait a minute, what if when the god AI exists, he looks back at everyone
who failed to help bring him about and declares they’re evil, and should
be reincarnated on a computer and tortured for all eternity? He made
this argument that was entirely consistent with the many weird cult-like
premises of Yudkowsky and his rationalists and created this idea of this
godlike AI that would torture them all if they didn’t give all their
money to AI research to try to bring him about—which, if you look at it
from a perspective of not being a weirdo AI cult member, is basically
just reinventing Pascal’s Wager.
ROBINSON
Pascal’s wager being that it pays to believe in God because if you
don’t, God will punish you—if he exists.
SANDIFER
Yes, good explanation. And so all of these AI cultists, broadly
speaking, absolutely lost their shit. They had an epic meltdown-panic
attack. Yudkowsky was, at one point, screaming in all caps about how the
worst thing you can possibly do is talk about the evil godlike AI in the
future that does this, because talking about it brings it into
existence. Everyone is having a complete emotional meltdown over having
accidentally invented Pascal’s Wager. And the whole incident eventually
becomes a bit of popular lore that people who are the right kind of nerd
know about. Jokes about Roko’s Basilisk, which is what this whole affair
became known as, were actually what got Elon Musk and Grimes together.
They both made the same pun about Roko’s Basilisk independently and
found each other through it.
ROBINSON
Wow. I never knew that.
SANDIFER
My friend, David Gerard, who was the initial reader and editor of
Neoreaction a Basilisk, was the one who preserved all the transcripts of
the meltdown and put them on RationalWiki. That’s why anyone knows about
this. So he is ultimately single-handedly responsible for Elon Musk
taking over Twitter just by popularizing Roko’s Basilisk. It’s horrible.
He feels terrible about it.
ROBINSON
I fear that some of our listeners, hearing your explanation, may have
thought to themselves at some point during…
SANDIFER
What the fuck is going on here?
ROBINSON
“I don’t understand this. It’s bizarre.”
SANDIFER
I should have prefaced this with: What I am about to say is going to
sound completely insane, and that’s because it is.
ROBINSON
I’m glad you explained it because I think that it’s important to
understand that even if you don’t grasp this whole thing about a godlike
artificial intelligence in the future and whatever…
SANDIFER
And you should feel better about yourself if you don’t. If it did
make any sense, you should really be worried.
ROBINSON
First, the people who believe in this very bizarre thing consider
themselves to be extremely logical—more logical than anyone else,
right?
SANDIFER
Yes. Functionally, they believe themselves to be, if not infallible
on an individual level, at least infallible on a collective level.
ROBINSON
Secondly, this rationalist community that you’re talking about that
drifts into extremely bizarre and sometimes fascist beliefs is quite
influential in Silicon Valley.
SANDIFER
Hugely so. If you talk not just to management, but even many of the
frontline software engineer/coder nerds, they all know who Eliezer
Yudkowsky is. This is absolutely a household name within the specific
bubble and enclave of Silicon Valley tech.
ROBINSON
And there’s an entire intellectual ecosystem here. You’ve written
about the Slate Star Codex blog.
SANDIFER
Ah, yes, Mr. Siskind.
ROBINSON
He’s this rationalist who’s very opposed to social justice politics
and is, perhaps, a little too open-minded about Charles Murray and…
SANDIFER
He’s a gateway to outright fascist ideas. He has openly said that he
is a race eugenicist who believes that IQ is heritable. He definitely
believes this to be true. He has said as much. He plays a little coy in
public, but in his personal beliefs, he is a racist authoritarian. I
absolutely believe this.
ROBINSON
And he is extremely popular among some people. He has a big following
among a lot of these Silicon Valley types.
SANDIFER
Absolutely. His blog was widely considered essential reading among
the Silicon Valley types. And then you go to the subreddit for his blog,
and people are literally posting the 14 words, which are a huge white
nationalist slogan and just not even a dog whistle, just a whistle.
>Curtis Yarvin came to present prominence—got his initial readership before he spun off to his own blog—on a website called Overcoming Bias
The first post on Yarvin's blog "Unqualified Reservations" dates from [April 2007](https://web.archive.org/web/20070715151131/http://unqualified-reservations.blogspot.com/2007/04/formalist-manifesto-originally-posted.html), and is itself based on a guest post he made for 2blowhards.com, a 2000s-vintage vaguely conservative blog where he was [already known to the readership](https://www.2blowhards.com/archives/2007/04/_trial_version.html). Writing simply as "Moldbug", he was [also known](https://www.interfluidity.com/v2/93.html) to readers of the finance blog of Brad Setser (who later worked in Obama's Treasury department). He was also already a regular at Razib Khan's "Gene Expression" blog [in June 2007](https://www.gnxp.com/blog/2007/06/against-ultracalvinists.php). The earliest reference I can find at "Overcoming Bias" is in the first open thread, [dating from July 2007](https://web.archive.org/web/20071012020006/http://www.overcomingbias.com/2007/07/open-thread.html).
My point is just that he was known at a variety of places in the mid-00s blogosphere (see the blogroll accompanying that April 2007 post), and his readership always reflected that.
This seems like a good correction, and it doesn't improve the situation one bit. 'Moldbug is a wonderful writer'.
E: and the various comments show just how much, in 2007, everyone was still going on about 'everything I dislike [the sjws] is christianity' or 'christianity is good actually [because atheism is irrational]'. I can understand why Yud thought 'wow, I really need to teach these people how to think'.
What strikes me is that someone who really loves order
should hate monarchy and dictatorships – if you look into the history of
e.g. Nazi Germany or absolute monarchies it’s obvious how chaotic
government is when it all hangs on the whims of one person (especially
when they’re a genocidal lunatic). And that’s ignoring the obvious
issues of disputed successions, incompetent failsons, and decadence of
the absolutely powerful.
Liberal democracy is flawed in many ways, but it’s basically the most
orderly system that’s ever been created.
Edit: but this really reinforces the point that these people are
stupid and have bad ideas.
Yeah, these guys explicitly claim that monarchies have fewer and simpler laws, and that this makes everyone freer and more orderly. You need to know literally nothing about all of history, starting with the Roman Empire and finishing with the European monarchies, to believe that.
An important point about reactionaries is that they want to go back to an imagined past; that this past never worked like they imagine doesn't matter that much. (In fact, I think that is what makes them reactionary, not the more common 'reacts to things' usage of the term. (There is also the 'just accuse my political enemies of being reactionaries' usage of the term, but that is more a Stalinist thing (E: which does make me wonder how Tankies would react if anarchists (who were called reactionaries by Stalin/Lenin and, you know, *vanished*) started calling them reactionaries).))
And yes, liberal democracy works super well in the edge cases where there is a 'succession'. In dictatorships and monarchies these relatively rare events (as a ruler needs to die or be deposed) almost always lead to a big crisis. But liberal democracy just trolls these government forms by going 'lol, we're gonna do that every 4 years, and for several layers of the government'.
> (E: which does make me wonder how Tankies would react if anarchists (who were called reactionaries by Stalin/Lenin and you know vanished) start calling them reactionaries)
Anarchists call tankies "red fash" regularly. Or did you mean using the exact word "reactionaries"?
'Reactionaries' exactly, esp for the whole historical weight the word carries re anarchists. (Of course, this kinda assumes the tankies actually know a little bit of history which yeah... tankies).
Loved the interview, though I think Robinson is a chump and a hypocrite, which is a sour note in an otherwise great discussion.
confession: I feel 0% terrible about this, and near 100% that it’s funny as shit
Great interview.
Solid interview, thanks for posting it! I will never get tired of Sandifer’s style. She’s so delightfully witty and biting.
“Yudkowsky was, at one point, screaming in all caps about how the worst thing you can possibly do is talk about the evil godlike AI in the future that does this, because talking about it brings it into existence.”
Total ripoff of the Stay-Puft Marshmallow Man.
She’s a gift and a treasure. I will never get enough of reading about this bizarrely influential batshittery.
She of Doctor Who fame? Nice, looks like she’s moving up.