Me and some other people were discussing things like global warming and what to do about it (world communism) when this person points out that hey, even catastrophic and irreversible climate change won’t actually kill the species. Which is probably true, but little consolation for the soon-to-be billions of climate refugees.
After this, the same person also claimed that actually global thermonuclear war probably wouldn't wipe out the species. While this is also maybe, hopefully, true, I wouldn't take the worst-case scenario that most people can imagine so lightly. So here is where I start to become suspicious. They then go on to say that they're concerned about long-term threats to the species, and I'm hoping they'll bring up the need to look out for killer asteroids or something. But no, it's actually Evil AI that is the biggest threat to the species.
What follows is a discussion where I point out that 1) we don’t know what intelligence is, 2) we don’t know if reality is computable, 3) machines do not create value, 4) all “AI” systems so far do little more than regression, 5) autonomous systems that can misbehave already exist, and 6) an evil self-replicating AI/egregore already exists and it is called capital.
What I found really fascinating is that this person’s view of “AI” is seemingly lifted from science fiction. When I point out that I know people in the field, and that all of them think these worries are silly, they counter by saying that the AI safety “field” disagrees. When I ask how any of the present “AI” systems are supposed to reproduce themselves, they don’t consider this much of a problem, despite the fact that all present AI systems need human minders. How exactly an unknown function approximator is going to take over the world is apparently not interesting to the people in this “field”.
Oh and somewhere in the middle of this my suspicions were confirmed by them saying they are indeed a member of lesswrong. Which certainly explains why the very concrete fears of climate disasters or thermonuclear war can be dismissed in favor of worrying about computers throwing numbers around.
If nuclear war and climate change aren’t potentially existential threats, what other means would AI have to drive humanity into extinction?
Those seem like obvious choices.
“Ha, but the human race didn’t die out!” I tell my 4 surviving kids (the other 10 died in childbirth) while we subsistence farm at the North Pole.
And if anybody mentions the AI safety field (which also consists of two groups iirc, the one who goes ‘oh god this will just reinforce so much racism’ and the other who goes ‘PAPERCLIPS YOU DO NOT THINK IN ENOUGH DEPTH ABOUT FUTURE PAPERCLIPS!’, but clearly here it was just the latter), they either are an LW person or read the popular culture spinoffs from it. Next time, just say ‘well, we can just put the AI in a box’ and see their eyes light up with smugness because you fell into the AI-in-a-box experiment trap.
The annoying thing is that the threat of AI is a lot like the threat of climate change: rich people ignoring externalities in order to plunder and pass off the costs to others.
My favorite LWer-in-the-wild story is the time I ran into someone who chastised me for not taking Roko’s basilisk seriously since, and I quote, “the smartest people in the world” think it’s the biggest issue facing humanity now and in the future.
Rationalism is when I get my priors from an Isaac Asimov book.
Since the AI safety argument immediately pivots to “fund my friend’s AI-risk alignment nonprofit”, I was always weirded out by it.
In the telling of it, or at least the way they used to tell it, “AI x-risk is not being worked on” was the justification for the “E” in the EA part: your dollar is highly effective because no one is working on it!
Well shit, we’ve spent this money on the AI alignment problem and what do we have? Oh wait you need more time? Okay….
This might just be me personally, but I come here to sneer at EY, assorted Scotts, and other dumbass thought leaders. Sneering at some rando you met seems less fun to me.
I met a LWer once, and this person was just strange in every way. A helluva writer and pleasant enough to be around but their sense of reality and concerns were just wildly off. They also made me hide all the toothpaste in the house because they were so sensitive to mint that they could, apparently, smell it in the tube.
Then I accidentally found out that another one of my then-friends was LW-adjacent (I think she had a friend or partner who was into it) and got to witness the power of the cult in full force. Basically I made some joke about Roko’s Basilisk and this chick absolutely lost it on me, refused to believe they were a cult or fashy or anything but lovely and harmless. It was my first insight into this person’s personality disorder and an incredibly jarring experience.
as you do comrade
Is false.
We’re on our way to destruction of the biosphere. Even if we’re not, and we survive in some small way, we have used up all the easily accessible (economically viable) resources we used to build our society.
There is no Earth 2. There is no Human Civilization 2.0 on Earth 1 either.
Stopped there.
World communism? Haha fucking gross dude. Please leave me out of that shit.
Why wouldn’t it be?
What?