He literally admits to that in a follow-up tweet without any sense of reflection.
https://twitter.com/ESYudkowsky/status/1584719334873894913?t=hPLXo-rwMXd7jk-ZFhJhnw&s=19
No you see, there is logic to this madness. That AI [only affects *some* black people, whereas he cares only about things that affect *all people*.](https://twitter.com/ESYudkowsky/status/1584801579042865153?t=d3W3h3e2pO6-aThh78R0WA&s=19)
He took "all lives matter" to its logical extreme in a way that only he could.
from further down the thread:
"Now, it's true that solving problems that only kill \*some\* people, can sometimes help in solving much larger problems that will kill \*all\* people. But most solutions to little problems don't actually help solve big ones. You have to pick them with a severe, careful eye."
the entire history of science research has some exciting news for him about how people end up solving big problems
And let's remember, the issue that affects all people that he's working on here: the problem of getting an illustration for the fantasy novel he's writing without having to pay an actual artist.
The truly enlightened do not have posteriors (probabilistically), as that is terminology from frequentist drivel and also the 666th sign of the Basilisk.
Sounds like the sort of AI problem that would only kill some, not all, black people? I prioritize AI issues that I expect to kill every black man, woman, and child on the planet. Anyone who doesn’t so prioritize, of course, doesn’t actually give a shit about black lives.
Just wait till he gets something into his eye, he will make it into some galaxy sized problem. [Oh no](https://www.lesswrong.com/posts/3wYTFWY3LKQCnAptN/torture-vs-dust-specks)
I mean, since he's so brilliant and rational he's clearly well-equipped to solve any problem, which means bringing problems to his attention is the most effective way to find solutions, and since some problems like stubbing your toe or having your computer crash before you can hit save could conceivably one day lead to total extinction under not-impossible circumstances, it would be irresponsible of us *not* to devote a disproportionate amount of society's time and energy to personally inconveniencing him as much as possible.
“Now that this problem affects me personally….it is now a real problem.”
How smooth-brained do you have to be to think that calling opinions "priors" magically makes yours more valid?
AI disproportionately imprisons people of color: I sleep
AI doesn’t want to draw my fan art: real shit?!?!?!
proceeds to pull priors out of his posterior
Well, now we know Mr. Eliezer Yudkowsky needs to personally experience a problem for it to be real. Let’s begin a task force at once.
A perfect example of the many things that are wrong with Rationalism.
I want so much to reply with a sneer but I don’t want my alt to get blocked too