This is definitely not safe for work, and may be heavy for many people: the examples come from this week’s genocidal attack on Ukraine’s civilian infrastructure. Timothy Snyder on Twitter shared ten short guidelines for writing about the catastrophe. #6 made me think about this Club (emphasis mine):
When a story begins with bothsidesing, readers are instructed that an object in the physical world (like a dam) is just an element of narrative. They are guided into the wrong genre (literature) right at the moment when analysis is needed. This does their minds a disservice.
This short explanation is beautiful to me. It gave me more clarity than I’ve ever had on bothsidesism. Like:
Stories can complement analysis in helpful and cute ways: “Don’t anthropomorphize LLMs, they hate that.” To err and mix up stories with analysis is human. To keep treating physical/historical/computing objects as narrative objects, repeatedly and systematically, while informing others? Sneer-worthy!
UPDATE: there’s a sneer-worthy example of bothsidesism in a comment. I took a screenshot; when those fantastic narratives flip-flop or disappear, that’s like a +10 buff to sneer-worthiness. Oceania had always been at war with Eastasia.
My interpretation of what you’ve said is that rationalists, for whatever reason, are more concerned about crafting a story of a war between the AI and humanity, rather than performing any real analysis.
I risk being too generous here, but perhaps all this talk of AGI overtaking human intelligence is an expression of angst over how humanity has achieved much technological progress, but has yet to nail down fundamental, philosophical aspects of the human condition itself. I think it’s conceivable that someone mostly accustomed to thinking in terms of STEM, who hasn’t developed the tools to effectively navel-gaze, would feel especially frustrated by this, and outwardly dismiss the multitude of schools of thought that try to address it.
One of the ideas that pervades AI doomerism is that AGI will automagically derive rules about the universe faster than humans. This is an expression of frustration that the future isn’t now, and we can see them bargaining with this frustration via the Pinkerish idea that now is the best time to live in human history: it’s the best it’s ever been, so don’t feel bad about how it might be better in the future.
Maybe the rationalist desire to live forever in simulation is at heart a desire to have the time to explore human existence, rather than to indulge in cyber-soma in a digital Xanadu. Of course, I think that would be hubris: death is so central to our existence that without it, we wouldn’t be human.
ahem I mean, how bout we stop anthropomorphising these idiot rationalists, amirite?
Tldr: why it’s wrong to be objective when reporting on a war that America has taken sides in.
I mean, literally, that Twitter thread says reporters are behaving unethically if they don’t explicitly state that Ukrainians tell the truth and Russians lie. If that’s not valorizing yellow journalism, I don’t know what is.