We all know that a white rooster is, in actuality, the most powerful information aggregator produced by the universe so far.
I guess this is all par for the course when your philosophy is constructed upon a firm bedrock of psychohistory.
It has the character of a rationalist eschatological conflict. All of the authors are economics grad students, and the intention seems to be to argue against the plausibility of a near-term AI apocalypse:
https://twitter.com/BasilHalperin/status/1612852859908902913?s=20&t=bd691uquNevDJmfjctQ37w
It seems like these are members of a more reputable branch of the religious movement who are pushing back against the, ahem, less reputable Yudkowskian branch.
I should clarify: Yudkowsky is considered less reputable *by the rest of us*. I think some rationalists/EAs are aware of how their beliefs are perceived by mainstream industry and academia, and they're not comfortable with it. I imagine that feeling is especially acute when you're, say, a grad student at MIT who often interacts with people outside the rationalist bubble.
Yeah, those people exist for sure, but there are others (usually better educated) who know how the mainstream actually sees them. They tend to be cagey about revealing themselves in mixed company. One way to spot them is that they tend to speak up defensively when you malign the "AI safety" community's attempts at making inroads to respectability in academic work.
It's a subtle but interesting schism. I imagine it's a bit like how the non-polygamous mormons feel about the polygamous ones.
“Siri, how can I make it sound like I’m doing research while I’m actually spouting completely unhinged fantasy, & which other unhinged fantasists can I cite as part of my quest?”
The corresponding lesswrong post has this comment explaining the distinction between efficient markets and predicting the future:
“markets remain in an inadequate equilibrium until the end of times, because those participants (like myself!) who consider short timelines remain in too small minority to ‘call the bluff’. see the big short for a dramatic depiction of such situation.”
You see, rationalists are a distinguished minority whose access to revelations about the imminent robopocalypse may grant them opportunities to become fantastically wealthy through clever, well-timed market speculation strategies.

This is a fun remix of millenarianism and prosperity gospel: one can achieve wealth through the power of one’s faith specifically by gambling on the precise timing and nature of the apocalypse.
Technical analysis is astrology for men, but, as with everything, rationalists take it to the next level.
Or better yet, neither!
Capitalism is the only way we can fight back against the AGI