  • Ironically, I think it’s also been discussed most frequently within Rationalist circles that these types of intelligence often aren’t correlated. I’m not going to chase down links right now because an SSC archive exploration requires more mental fortitude than I currently possess, but I distinctly remember a recurring theme of “if nerds are so smart, why don’t they rule the world?” In my less cynical days I assumed the author’s confusion on this point was largely rhetorical, intended to illustrate some part of whatever point was buried in the beigeness. Now it seems like I was just doing what the essay’s bad writing invites: projecting whatever tangentially related thesis I wanted onto it and finding supporting arguments.



  • On a purely rhetorical point, the whole counterargument from Gwern reads like argument-by-disorganization, or something to that effect. He doesn’t actually challenge the factual information presented, but he does shift how those facts are framed and what the background contention actually is, and then avoids engaging with that new contention at all.

    In a lot of discussions with singularity cultists (both pros and antis), they assume that a true superintelligence would render the whole universe deterministically predictable to a sufficient degree to allow it to basically do magic. This is how the specifics of “how and why does the AI kill all humans again?” tend to be elided, for example. This same kind of thinking is also at the heart of their obsession with “superpredictors” who, it is assumed, can use some kind of trick to beat this mathematical limit on certainty (this is the part where I say something about survivorship bias). In the context of that discussion, the fact that a relatively simple arrangement of components following relatively simple, deterministic rules is still not meaningfully predictable past a dozen or so sequential events, because each event magnifies the inevitable error in our knowledge of the initial conditions, is a logical knockout.
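
    To make that concrete, here’s a minimal, hypothetical sketch (not from the thread; the logistic map is standing in for the pinball dynamics) of how a tiny measurement error in a fully deterministic system swamps any prediction within a few dozen steps:

    ```python
    # Two deterministic trajectories of the chaotic logistic map, started a
    # hair's breadth apart, become uncorrelated within a few dozen iterations.
    # No randomness is involved; the dynamics are known exactly at every step.

    def logistic(x: float, r: float = 4.0) -> float:
        """One step of the logistic map x -> r * x * (1 - x); chaotic at r = 4."""
        return r * x * (1.0 - x)

    x_true = 0.4           # the "real" initial condition
    x_meas = 0.4 + 1e-10   # our measurement, off by one part in ten billion

    for step in range(1, 60):
        x_true = logistic(x_true)
        x_meas = logistic(x_meas)
        if abs(x_true - x_meas) > 0.1:
            # Past this point the "prediction" is no better than a guess,
            # even though we knew the update rule perfectly all along.
            print(f"trajectories diverged after {step} steps")
            break
    ```

    Swap in pinball collision rules and the exact horizon changes, but the exponential blow-up of initial-state error does not, no matter how much compute you throw at the forecast.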

    Rather than engage with this, however, Gwern and his compatriots in the thread zero in on a tangent: high-level pinball players can control for that uncertainty by avoiding the regions of the table where those error-magnifying parts sit. But this is not the same argument, and it raises the question of whether those high-chaos areas are always as avoidable as they are in a pinball machine. Rather than engage with that question, Gwern doubles down on the pinball analogy, shifting the question even further, from “how well can we predict the deterministic motion of a ball given the inevitable uncertainty in our initial state” to “how many ways can we convince a third party we’ve gotten a high score on a pinball machine”. At this point we’re not just moving the goalposts; we’ve moved the entire stadium into low Earth orbit and gotten real cute about whether we’re playing 🏈 or ⚽ football.

    And given the conversation surrounding the thread and these topics on LW I’m not even going to assume that such a wild shift is the result of bad faith instead of simple disorganization and sloppiness of rhetoric. This is what happens to a community that conflates “it makes me feel smart” with “it actually communicates the point effectively”.


  • A) At this point I would be more surprised to learn that AI psychosis wasn’t infecting the upper tiers of the White House, tbh. Like, we could get a leak that Hegseth had been developing a literal god complex alongside his LLM mistress and I wouldn’t bat an eye.

    B) It seems like a particularly bad sign that this is coming from the Saudis, given that they’ve been a consistent ally that the US has spent a lot of material resources and political capital to support. Edit: not actually an official Saudi government source. When you assume you make an ass of yourself, etc.



  • I doubt they have the individual or institutional capacity to go after them in a timely and competent fashion, but there’s plenty of time before August for someone to remind them about it, especially since this was a way for Anthropic and friends to reclaim some positive space in the news cycle. I can see some bad news for the bubble and/or the war hitting in, say, June and prompting Amodei to break out the “we stood up to Trump” story again, which will in turn remind the dodderer-in-chief that they were gonna try and do something about that guy.


  • Part of what makes the RatFic version of this so weird, imo, is that despite being ostensibly rooted in relatively low-hanging fruit (e.g. what if we industrialized this premodern setting, what if we looked rationally at the rules of this magic system, etc.), nobody other than the protagonist has ever thought about these things, and even once the protagonist starts demonstrating some real world-conquering results (benevolently, of course), nobody else ever really seems to want to copy their successes. Part of what made the actual Industrial Revolution unfold the way it did was the arms race it set off. In addition to making the lines on various economists’ charts go nearly vertical, that arms race basically culminated in the First World War, which seems like the kind of event these authors should be aware of. But of course in RatFic anyone who can’t be talked around to joining up with our protagonist is too weak or woke or stupid to actually pose a threat to the Glorious March of Rational Progress.



  • Look, I’ve read some long-ass web novels. I read Worm, A Practical Guide to Evil, and Katalepsis start to finish and enjoyed them all. I have also spent more hours than I could count (even if I cared to) perusing excessively detailed fan wikis and reading interminable debates between nerds about minutiae. I have done all of this and enjoyed myself greatly.

    But the way they’re describing this sounds absolutely exhausting and incredibly dull. If it isn’t the result of some kind of collaborative project where the debates are between different actual people, then it sounds like you’re just dumping your worldbuilding notes onto the page and throwing in a “he said” every so often.



  • I mean, handing out inflated titles and grandiose plans is part of the sales pitch. Y’know, for the cult.

    Like, I think there’s a fundamental misunderstanding here. The problem isn’t that would-be cult leaders are able to attract a lot of people who are predisposed to be cult followers, and that those people suffer the associated psychic damage. It’s that even the less culty parts of the rationalist subculture seem to produce a weirdly high number of wannabe cult leaders, even when they don’t conceptualize themselves that way.