The New Yorker has a piece on the Bay Area AI doomer and e/acc scenes.

Excerpts:

[Katja] Grace used to work for Eliezer Yudkowsky, a bearded guy with a fedora, a petulant demeanor, and a p(doom) of ninety-nine per cent. Raised in Chicago as an Orthodox Jew, he dropped out of school after eighth grade, taught himself calculus and atheism, started blogging, and, in the early two-thousands, made his way to the Bay Area. His best-known works include “Harry Potter and the Methods of Rationality,” a piece of fan fiction running to more than six hundred thousand words, and “The Sequences,” a gargantuan series of essays about how to sharpen one’s thinking.

[…]

A guest brought up Scott Alexander, one of the scene’s microcelebrities, who is often invoked mononymically. “I assume you read Scott’s post yesterday?” the guest asked [Katja] Grace, referring to an essay about “major AI safety advances,” among other things. “He was truly in top form.”

Grace looked sheepish. “Scott and I are dating,” she said—intermittently, nonexclusively—“but that doesn’t mean I always remember to read his stuff.”

[…]

“The same people cycle between selling AGI utopia and doom,” Timnit Gebru, a former Google computer scientist and now a critic of the industry, told me. “They are all endowed and funded by the tech billionaires who build all the systems we’re supposed to be worried about making us extinct.”

  • @Architeuthis

    This was such a chore to read; it’s basically quirk-washing TREACLES. This is like a major publication deciding to take an uncritical look at Scientology, focusing on the positive vibes and the camaraderie, while smack in the middle of Operation Snow White, which in fact I bet happened a lot at the time.

    The doomer scene may or may not be a delusional bubble—we’ll find out in a few years

    Fuck off.

    The doomers are aware that some of their beliefs sound weird, but mere weirdness, to a rationalist, is neither here nor there. MacAskill, the Oxford philosopher, encourages his followers to be “moral weirdos,” people who may be spurned by their contemporaries but vindicated by future historians. Many of the A.I. doomers I met described themselves, neutrally or positively, as “weirdos,” “nerds,” or “weird nerds.” Some of them, true to form, have tried to reduce their own weirdness to an equation. “You have a set amount of ‘weirdness points,’ ” a canonical post advises. “Spend them wisely.”

    The weirdness is eugenics and the repugnant conclusion, and abusing Bayes’ rule to sidestep context and take epistemological shortcuts to cuckoo conclusions while fortifying a bubble of accepted truths that are strangely amenable to allowing rich people to do whatever the hell they want.

    Writing a 7,000-to-8,000-word insider exposé on TREACLES without mentioning eugenics even once should be all but impossible, yet here we are.

    • @swlabr

      quirk-washing TREACLES

      I can’t wait to be quirk-washed, I’m ready to hang up my pick-me hat and let the new yorker do the work for me

        • @TinyTimmyTokyoOP

          I’m probably not saying anything you didn’t already know, but Vox’s “Future Perfect” section, of which this article is a part, was explicitly founded as a booster for effective altruism. They’ve also memory-holed the fact that it was funded in large part by FTX. Anything by one of its regular writers (particularly Dylan Matthews or Kelsey Piper) should be mentally filed into the rationalist propaganda folder. I mean, this article throws in an off-hand remark by Scott Alexander as if it’s just taken for granted that he’s some kind of visionary genius.

          • @froztbyte

            yep aware. didn’t care too much about the article itself, was more observing the coincidence in timing. but you have a point there with the names, I really should make that a standing mental ban

        • @swlabr

          Had to stop reading that. My eyes were rolling too much.

          • @froztbyte

            uwu smol-bean number starers, lovable little group of misfits from checks notes fucking RAND

        • @gerikson

          What happened to Samotsvety last year? I missed that.

          • @froztbyte

            I meant more the general state of the things in the TREACLES umbrella catching unfavourable public attention over the last while

      • @sc_griffith

        you gotta be white cis and loathsome or they won’t do it

    • @Amoeba_Girl

      God I always forget about the repugnant conclusion. It’s baffling that it’s being taken as anything but a fatal indictment of utilitarianism.

    • @Architeuthis

      Yeah, a lot of these TESCREAL exposés seem to lean on the perceived quirkiness while completely failing to convey how deeply unserious their purported scientific and philosophical footing is, like virgin tzatziki with impossible gyros unserious.

  • @gerikson

    Wat

    [Grace’s] grandfather, a British scientist at GlaxoSmithKline, found that poppy seeds yielded less opium when they grew in the English rain, so he set up an industrial poppy farm in sunny Australia and brought his family there.

    To grow opium???

    (OK I guess for medicinal purposes but maybe point that out)

    • @blakestaceyMA

      We should have known the English rain was trouble when it started giving people tans

    • @Architeuthis

      I wonder how much of that family fortune has found its way into EA coffers by now.

      • @gerikson

        In another part of the article, it states that Grace grew up “semi-feral”, so perhaps the fortune was smoked away in the Tasmanian opium dens (those exist, right?)

        • @Architeuthis

          In yet another part of the article:

          She had found herself in both an intellectual community and a demimonde, with a running list of inside jokes and in-group norms. Some people gave away their savings, assuming that, within a few years, money would be useless or everyone on Earth would be dead.

          More totally normal things in our definitely not a cult community.

  • @Amoeba_Girl

    Oh, good, ex-incel Scott is in a polycule now, the wonders of the cult lifestyle.

    • @Architeuthis

      Wasn’t he supposed to be a romantic asexual at some point?

      • @saucerwizard

        After all I’ve heard, I believe that was a bald-faced lie.

        • @Architeuthis

          Maybe he’s the guy who goes to the orgy just to hold hands.

          • @saucerwizard

            That he was hooking up with dudes at rationalist meetups.

  • @saucerwizard

    I have a bad feeling these people are going to waltz into even more power.

    • @gerikson

      I dunno. At least in the US, these people are decidedly outside the mainstream. Their views on religion and sexual mores preclude any popular appeal, and they would be similarly handicapped were they to try to infiltrate existing power structures.

      Basically their only hope is that an AI under their control takes over the world.

      • @Architeuthis

        Basically their only hope is that an AI under their control takes over the world.

        They are pretty dominant in the LLM space and are already having their people fast tracked into positions of influence, while sinking tons of cash into normalizing their views and enforcing their terminology.

        Even though they aren’t trying to pander to religious Americans explicitly, their worldview (millennialism with the serial numbers filed off) will probably feel familiar and cozy to them.

      • @YouKnowWhoTheFuckIAM

        Come on, you’re talking about America, when did mainstream popular appeal ever limit anyone with money?

        • @gerikson

          You still need to lever that money by “buying” the people in power.

          Right now there’s really no mainstream politicians 100% on board with the weirdness of TESCREALs:

          • mainstream Democrats - too wary of corporations, too eager to regulate
          • pre-Trump GOP - maybe, but they’re losing influence fast
          • current Trump GOP - literally crazy, way too easy for TESCREALs to be painted as a satanic cult
          • @lobotomy42

            Maybe. The current EA strategy is to take over all the technocratic positions in government/business one level down from the ostensible policy-makers. The idea being that if they are the only ones qualified to actually write the reports on “alignment” for DoD/NIST/etc., then ultimately they get to squeeze in some final control over the language, regardless of what Joe Senator wants. Similarly, by monopolizing and brainwashing all the think tank positions, even the Joe Senators out there end up leaning on them to write the bills and executive orders.

          • @YouKnowWhoTheFuckIAM

            I’ve finally got around to replying to this but it’s been burning a hole in my subconscious

            I think that’s a naive interpretation of the interests in play here.

            Altman aptly demonstrated that a yes/no on regulations isn’t the money’s goal here, the goal is to control how things get regulated. But at the same time Democrats are hardly “eager to regulate” simpliciter, and the TESCREALs/Silicon Valley can hardly be said to have felt the hammer come down in the past. It may be part of some players’ rhetoric (e.g. Peter Thiel) that the Republicans (both pre- and post-Trump) are their real friends insofar as the Republicans are eager to just throw out corporate regulations entirely, but that’s a different issue: it’s no longer one of whether you can buy influence, it’s a matter of who you choose to buy influence with in the government, or better yet which government you try to put in power.

            It should be noted at this point that mentioning Thiel is hardly out of court, even if he’s not in the LessWrong stream: he shares goals and spaces with big elements of the general TESCREAL stream. He’s put money into Moldbug’s neo-reaction, which is ultimately what puts Nick Land sufficiently on the radar to find his way into Marc Andreessen’s ludicrous manifesto.

            And why should the TESCREALs fear being painted as a satanic cult in the first place? Has that been a problem for anybody but queer people and schoolteachers up to this point? It seems unlikely to me that anyone involved in Open AI or Anthropic is going to just stop spending their absolute oceans of capital for fear that LibsOfTikTok is going to throw the spotlight on them. And why would Raichik do that in the first place? The witch hunters aren’t looking for actual witches, they’re looking for political targets, and I don’t see what’s in it for them in going after some of the wealthiest people on the West Coast except in the most abstract “West Coast elites” fashion, which as we all know is just another way of targeting liberals and queers.

  • @BigMuffin69

    he dropped out of school after eighth grade, taught himself calculus

    Lmaou, gonna need a citation on this one, chief. This the same guy who said we need people monitoring for ‘sudden drops’ in the loss function? I’m supposed to believe this poser understands what a derivative is now?

  • @lobotomy42

    “Socialists think we’re sociopathic Randroid money-obsessed Silicon Valley hypercapitalists.”

    No, Scott, we just think you’re a coward and a racist

  • @lobotomy42

    Life is weird when you’re living in a racist polycule