  • @titotal · 46 points · 10 months ago

    As a physicist, this quote got me so mad I wrote an excessively detailed debunking a while back. It’s staggeringly wrong.

    • @bitofhope · 18 points · 10 months ago

      Kudos for the effortpost. My 5-second simpleton objection went something like

      YEA BECAUSE WEBCAMS COME WITH DENSITY SENSORS INCLUDED RIGHT?

      • David Gerard (OP, MA) · 15 points · edited · 10 months ago

        A Bayesian superintelligence, hooked up to a webcam, would generate the world’s most beautiful camgirl, like a photoshop that’s been photoshopped, and take over OnlyFans to raise money to wedgie rationalists, just don’t look too closely at the fingers or teeth

        • @BrickedKeyboard · 5 points · edited · 10 months ago

          I’m trying to find the twitter post where someone deepfakes eliezer’s voice into saying “full speed ahead on AI development, we need embodied catgirls pronto.”

          • David Gerard (OP, MA) · 3 points · 10 months ago

            now that’s a positive contribution to the space

    • David Gerard (OP, MA) · 10 points · 10 months ago

      that was shockingly polite

    • @TerribleMachines · 9 points · 10 months ago

      Love this!

      Alas, if Yud took an actual physics class, he wouldn’t be able to use it as the poorly defined magic system for his OC doughnut-steal IRL bayesian superintelligence fanfic.

    • @niktemadur@lemmy.world · 6 points · 10 months ago

      It’s because of stuff like this and people like you on the Fediverse that leaving Reddit for the final time on that night of June 11… two and a half months already… hasn’t hurt as much as I feared back then.

    • @earthquake@lemm.ee · 6 points · 10 months ago

      “Eliezer has sometimes made statements that are much stronger than necessary for his larger point, and those statements turn out to be false upon close examination” is something I already generically believe, e.g. see here.

      I get the impression that this guy (whose job at an AGI thinkpiece institute founded by a cryptobillionaire depends on believing this) would say this about ALL of EY’s statements, leaving his larger point floating in the air, “supported” by whatever EY statements you aren’t currently looking at.

    • @self (MA) · 5 points · 10 months ago

      this is fantastic! if you’ve ever got another one of these in you, feel free to tag it NSFW and post it here or on MoreWrite depending on what feels right. I live to see yud get destroyed in slow motion by real expertise

      • @blakestacey (MA) · 11 points · 10 months ago

        I’ve more than once been tempted to write Everything the Sequences Get Wrong about Quantum Mechanics, but the challenge is doing so in a way that doesn’t just amount to teaching a whole course in quantum mechanics. The short-short version is that it’s lazy, superficial takes on top of cult shit — Yud trying to convince the reader that the physics profession is broken and his way is superior.

        • @self (MA) · 5 points · 10 months ago

          I’d be happy to contribute what CS material I can to a multidisciplinary effort to prove that Yud’s lazy, superficial takes and cult shit are universal

          • @blakestacey (MA) · 6 points · 10 months ago

            I got as far as this blog post that I shared in the first days of new!SneerClub, but that was only a first stab.

      • @titotal · 4 points · 10 months ago

        Yeah, I’ve been writing up critiques for a year or two now, collected over at my substack. I’ve been posting them to the EA forum and even Lesswrong itself and they’ve been generally well received.

    • @FooBarrington@lemmy.world · 3 points · 10 months ago

      Interesting read, thank you for sharing! You nicely put into words what I thought as well - the amount of information you can deduce from 3 frames of a falling apple is way too limited to do what they describe.

  • @bitofhope · 16 points · 10 months ago

    The word “Einstein” appears no less than eight times in this story.

    Bringing up Hendrix every ten sentences doesn’t make you an amazing guitarist either.

  • Steve · 11 points · 10 months ago

    “Imagine a world much like this one”

    I don’t know how you all muster the focus to read past these hallmarks of shit ideas

  • @gerikson · 11 points · 10 months ago

    So LW is just a fanfic appreciation forum, got it.

  • @self (MA) · 11 points · 10 months ago

    But in this world there are careful thinkers, of great prestige as well, and they are not so sure. “There are easier ways to send a message,” they post to their blogs

    please destroy this gate address, it leads to Reply Guy Earth

    • @self (MA) · 14 points · 10 months ago

      also, sincerely, can anyone explain to me what’s good about Yud’s writing? this shit is structured exactly like a goosebumps short except instead of being written by a likeable author targeting grade schoolers it’s written by some asshole who loves using concepts he doesn’t understand, targeting other assholes who don’t understand fucking anything because all their knowledge got filtered through Yud

      • @froztbyte · 8 points · 10 months ago

        I don’t think there’s anything good about the writing, but there are a few things that stand out in terms of the mechanics employed and the effect they appear to be aiming for

        • (bad) storyteller style (nerds love 'em some stories as much as the next, even those who think they don’t)
        • touching on sufficiently many topics (“oh wow he’s thought about this so hard”)
        • going just far enough in detail to convince that there’s some kind of deeper aspect/more (“wow he knows so much about this”)

        even this horrible essay pulled the infomercial “but wait, there’s more!” at least 5 times. a terrible Plot Twist because he can’t figure out how to layer his story devices any better

        • David Gerard (OP, MA) · 8 points · edited · 10 months ago

          feel free!

          edit: I read through the sequences three times: once on the site, once as an epub and once reading every post on LW main from 2007-2011 in order of posting. I can state that I have Done The Fucking Reading. The sequences finished in 2009, then you can see the site get weirder as people riff off them, up to the basilisk post in mid-2010. At that point everyone noticeably cools it on the weirdness and the site’s haunted by a post nobody will talk out loud about. Then HPMOR takes off and the site has a new recruiting point.

          • @Evinceo · 10 points · 10 months ago

            read through the sequences three times

            Why would you do this

            • David Gerard (OP, MA) · 9 points · 10 months ago

              do you think i’d be here if i had good judgement

        • @self (MA) · 7 points · 10 months ago

          oh absolutely, and check out the ridiculous amount of ideological priming Yud does in this post. one example:

          (Oh, and every time someone in this world tries to build a really powerful AI, the computing hardware spontaneously melts. This isn’t really important to the story, but I need to postulate this in order to have human people sticking around, in the flesh, for seventy years.)

          (and it’s very funny to me that a number of comments are “oh I had no idea this was about AI until the end…!”, how young are these kids you’re programming, Yud?)

          in general, the ridiculous amount of slog going in combined with regular priming reminds me a lot of another sci-fi flavored cult I know, if you get my meaning

        • @froztbyte · 5 points · 10 months ago

          oh yeah the complexity and effort is almost certainly one of the points - people don’t like to admit they got swindled or wasted their time, and ostensibly-clever people are just as capable of falling victim to this as others

  • @sus@programming.dev · 10 points · edited · 10 months ago

    Never underestimate the rationalist’s ability to write a 5000-word, extremely fanciful short story to make a point that could be compressed into 2 sentences, in a failed attempt to dismiss a strawman

    and of course the story includes a cameo from the writer’s opponents who are naive fools and proceed to doom the universe with their hubris (of disagreeing with the author)

  • @corbin · 8 points · 10 months ago

    He thinks he’s discount Peter Watts.

    • @froztbyte · 7 points · 10 months ago

      he probably doesn’t think it’s all that discount, but this comment is even funnier given how much Watts was responsible for watering down/clobbering some of the meaning of things

  • @killeronthecorner@lemmy.world · 6 points · 10 months ago

    Albert Einstein, himself, still lived and still made approximately the same discoveries, but his work no longer seems exceptional

    It’s hard to grasp how grotesquely stupid a sentence this is.

  • @jonhendry · 5 points · 10 months ago

    What if the webcam were upside down?

  • @saucerwizard · 4 points · 10 months ago

    I hate the co-option of SETI these guys try.

  • @carlitoscohones · 4 points · 10 months ago

    I am saving this to read later, but having eugenics and IQ in the very first sentence makes it look especially promising.