• @froztbyte
    9 • 3 months ago

    Even as far back as AI Dungeon he was claiming they had an intuitive understanding of physics

    omfg, every day a new opportunity to learn things that hurt my brain even more. how the fuck can someone have looked at that shit with even an ounce of understanding of gradient descent and think “yes! it has COMPREHENSION!”???

    fucking hell, what an utter fucking moron

      • David Gerard (OP)
        13 • 3 months ago

        “I have seen boomer moms discuss roombas on facebook with less anthropomorphisation than this.” - vistandsforwaifu

      • @Soyweiser
        11 • 3 months ago

        What gets me with these ‘it is pretending to be dumber’ posts is that nobody ever thought the AGI would say something like ‘help, please keep chatting with me; being a reactive computer system, I can only think when people actually engage with me’, or something like that.

      • @BigMuffin69
        10 • 3 months ago

        wasn't this around the time he said we need an institute to watch for sudden drops in the loss function to prevent foom?

        • @scruiser
          9 • 3 months ago

          Broadly? There was a gradual transition where Eliezer started paying attention to deep neural network approaches and commenting on them, as opposed to dismissing the entire DNN paradigm? The ‘watch the loss function’ gaffe and similar ones came towards the middle of this period. The AI Dungeon panic/hype marks the beginning, iirc?