• Sailor Sega Saturn · 32 points · 1 year ago (edited)

    Sloppy LLM programming? Never!

    In completely unrelated news I’ve been staring at this spinner icon for the past five minutes after asking an LLM to output nothing at all:

    • self · 22 points · 1 year ago

      same energy as “your request could not be processed due to the following error: Success”

    • earthquake@lemm.ee · 19 points · 1 year ago

      What are the chances that the front end was not programmed to handle the LLM returning an empty string?

      • Sailor Sega Saturn · 16 points · 1 year ago

        Quite likely yeah. There’s no way they don’t have a timeout on the backend.
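
        The failure mode being guessed at here can be sketched roughly as follows. This is purely hypothetical illustration code (`naiveHandle` and `defensiveHandle` are made-up names), not anything from Gemini's actual frontend:

```javascript
// Hypothetical sketch of the suspected bug: a handler that only clears
// the spinner when there is text to show will spin forever on "".

// Naive handler: empty reply -> stay in the loading state indefinitely.
function naiveHandle(reply) {
  if (reply.length > 0) return { spinner: false, text: reply };
  return { spinner: true, text: "" };
}

// Defensive handler: an empty (or whitespace-only) reply is treated as a
// terminal error state rather than "still loading".
function defensiveHandle(reply) {
  if (reply.trim().length > 0) return { spinner: false, text: reply };
  return { spinner: false, text: "", error: "model returned no text" };
}
```

        A backend timeout would bound the wait either way; the check above covers the case where the request itself succeeds but the model returns no text.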

    • David Gerard (OP) · 10 points · 1 year ago

      boooo Gemini now replies “I’m just a language model, so I can’t help you with that.”

      • froztbyte · 9 points · 1 year ago

        “what would a reply with no text look like?” or similar?

        • David Gerard (OP) · 8 points · 1 year ago

          what would a reply with no text look like?

          nah it just described what an empty reply might look like in a messaging app

          they seem to have done quite well at making Gemini do mundane responses

          • froztbyte · 8 points · 1 year ago

            that’s a hilarious response (from it). I perfectly understand how it got there, which makes it even more laughable