Like the "how many r's in strawberry" question. It took off as an Internet meme and was fixed, but how did that fix happen?

  • foggy@lemmy.world · 11 points · 4 months ago (edited)

    The "how many r's in strawberry" question breaks it because the model doesn't read your question character by character. It tokenizes it. So it sees (straw)(berry), except it's more like (477389583)(84838582), and it knows contextually that when those two tokens appear together they mean a different set of things than if there were whitespace between them.

    The tokens have, basically, numeric values. So the model doesn't read your characters. That's why that question is hard for it.
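    A minimal sketch of the idea, with an entirely made-up vocabulary and token IDs (real tokenizers use learned byte-pair encodings, not this toy greedy matcher):

    ```python
    # Toy illustration, NOT a real LLM tokenizer: a greedy longest-match
    # tokenizer over a made-up vocabulary. It shows that the model receives
    # integer token IDs, not individual characters.

    VOCAB = {"straw": 477389583, "berry": 84838582,
             "s": 1, "t": 2, "r": 3, "a": 4, "w": 5, "b": 6, "e": 7, "y": 8}

    def tokenize(text):
        """Greedily match the longest vocabulary entry at each position."""
        tokens = []
        i = 0
        while i < len(text):
            for j in range(len(text), i, -1):  # try longest piece first
                if text[i:j] in VOCAB:
                    tokens.append(VOCAB[text[i:j]])
                    i = j
                    break
            else:
                raise ValueError(f"no token for {text[i]!r}")
        return tokens

    print(tokenize("strawberry"))  # [477389583, 84838582]
    # The character 'r' never appears in that sequence, so "count the r's"
    # asks for information the token IDs alone don't carry.
    ```

    The point isn't the matching algorithm; it's that once the text becomes two opaque integers, spelling-level questions have to be answered from memorized associations, not by inspecting letters.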

    Tasks that recurse on themselves tend to fail as well, e.g. "say banana 142 times" will not produce the expected result.

    As to how they fix them, I'm not positive. There are a bunch of ways to work around issues like these.

    • liquefy4931@lemmy.world · 2 points · 3 months ago

      More people need to understand that this is how LLMs function. There is too much belief that these algorithms are actually thinking and reasoning.