In today’s episode, Yud tries to predict the future of computer science.

  • dr2chase
    4 • 8 months ago

    @corbin I got a 96GB laptop just so I could run (some) LLMs w/o network access, I’m sure that will be standard by 2025.🤪

    • @corbinOP
      8 • 8 months ago

      Let me know @corbin@defcon.social if you actually get LLMs to produce useful code locally. I’ve done maybe four or five experiments and they’ve all been grand disappointments. This is probably because I’m not asking questions easily answered by Stack Overflow or existing GitHub projects; LLMs can really only model the trite, not the novel.