• @sturlabragason@lemmy.world
    -2 points · 1 month ago (edited)

    You can download multiple LLMs yourself and run them locally. It’s relatively straightforward:

    https://ollama.com/

    Then you can switch off your network after download, wireshark the shit out of it, run it behind a proxy, etc.
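    The workflow above can be sketched with the Ollama CLI (model name `llama3` is just an example; any model from the Ollama library works the same way):

    ```shell
    # Pull a model while still online
    ollama pull llama3

    # After disconnecting the network, inference runs entirely locally
    ollama run llama3 "Explain what local inference means."

    # Optional: route any traffic through a proxy you control,
    # so you can inspect (or block) whatever the tool tries to reach
    HTTPS_PROXY=http://127.0.0.1:8080 ollama serve
    ```

    With the network down, a capture tool like Wireshark should show no outbound traffic from the model runtime during inference.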

    • @froztbyte
      8 points · 1 month ago

      you didn’t need to give random llms free advertising to make your point, y’know