• @BlueMonday1984
    13 · 4 months ago

    Just an off-the-cuff prediction: I fully expect AI bros are gonna put their focus on local models post-bubble, for two main reasons:

    1. Power efficiency - whilst local models are hardly power-sippers, they don’t require the planet-killing, money-burning server farms that the likes of ChatGPT run on (and which have helped define AI’s public image, now that I think about it). As such, they won’t need VC billions to keep them going - just some dipshit with cash to spare and a GPU to abuse (and there’s plenty of those out in the wild).

    2. Freedom/Control - Compared to ChatGPT, DALL-E, et al., which are pretty locked down in an attempt to keep users from embarrassing their parent corps or inviting public scrutiny, any local model will answer whatever dumbshit question you ask or make whatever godawful slop you want, no questions asked, no prompt injection/jailbreaking needed. For the kind of weird TESCREAL nerd that AI attracts, the benefits are pretty obvious.

    • @vrighter@discuss.tchncs.de
      9 · 4 months ago

      You almost always get better efficiency at scale. If the same work is done by lots of separate machines instead of one datacenter, they’d use more energy overall: you’d be doing the same work, but not on chips specifically designed for the task, and without the batching that keeps those chips fully utilized. If it’s already really inefficient at scale, then you’re just SOL.
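
      To put rough numbers on that (a minimal sketch - every figure below is an illustrative assumption, not a benchmark): energy per token is just power draw divided by throughput, and batching is what keeps a datacenter chip’s throughput high enough to win.

      ```python
      # Back-of-envelope energy per generated token: power draw divided by
      # throughput. All figures are illustrative assumptions, not measurements.

      def joules_per_token(power_watts: float, tokens_per_second: float) -> float:
          """Energy to generate one token at a given power draw and throughput."""
          return power_watts / tokens_per_second

      # Assumption: a datacenter accelerator (~700 W) serves dozens of batched
      # requests at once, so aggregate throughput is high.
      dc = joules_per_token(700, 2000)

      # Assumption: a home GPU (~350 W) serves a single user; decoding is
      # memory-bound, so most of the chip idles and throughput stays low.
      home = joules_per_token(350, 40)

      print(f"datacenter: {dc:.2f} J/token")    # 0.35 J/token
      print(f"local:      {home:.2f} J/token")  # 8.75 J/token
      print(f"ratio:      {home / dc:.0f}x")    # ~25x more energy per token locally
      ```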

    • CubitOom
      4 · 4 months ago

      I guess it depends on how you define what an “AI bro” is. I would define them as the front men of startups with VC funding who like to use big buzzwords and will try to milk as much money as they can.

      These types of people don’t care about power efficiency or freedom at all unless they can profit off of it.

      But if you just mean anyone who runs a model at home, then yeah, you might be right. I’m just not understanding all the harsh wording around someone running a model locally.