• @wndy · 3 points · 22 days ago

    Like, it’s not even THAT much better. I mean, not so much that everyone should flood it lmao. The main plus was no restriction on tokens used, but that’s useless when it’s getting overloaded all the time.

    I would say it’s just barely noticeably better than the free tier of GPT, which makes it a little annoying to go back, but w/e.

    • @Evinceo · 2 points · 21 days ago

      Can’t people run it locally, supposedly?

      • @wndy · 4 points · edit-2 · 21 days ago

        Not people who can’t afford $100k to spin up their own servers. It’s going to be a game changer for AI startups and such, though, who won’t have to spend as much as previously thought.

        edit: Basically, numbers out of my ass, but it’s like they reduced the amount you have to spend to get ChatGPT-level output from $500k to $100k. Amazing and all, definitely newsworthy, but uh… not directly relevant for us little folk; it’s more about the ripple effects.

      • @Architeuthis · 3 points · 21 days ago

        The 671B model, although ‘open sourced’, is a 400+ GB download and definitely not runnable on household hardware.
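
        For scale, some rough napkin math on the weights alone; the bytes-per-parameter figures below are assumptions for illustration, not official numbers for any particular release.

        ```python
        # Rough napkin math on weight storage for a ~671B-parameter model.
        # Bytes-per-parameter figures are assumptions for illustration,
        # not official numbers for any particular release.
        PARAMS = 671e9

        BYTES_PER_PARAM = {
            "fp16": 2.0,         # half-precision weights
            "fp8": 1.0,          # 8-bit weights
            "4-bit quant": 0.5,  # aggressive quantization
        }

        for precision, nbytes in BYTES_PER_PARAM.items():
            gigabytes = PARAMS * nbytes / 1e9
            print(f"{precision:>12}: ~{gigabytes:,.0f} GB of weights alone")

        # Even the 4-bit case (~336 GB) is an order of magnitude beyond a
        # 24 GB consumer GPU, before counting KV cache and activations.
        ```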

          • @swlabr · 7 points · 21 days ago

            What? You don’t have a spare $6,000 to run nonsense generators at home?

            Really sorry about this, but this sounds like the premise of a shitty boomer joke:

            “If I wanted to spend another 6000 for a home nonsense generator, I’d get married again!”

            etc.