cross-posted from: https://lemmy.zip/post/54658106

“AI brainrot is bad for our souls”: an interesting article that explores why games increasingly find themselves in situations where AI art was used, either intentionally or not.

  • lime!@feddit.nu · 3 days ago

    this makes me wonder what happens when the bubble does burst, because the tooling will stick around. when the next-next-gen gpus all have this stuff optimised to shit as a matter of course (it’s all vector processing, that’s good for graphics too), and a studio can train an image generation model on only their own stuff as part of their normal rendering pipeline rather than spending extra energy, and there’s actual qa for the results so the “look” isn’t there, will there still be complaints?

    part of me thinks there will.

    • Jared White ✌️ [HWC]@humansare.social · 3 days ago

      I saw an “AI” tool recently which showed how you could create two very different poses for a character, and it would “tween” between the two in a realistic, convincing way. It could be described as “genAI” I suppose, but the company claimed they were very specific with how they trained the model and what it was intended to do.
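
      Mechanically, I’d guess the core of it is interpolating between two skeletal poses, with the model producing in-betweens that look like actual motion instead of a straight blend. A toy sketch of the non-AI baseline (the joint names and angles here are invented):

      ```python
      def lerp(a: float, b: float, t: float) -> float:
          """Linear interpolation between two scalars."""
          return a + (b - a) * t

      def tween_pose(pose_a: dict[str, float], pose_b: dict[str, float], t: float) -> dict[str, float]:
          """Blend two poses (joint name -> angle in radians) at time t in [0, 1]."""
          return {joint: lerp(pose_a[joint], pose_b[joint], t) for joint in pose_a}

      # Two hand-authored keyframes. The pitch for the "AI" version is that it
      # replaces this straight-line blend with in-betweens that look like real motion.
      standing = {"hip": 0.0, "knee": 0.0, "elbow": 0.2}
      crouched = {"hip": -1.1, "knee": 1.4, "elbow": 0.9}

      for i in range(5):
          t = i / 4
          print(f"t={t:.2f}", tween_pose(standing, crouched, t))
      ```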

      There were still animators upset about it, and I get it. I’d probably be upset about it if I were in that profession. I’m certainly upset about LLM use in programming. But if I squint really hard, I can barely eke out a vision of limited, targeted, vetted tools that serve as very specific aids to creators in their professional workflows.

      That is not by and large how any of the services we regularly hear about are built and marketed. There’s a wide gulf between ethically-sourced, limited professional workflow tools and the Copilots & Soras & Sunos of the world. I would say as a general rule, if something is produced based on a “prompt”, it should be immediately viewed with immense suspicion.

      • lime!@feddit.nu · 3 days ago

        i saw a tool like that in like 1999. like, not super-realistic but convincing enough. above animorphs level at least.

        idk about immediately throwing out prompts either. i did a big half-rant about this the other day, but the current gen of models are all vector spaces. they are basically multidimensional topographical maps, with the prompt as the starting coordinates and some traversal algorithm as the means of producing output. as long as they stick around, the prompts are needed.
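
        very loosely, the shape of what i mean is something like this (everything here is made up, it’s nothing like a real model):

        ```python
        import hashlib

        def toy_embed(prompt: str, dims: int = 4) -> list:
            """map a prompt to fixed 'starting coordinates' (a stand-in for a real text encoder)."""
            digest = hashlib.sha256(prompt.encode()).digest()
            return [b / 255 for b in digest[:dims]]

        def traverse(start: list, steps: int = 50, lr: float = 0.1) -> list:
            """walk 'downhill' from the prompt's coordinates on a made-up landscape.
            a real model's landscape is learned from training data; this one is just
            a bowl centred at 0.5 so there's something to descend."""
            point = list(start)
            for _ in range(steps):
                grad = [2 * (x - 0.5) for x in point]  # gradient of sum((x - 0.5)^2)
                point = [x - lr * g for x, g in zip(point, grad)]
            return point

        coords = toy_embed("a castle at sunset")  # the prompt is just the starting point
        print(traverse(coords))                   # the traversal is what produces the 'output'
        ```

        the real landscape and traversal are learned and way more complicated, but the prompt-as-starting-coordinates part is the bit that doesn’t go away.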

        • wizardbeard@lemmy.dbzer0.com · 3 days ago

          Beyond that, there’s definite value in being able to create queries using natural language, not always needing to know the specific technical terminology for something while still getting pointed in the right direction.
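
          Even a dumb, model-free toy of that kind of lookup shows the appeal: plain words in, the right jargon out. (The glossary, names, and example query below are all made up.)

          ```python
          from collections import Counter
          from math import sqrt

          # A tiny invented glossary: plain-language description -> technical term.
          GLOSSARY = {
              "make the program wait for a slow network call without freezing": "asynchronous I/O",
              "two threads touching the same data at the same time": "race condition",
              "store a function's results so repeated calls are cheap": "memoization",
          }

          def bag(text: str) -> Counter:
              """Bag-of-words representation of a string."""
              return Counter(text.lower().split())

          def cosine(a: Counter, b: Counter) -> float:
              """Cosine similarity between two bags of words."""
              dot = sum(a[w] * b[w] for w in a)
              norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
              return dot / norm if norm else 0.0

          def lookup(query: str) -> str:
              """Point a natural-language question at the closest-matching description."""
              return max(GLOSSARY, key=lambda desc: cosine(bag(query), bag(desc)))

          print(GLOSSARY[lookup("my app locks up while waiting on the network")])
          # -> "asynchronous I/O" (on this toy data, anyway)
          ```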

          It’s the whole “non-deterministically regurgitate a poor-quality combination of all your stolen training data, without citing sources, while burning absurd resources” part that I have a problem with, personally.

    • its_kim_love@lemmy.blahaj.zone · 3 days ago

      Those are the surface-level issues. The majority of haters will probably be silenced by a perceived de-escalation, but the police state will still grow. The military-industrial complex will make their autonomous death machines. The tech feudalists will pool power.

    • very_well_lost@lemmy.world · 3 days ago

      Assuming Nvidia manages to survive the burst and pivots into consumer-grade AI hardware (rather than $50,000-a-pop datacenter GPUs), I suspect a lot of companies and rich hobbyists will buy and use specialized GPUs for art. Meanwhile, starving artists won’t be able to afford the $1,000+ GPUs, so that will put more bargaining power in the hands of employers, and they’ll use AI as an excuse to drive wages down further. Of course, artists already have some of the most underpaid jobs out there, so I don’t think there will be much of a push to replace them completely, just to pay them even less while expecting them to do more than before.

      I doubt it will ever be cost-effective to completely remove human artists from the pipeline, because I doubt it’ll ever be possible to generate AI art that isn’t recognizable enough to provoke backlash. AI will still be used for cheap shovelware, sure, but serious projects will probably just use it for rapid prototyping and continue to use (underpaid) humans for the final production art, while pretending their work is less valuable than ever.

      I suspect it will be a similar story for software engineers, too. Even if it costs an employer $50k every few years to maintain a rack of GPUs in a server closet somewhere, that’s still cheaper than paying a mid-level engineer, so they’ll cut as many devs as they can and ask the remaining ones to use the AI tools to do more with less. Wages will probably stagnate in this sector as well as a result (for the devs who aren’t laid off).

      • lime!@feddit.nu · 3 days ago

        nvidia will most likely survive. they didn’t get big from ai, they got big from gaming and then exploded in size. they’re selling shovels in a gold rush, and shovels are useful for other things.

        as for quality, there are some damn good models out there built by amateurs. you just don’t see them in the mainstream because they’re even harder to credit than the normal piles of stolen assets. which of course means even less reason for execs to keep artists around, unless they know the tools. prompting is such a small part of actually getting good results. or rather, good enough.

        the software angle brings up an interesting point because in my sector there is no ai use at all. letting ai do embedded work isn’t really possible right now because the dataset is too small to train on. there are niches of art like this too.

        • very_well_lost@lemmy.world · 3 days ago

          > the software angle brings up an interesting point because in my sector there is no ai use at all. letting ai do embedded work isn’t really possible right now because the dataset is too small to train on. there are niches of art like this too.

          My sector is web development, and there’s a TON of AI exposure there right now (which is not surprising since web companies and web developers alike have always been some of the most ‘trend-chasey’ in tech). I’m not all that confident I can predict how software development will change, but right now my gut tells me that the industry will both shrink and “stratify” into a couple of separate classes.

          One class will be analogous to the “senior engineer” of today. This will be a role you’ll go to college to learn, and you’ll continue to get prestige, pay, and benefits for doing it. There won’t be many of these positions available, and the people in them will either work on creating new AI systems, work on systems where AI code generation is impossible, or supervise the other class of developers.

          This other class will be more akin to a skilled “trade” like plumber or electrician, both in terms of compensation and in the education required. These workers will mostly use AI to produce code that’s substandard but cheap, and a senior engineer will do the required clean-up to make it production-ready. Unlike today, these junior devs won’t have a promotion track up to the senior level; the relationship will be more akin to the one between doctors and nurses, I think. There will be more of these positions available (but still fewer than the number of junior dev jobs today), the pay will be lower, and the turnover will probably be higher, since they’ll get all the stress of crunch and mismanagement without any of the perks the seniors enjoy. They’ll also be viewed and treated as disposable by management. I think there will be a lot of cross-pollination between this group and QA, which will experience a lot of the same issues. In fact, this role and QA may just merge into one.

          But I’m also just a random guy talking out of his ass on the Internet, so who fucking knows.