Not a member, but thought I’d share this for the artists here in case they haven’t seen the news.

  • RightHandOfIkaros@lemmy.world

    Only prolonging the inevitable with this. Kinda like DRM in video games, this is going to do literally nothing to the people who want the data, except maybe be a minor inconvenience for a month or two.

    Wasn’t the last attempt at this defeated by only 16 lines of Python code?

    • AphoticDev@lemmy.dbzer0.com

      If this takes off, it will be bypassed within a month. Adversarial training is something Stable Diffusion users already invented, and we use it to make our artwork better by poisoning the dataset to teach the network what a wrong result looks like. They reinvented our wheel.
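
      For anyone curious, a rough sketch of that general idea in plain Python: mislabel a slice of (image, caption) training pairs so the model learns the wrong association. This is a hypothetical illustration only, not anyone’s actual pipeline; poison_pairs and the example filenames are made up.

      ```python
      import random

      def poison_pairs(pairs, concept_a, concept_b, fraction=0.1, seed=0):
          """Return a copy of (image_path, caption) pairs with a fraction of
          captions swapped between two concepts."""
          rng = random.Random(seed)
          poisoned = []
          for image_path, caption in pairs:
              if rng.random() < fraction:
                  if concept_a in caption:
                      caption = caption.replace(concept_a, concept_b)
                  elif concept_b in caption:
                      caption = caption.replace(concept_b, concept_a)
              poisoned.append((image_path, caption))
          return poisoned

      # Example: mislabel roughly half of the dog/cat captions before training.
      dataset = [("img001.png", "a photo of a dog"), ("img002.png", "a photo of a cat")]
      print(poison_pairs(dataset, "dog", "cat", fraction=0.5))
      ```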

  • sillyplasm@lemm.ee

    I’ve heard of this recently! I like how they use the term “poison” because it makes me imagine that AI non-art is a bunch of evil aristocrats and Nightshade is the cyanide we’re slipping into their beverages.

    • Mx Phibb@reddthat.com (OP)

      Chuckles, "I like that, but I suspect they’re calling it poison because of the phrase ‘poisoning the well’."