Data poisoning: how artists are sabotaging AI to take revenge on image generators

As AI developers indiscriminately suck up online content to train their models, artists are seeking ways to fight back.

  • HejMedDig@feddit.dk
    11 months ago

    The Nightshade poisoning attack claims it can corrupt a Stable Diffusion model with fewer than 100 samples. Probably not to the NSFW level. How easy it is to manufacture those 100 samples isn't mentioned in the abstract.

    • AVincentInSpace@pawb.social
      11 months ago

      yeah the operative word in that sentence is “claims”

      I’d love nothing more than to be wrong, but after seeing how quickly Glaze was defeated (not only did it make the images nauseating for a human to look at despite claiming to be invisible, but less than 48 hours after the official launch someone had trained a neural network to reverse its effects automatically with roughly 95% accuracy), suffice it to say my hopes aren’t high.