New 'Nightshade' tool lets artists fight back against AI by poisoning their own art
- Researchers developed a tool called Nightshade that can "poison" AI image generators by altering pixels in artists' work, causing models trained on it to produce unpredictable or faulty results.
- Nightshade aims to fight back against the use of artists' work to train AI models without their permission.
- It works by subtly altering an image's pixels so that AI models interpret it as depicting something entirely different from what it actually shows (a rough code sketch of this class of attack appears after this list).
- Poisoning attacks could disrupt or corrupt AI art generators, but the same technique could also be misused by bad actors.
- The researchers suggest Nightshade may encourage model trainers to seek artists' permission, while also giving artists a way to sabotage models that use their work without consent.
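
To make the idea concrete, here is a minimal sketch of feature-space poisoning in PyTorch: an image is nudged, within a small per-pixel budget, so that an encoder's features drift toward an unrelated "anchor" concept while the picture still looks unchanged to a person. This illustrates the general class of attack only; it is not Nightshade's published algorithm, and the ResNet-18 encoder, random placeholder images, `eps` budget, and step counts are all assumptions chosen for demonstration.

```python
# Conceptual sketch of feature-space poisoning (NOT Nightshade's actual method):
# perturb an image so a feature extractor "sees" a different concept,
# while keeping the pixel change visually subtle.
import torch
import torchvision.models as models

device = "cpu"
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()  # use penultimate-layer features
encoder.eval().to(device)
for p in encoder.parameters():
    p.requires_grad_(False)  # only the perturbation is optimized

# Placeholders for real images: the artwork to protect, and an image
# of the unrelated "anchor" concept to steer the model toward.
artwork = torch.rand(1, 3, 224, 224, device=device)
anchor = torch.rand(1, 3, 224, 224, device=device)

with torch.no_grad():
    target_feat = encoder(anchor)

eps = 8 / 255  # max per-pixel change, keeping the edit imperceptible
delta = torch.zeros_like(artwork, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-2)

for step in range(100):
    poisoned = (artwork + delta).clamp(0, 1)
    # Pull the poisoned image's features toward the anchor concept.
    loss = torch.nn.functional.mse_loss(encoder(poisoned), target_feat)
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Project the perturbation back into the imperceptibility budget.
    with torch.no_grad():
        delta.clamp_(-eps, eps)

poisoned_image = (artwork + delta).detach().clamp(0, 1)
```

A model trained on enough such images would associate the artwork's visual features with the wrong concept, which is the mechanism behind the unpredictable outputs described above; the real tool presumably targets the specific pipelines of text-to-image generators rather than a generic classifier backbone.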