Creators Fight Back Against AI With Invisible Image Changes to Trick Systems
-
Artists are using a new tool called Nightshade to add imperceptible changes to their images that can trick AI models trained on them into hallucinating cats, protecting their art from unauthorized use.
-
Nightshade works by altering thousands of pixels in ways that are invisible to humans but that AI models perceive as something entirely different.
-
If enough poisoned images are scraped into training data, the tool can force image generation systems like Stable Diffusion to output cat imagery instead of the subject a user actually requested.
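The pixel-level idea behind this kind of poisoning can be sketched in a few lines. This is a hypothetical illustration, not Nightshade's actual algorithm (which optimizes its perturbation pattern against a model's feature extractor); the `poison_image` function, the stand-in `perturbation` pattern, and the `epsilon` bound here are all illustrative assumptions.

```python
def poison_image(pixels, perturbation, epsilon=2):
    """Apply a bounded per-pixel shift to a grayscale image.

    Hypothetical sketch only: `perturbation` stands in for a pattern
    crafted to mislead a model, and `epsilon` caps each shift at a few
    intensity levels (out of 255) so a human sees no difference.
    """
    poisoned = []
    for row, delta_row in zip(pixels, perturbation):
        poisoned.append([
            # Clamp the shift to [-epsilon, epsilon], then keep the
            # result inside the valid 0-255 intensity range.
            max(0, min(255, value + max(-epsilon, min(epsilon, delta))))
            for value, delta in zip(row, delta_row)
        ])
    return poisoned

# Toy 2x3 image and a stand-in perturbation pattern.
image = [[100, 200, 50], [0, 255, 128]]
pattern = [[5, -5, 1], [-3, 3, 0]]
result = poison_image(image, pattern)

# No pixel moves by more than epsilon=2 intensity levels, which is
# why the change is imperceptible to a human viewer.
assert all(abs(r - v) <= 2
           for row_r, row_v in zip(result, image)
           for r, v in zip(row_r, row_v))
```

The key design point the sketch captures is the constraint, not the pattern: the perturbation must stay small enough to be invisible, so all of the attack's power has to come from how the pattern is chosen, not how large it is.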
-
While Nightshade may never be adopted widely enough to cripple large AI systems, it gives individual creators some control over how their work is used.
-
The tool represents a revolt against the lack of meaningful AI regulation, but AI developers will likely update their training pipelines to defend against this kind of data poisoning.