New 'Nightshade' tool subtly alters images to resist unauthorized AI training
- Nightshade is a tool that subtly "poisons" images to trick AI models into miscategorizing them, rendering the images useless for training generative AI.
- The goal is to force tech giants to pay for licensed work rather than scraping images without consent. Nightshade gives artists leverage in demanding compensation.
- Example images show that Nightshade's changes are nearly imperceptible to human viewers, while AI models see something completely different and nonsensical.
- Artists like Kelly McKernan are using Nightshade as a protective measure until adequate legal protections exist against unauthorized scraping.
- The developers say Nightshade is legal and aims to make scraping unlicensed art incrementally more difficult and expensive, not to destroy AI altogether.
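The "poisoning" described above is closely related to adversarial perturbations. As a loose illustration only (this is not Nightshade's actual algorithm, which targets generative models' training data), the toy sketch below shows how per-pixel changes too small to notice can flip a simple classifier's output. The linear classifier, the class labels, and every name here are hypothetical:

```python
import random

random.seed(0)

# Toy "image": 64 pixel intensities in [0, 1]
image = [random.random() for _ in range(64)]

# Toy linear classifier: score > 0 means "cat", otherwise "dog".
# The bias is chosen so the clean image scores +0.1, i.e. "cat".
weights = [random.gauss(0, 1) for _ in range(64)]
bias = -sum(w * p for w, p in zip(weights, image)) + 0.1

def predict(pixels):
    score = sum(w * p for w, p in zip(weights, pixels)) + bias
    return "cat" if score > 0 else "dog"

# Perturbation in the style of the fast gradient sign method: nudge each
# pixel by at most epsilon against the classifier's weights, then clip
# back into the valid [0, 1] intensity range.
epsilon = 0.05
poisoned = [min(1.0, max(0.0, p - epsilon * (1 if w > 0 else -1)))
            for p, w in zip(image, weights)]

print(predict(image))     # clean image is classified one way...
print(predict(poisoned))  # ...the poisoned copy is classified differently,
                          # though no pixel moved by more than epsilon
```

A change of 0.05 per pixel is hard for a person to see, yet it shifts the classifier's score decisively because the shift accumulates across all pixels in the direction the model is most sensitive to.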