New 'Nightshade' tool poisons AI training data to protect artists' work from unauthorized use
- A new tool called Nightshade makes invisible, pixel-level changes to images to protect art from being used without permission to train AI image generators.
- Nightshade was created by researchers at the University of Chicago to help artists fight back against AI companies that use their work without consent.
- The tool "poisons" training data: models trained on enough poisoned images learn distorted associations, for example producing a cow when asked for a car (a rough sketch of the general idea follows this list).
- Nightshade will be free and open source so more creators can use it to protect work shared online.
- If enough artists use Nightshade, the risk of training on poisoned images could encourage AI companies to properly credit and compensate original creators.
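The summary does not describe Nightshade's internals, so the Python below is only an illustrative sketch of the generic idea behind poisoning an image-caption training pair, not the tool's actual method. The function name, the epsilon bound, and the use of random noise as a stand-in for an optimized perturbation are all assumptions for illustration.

```python
# Illustrative sketch only, not the Nightshade algorithm.
# It builds a hypothetical poisoned (image, caption) training pair: the pixels
# are nudged by a small, hard-to-see amount while the caption is left unchanged,
# so a model trained on many such pairs can mislearn what a "car" looks like.

import numpy as np

def make_poisoned_pair(image: np.ndarray, caption: str,
                       epsilon: float = 4.0, seed: int = 0):
    """Return a (poisoned_image, caption) pair for an 8-bit RGB image.

    Random noise is a stand-in for an optimized perturbation; epsilon bounds
    the per-pixel change so the edit stays visually imperceptible.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    poisoned = np.clip(image.astype(np.float64) + noise, 0, 255).astype(np.uint8)
    # The caption still says "car"; the mismatch between the perturbed pixels
    # and the unchanged caption is what corrupts the learned association.
    return poisoned, caption

# Toy usage: poison one blank 256x256 RGB image captioned as a car.
img = np.zeros((256, 256, 3), dtype=np.uint8)
poisoned_img, cap = make_poisoned_pair(img, "a photo of a red car")
print(cap, int(np.abs(poisoned_img.astype(int) - img.astype(int)).max()))
```

In the real tool, the perturbation is reportedly chosen so the image reads as a different concept (such as a cow) to the model while still looking like a car to people, rather than being random noise as in this stand-in.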