Researchers Develop New Tool to Help Protect Visual Artists' Work from Unauthorized AI Use
- University of Chicago researchers developed Nightshade 1.0, a tool that "poisons" images so that AI models trained on them without permission produce unpredictable or useless outputs.
- Nightshade makes subtle changes to images that human eyes may not notice but that AI models interpret very differently (a conceptual sketch follows this list).
- It is intended to impose a cost on model makers who train on data without obtaining content creators' permission, encouraging them to respect copyright.
- Nightshade has limitations: its alterations may be noticeable on artwork with flat colors or smooth backgrounds, and its effect could be undone over time.
- The same researchers previously developed Glaze, which alters images to prevent AI models from replicating an artist's visual style without consent. They recommend using both tools together.
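
To make the mechanism concrete, below is a minimal sketch of the general technique behind this kind of image poisoning: nudging an image's pixels within a small, hard-to-perceive budget so that a model's internal representation of the image drifts toward an unrelated concept. This is not Nightshade's actual algorithm; the tiny PyTorch encoder, the `epsilon` and `alpha` budgets, and the random stand-in "images" are all hypothetical, used only to illustrate the idea of a perturbation that is small to humans but large to a model.

```python
# Conceptual sketch of adversarial image poisoning (NOT Nightshade's
# actual algorithm). We perturb an image within a small per-pixel
# budget so a stand-in encoder's features move toward a different
# concept's features.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-in encoder; a real attack would target (a proxy
# for) the image encoder of the model being poisoned.
encoder = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
encoder.eval()
for p in encoder.parameters():
    p.requires_grad_(False)

image = torch.rand(1, 3, 64, 64)   # the artwork to protect (stand-in)
target = torch.rand(1, 3, 64, 64)  # an unrelated concept (stand-in)
target_feat = encoder(target)

epsilon = 4 / 255  # max per-pixel change: kept small to stay subtle
alpha = 1 / 255    # step size per iteration
delta = torch.zeros_like(image, requires_grad=True)

for _ in range(100):
    poisoned = (image + delta).clamp(0, 1)
    # Pull the poisoned image's features toward the target concept.
    loss = nn.functional.mse_loss(encoder(poisoned), target_feat)
    loss.backward()
    with torch.no_grad():
        delta -= alpha * delta.grad.sign()  # gradient descent step
        delta.clamp_(-epsilon, epsilon)     # keep the change subtle
        delta.grad.zero_()

poisoned = (image + delta).clamp(0, 1).detach()
print("max pixel change:", (poisoned - image).abs().max().item())
```

In this sketch, the published version of the artwork would be the `poisoned` image: nearly identical to the original for a human viewer, but carrying features that mislead a model trained on it.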