New 'Nightshade' tool downloaded over 250K times by artists fighting AI art theft
• A new free tool called Nightshade that helps artists "poison" AI models trained on their images without consent has been downloaded over 250,000 times in 5 days, far exceeding the creators' expectations.
• Nightshade "shades" images by subtly altering them in ways that change how machine learning models perceive them, so that models trained on shaded images produce flawed outputs.
• It was created by researchers at the University of Chicago as a sister product to Glaze, which protects against style mimicry by causing AI models to misidentify an artist's style.
• The team recommends that artists shade images with Nightshade first, then apply Glaze for additional protection, though combining the two may increase visible artifacts.
• Some artists suggest poisoning everything posted online to protect their content, and platforms like Cara plan to integrate with Nightshade.