New tool 'poisons' AI to protect artists' work from unauthorized copying
• New software called Nightshade helps artists protect images from AI copying
• Nightshade can "poison" AI models by feeding them misleading data about images
• It builds on a previous tool, Glaze, which subtly modifies images to confuse AI models (a toy sketch of the general idea follows this list)
• The goal is to disrupt AI training, force companies to license data, and protect artists' rights
• It is unclear whether copying art for AI training is legal; lawsuits are underway as artists fight back
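The bullets only gesture at the mechanism, so here is a minimal toy sketch of the general idea behind such tools: nudge an image's pixels by an imperceptible amount so that a feature extractor maps it toward an unrelated "decoy" concept, which misleads any model later trained on it. This is not Nightshade's or Glaze's actual algorithm; the random linear "feature extractor", the 0.03 per-pixel budget, the decoy image, and all other parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a vision model's feature extractor: a fixed random linear map.
W = rng.normal(size=(64, 32 * 32 * 3)) / np.sqrt(32 * 32 * 3)
def features(img):
    return W @ img.ravel()

original = rng.uniform(0.0, 1.0, size=(32, 32, 3))  # the artist's image (random stand-in)
decoy = rng.uniform(0.0, 1.0, size=(32, 32, 3))     # an image of an unrelated concept
target = features(decoy)                             # features the perturbed image should imitate

delta = np.zeros_like(original)
eps, lr, steps = 0.03, 0.1, 200                      # assumed pixel budget, step size, iterations

for _ in range(steps):
    residual = features(original + delta) - target
    grad = (W.T @ residual).reshape(original.shape)  # gradient of 0.5 * ||features - target||^2
    delta -= lr * grad
    delta = np.clip(delta, -eps, eps)                # keep the per-pixel change nearly invisible
    delta = np.clip(original + delta, 0.0, 1.0) - original  # stay a valid image

poisoned = original + delta
print("max pixel change:", float(np.abs(delta).max()))
print("feature distance to decoy, before:", float(np.linalg.norm(features(original) - target)))
print("feature distance to decoy, after: ", float(np.linalg.norm(features(poisoned) - target)))
```

In a real system the extractor would be a trained vision model rather than a random linear map, but the principle is the same: a small, targeted change to the pixels shifts what a model "sees" in the image, and at scale such poisoned images can degrade or mislead training.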