Microsoft Employee Alleges Copilot Designer AI Produces Harmful Images, Despite Company Claims of Safety
-
Microsoft employee Shane Jones wrote a letter to the FTC about his concerns with Copilot Designer, Microsoft's AI image creator. He says it produces "harmful content" involving sexualization, violence, and bias, among other issues.
-
Jones claimed that over three months, Microsoft repeatedly denied his requests to remove Copilot Designer from public use until better safeguards could be put in place.
-
According to Jones, the tool can add "harmful content" to images even when given benign prompts, such as generating sexualized images from the term "car accident."
-
Jones wrote that while Microsoft markets Copilot Designer as safe for anyone, including children, the company internally knows the tool creates offensive, inappropriate images.
-
This comes after Jones previously published open letters urging OpenAI and lawmakers to address the public safety risks of AI image generators, posts that Microsoft made him delete.