Microsoft Engineer Warned of AI Safety Issues, but Copilot Released Without Safeguards
-
Microsoft engineer Shane Jones warned that the Copilot Designer AI tool creates violent and sexual images, but Microsoft ignored his warnings and failed to add safeguards.
-
After Microsoft told him to remove a critical LinkedIn post, Jones raised alarms publicly, including in letters to lawmakers, stakeholders, and Microsoft's board.
-
Jones told the FTC that Microsoft has been aware of the issues since October but still markets the tool as safe for kids.
-
The tool can generate inappropriate sexualized images of women, as well as unfiltered content involving politics, drugs, and trademarks.
-
A Microsoft spokesperson did not confirm whether image filters would be added, saying only that company policies address employee concerns and that feedback improves safety.