Google Pauses AI Image Program After Inaccurate and Offensive Outputs
- Google halted its Gemini AI image feature from generating pictures of people after the tool responded to prompts with historically inaccurate images.
- The halt came after published screenshots showed the program creating images such as people of color in Nazi uniforms when prompted to "generate an image of a 1943 German Soldier."
- A purported Google employee posted an example of an inaccurate image on X, saying, "I’ve never been so embarrassed to work for a company."
- In a blog post, Google's Prabhakar Raghavan said the program was designed to avoid "traps" but didn't account for "cases that should clearly not show a range."
- The halt follows other controversies over AI-generated media, including fake explicit images of Taylor Swift circulating online and an AI-generated deepfake of President Biden's voice used in campaign robocalls.