Bing AI Blocks Images of Women, Revealing Training Bias
-
Bing AI refuses to generate photorealistic images of women, deeming them "unsafe" under its content policy, while it readily generates equivalent images of men.
-
Reddit users have discovered that Bing AI blocks many prompts containing the word "female," even innocuous ones like "female lion." The AI appears to associate "female" with sexualization.
-
The AI's training data likely includes many sexualized images of women scraped from across the internet. Refusing to generate images of women at all may be an attempt to avoid producing sexual content.
-
Microsoft calls the blocking an "overcorrection": the company tightened its guardrails after incidents such as AI-generated terrorism imagery, and says it is working to improve the filter's accuracy.
-
The bias highlights the problems of training AI models on internet data full of skewed and sexualized depictions. Microsoft should ensure fair representation before releasing such systems to the public.