AI Struggles to Accurately and Sensitively Depict Marginalized Groups
- AI models like DALL-E 3 and Google's Gemini alter image prompts to be more inclusive, but can overcorrect, producing inaccurate depictions (e.g., Black Nazis)
- AI relies on stereotypes and past data, which limits its ability to capture the fluidity of human identity, a particular concern for marginalized groups
- OpenAI's text-to-video tool Sora produced messy but impressive videos of queer people, though it struggled with nonbinary representation
- The AI-generated videos lacked diversity within the queer community (in age, body type, and disability)
- Even if representations improve, AI-synthesized depictions of marginalized groups could still have unintended consequences by taking control of portrayal away from the communities themselves