X Hires Moderators, Opens Safety Center After AI Taylor Swift Images Spread
- X plans to hire 100 content moderators to crack down on child sexual exploitation and explicit content after AI-generated images of Taylor Swift circulated.
- The moderators will work in a new "trust and safety center" in Austin, Texas, to enforce content and safety rules.
- The move comes after X and Musk were criticized for reduced trust and safety staffing and an increase in antisemitic content.
- Sexually explicit fake images of Swift, made using AI image generators, spread rapidly online last week.
- Swift's fanbase reported abusive accounts and flooded the site with positive images of her, prompting X to block some searches for her name.