• Pornographic deepfake images of Taylor Swift are spreading online, with her fans working to get them removed and flood sites with more positive images.
• Researchers tracked dozens of explicit AI-generated images of Swift on platforms like X and Facebook.
• Research shows such fake images disproportionately target and harm women. Swift has previously mobilized her fanbase to push back against perceived wrongdoing.
• Platforms like X say they prohibit nonconsensual nude images and are working to remove the content, but their moderation capacity is limited.
• Lawmakers point to the incident as evidence that stronger protections against deepfakes are needed; even though the images are fake, their harms are real.