Rise in Suspected AI-Generated Child Sexual Abuse Images Presents New Challenges
• Thousands of reports to NCMEC's CyberTipline last year involved suspected AI-generated child sexual abuse material (CSAM)
• Leading AI companies like OpenAI are now cooperating with NCMEC to track and flag apparent CSAM
• Deepfake nude photos of students are spreading in schools across the US
• Distinguishing AI-generated CSAM from images of real children is becoming increasingly difficult
• Reports of AI-generated CSAM are expected to grow as image-generation tools become more capable and widely available