AI-generated child sex imagery has every US attorney general calling for action
- All 50 US state attorneys general sent a letter to Congress raising concerns about AI being used to generate child sexual abuse material (CSAM).
- They are calling for an expert commission to study the issue and for laws against CSAM to be expanded to cover AI-generated content.
- Open source AI models like Stable Diffusion make it easy for anyone to generate explicit AI imagery, including imagery of children.
- These models can be run locally with no safeguards, alarming prosecutors about the potential for abuse.
- The AGs worry that AI tools can create deepfakes that overlay children's faces onto abuse images, generating new CSAM.