England’s Children’s Commissioner is urging immediate government action against AI-powered apps that generate sexually explicit images of minors. Dame Rachel de Souza warns that these “nudification” tools, which digitally remove clothing from photos of real children, are spreading unchecked with devastating consequences, particularly for young girls.
The disturbing technology has created a climate of fear among children, with many girls now avoiding posting photos online altogether. Recent data reveals a shocking 380% increase in reported cases of AI-generated child sexual abuse material in 2024 compared with the previous year. While current UK law bans sharing explicit deepfakes, Dame Rachel argues this does not go far enough and is calling for a complete prohibition on all apps capable of creating nude images of minors.
Government officials emphasize that creating or possessing AI-generated child abuse imagery is already illegal, and say they plan to introduce additional offenses specifically targeting AI tools designed for this purpose. However, child protection advocates insist more urgent action is needed as the technology rapidly outpaces legislation.
Key concerns include:
- Apps specifically designed to work on female bodies
- Easy accessibility through mainstream platforms
- Use by classmates and peers in school settings
- Lasting psychological harm to victims
The Internet Watch Foundation reports these AI-generated images are increasingly appearing in schools, where they quickly spiral out of control. Education leaders warn the technology is advancing faster than protective measures or student education about its dangers.
As regulators implement new online safety codes requiring stricter age verification for adult content, critics argue these measures prioritize corporate interests over child protection. The debate highlights the urgent challenge of balancing technological innovation with safeguarding vulnerable users in an increasingly digital world.