AI tools designed to generate child sex abuse material (CSAM) will be made illegal under “world leading” legislation, the government has announced.
The crackdown will also target anyone who possesses AI “paedophile manuals” which teach people how to use AI to sexually abuse children.
It comes after warnings that AI-generated child abuse imagery is being produced at a “chilling rate” and is “disturbingly realistic”.
It is already illegal to possess AI-generated CSAM, but the new laws will target the means of production.
This includes:
- Making it illegal to possess, create or distribute AI tools designed to generate CSAM, punishable by up to five years in prison.
- Making it illegal for anyone to possess AI “paedophile manuals” which teach people how to use AI to sexually abuse children, punishable by up to three years in prison.
Jess Phillips, the safeguarding minister, said Britain is the “first country in the world” to legislate for AI abuse imagery.
She said: “This is a global problem and is going to need global solutions. This government is leading the way on trying to clamp down on this horrendous crime.”
The Home Office said AI tools are being used to generate abuse images in a number of ways, including by “nudeifying” real-life images of children or by stitching the faces of other children onto existing child sexual abuse images.
The NSPCC said its Childline service has been hearing from distressed children who have found AI-generated images of themselves.
In one call, a 15-year-old girl told them: “A stranger online has made fake nudes of me. It looks so real, it’s my face and my room in the background. They must have taken the pictures from my Instagram and edited them. I’m so scared they will send them to my parents. The pictures are really convincing, and I don’t think they’d believe me that they’re fake.”