AI-generated sexualised images of children are a form of child sexual abuse and must be criminalised, a UN agency has said, warning of a rapid and alarming rise in the misuse of AI tools to create abusive content.
In a statement, the United Nations Children’s Fund (UNICEF), the agency responsible for providing humanitarian and developmental aid to children worldwide, said it was increasingly concerned by reports of a surge in AI-generated sexualised images, including cases where real photographs of children are manipulated and sexualised through deepfake technology.
UNICEF said deepfakes are being used to produce sexualised content involving children, with AI tools digitally removing or altering clothing to fabricate nude or sexual images.
The UN agency called on governments worldwide to expand legal definitions of child sexual abuse material to explicitly include AI-generated content and to criminalise its creation, possession, procurement and distribution.
The agency also urged AI developers to implement robust safeguards, and called on digital platforms to prevent the circulation of such material rather than removing it only after abuse has occurred. It said stronger content moderation and investment in detection technologies were essential to ensure the immediate removal of abusive material.