AI-generated images of child sexual abuse use real victims as reference material

AI tools used to generate deepfake images of child sexual abuse draw on photos of real victims as reference material, a report has found.

The Internet Watch Foundation (IWF), which made the discovery, used the example of a girl it calls Olivia (not her real name), who was the victim of torture and rape between the ages of three and eight.

Olivia, now in her twenties, was rescued by police in 2013, but years later dark web users are using AI tools to create computer-generated images of her in new abusive situations after her abuser posted photos and videos of her online.

Paedophiles have recently been compiling collections of images of named victims, such as Olivia, and using them to fine-tune AI models to create new material, the IWF said.

Although possession of even AI-generated images of child sexual abuse is illegal, the tools used to generate them are both legal and free to use.


Some of the deepfake videos feature adult pornography which is altered to show a child’s face. Others are existing videos of child sexual abuse which have had another child’s face superimposed.

And because they are based on real images, the IWF found 90% of them were realistic enough to be assessed under the same law as actual child sexual abuse material (CSAM).

Other AI models have also been used to generate sexualised images of celebrity children, the report finds.

The IWF warned ‘hundreds of images can be spewed out at the click of a button’ and some have a ‘near flawless, photo-realistic quality’.

The report also underscores how fast the technology is improving in its ability to generate fully synthetic AI videos of CSAM. One ‘shocking’ 18-second fully AI-generated video, found by IWF analysts on the clear web, shows an adult male raping a girl who appears to be about 10 years old. The video flickers and glitches, but the abuse pictured in the clip is described by analysts as ‘clear and continuous’.

While these types of videos are not yet sophisticated enough to pass for real videos of child sexual abuse, analysts say this is the ‘worst’ that fully synthetic video will ever be. Advances in AI will soon render more lifelike videos in the same way that still images have become photo-realistic.

IWF chief executive Susie Hargreaves said: ‘We will be watching closely to see how industry, regulators and government respond to the threat, to ensure that the suffering of Olivia, and children like her, is not exacerbated, reimagined and recreated using AI tools.’


Offenders on the dark web forum investigated by the IWF openly discussed and shared advice on how to use generative AI technology to develop child sexual abuse imagery.

Step-by-step instructions are given for offenders to make their own ‘child porn’, and requests are made for CSAM models fine-tuned on particular, named victims or celebrities.

Deborah Denis, CEO of Lucy Faithfull Foundation, said: ‘Adults viewing and sharing sexual images of children is a major problem and one that AI is making worse. AI and its capabilities are rapidly evolving and there is an unacceptable lack of safeguards within the technology which allows online child sex offenders to exploit it every day. It’s vital that tech companies and politicians do more to address these dangers as a matter of urgency.

‘Our research shows there are serious knowledge gaps amongst the public regarding AI – specifically its ability to cause harm to children. The reality is that people are using this new, and unregulated, technology to create some of the worst sexual images of children, as well as so-called ‘nudified’ images of real children, including children who have been abused.

‘People must know that AI is not an emerging threat – it’s here, now. We need the public to be absolutely clear that making and viewing sexual images of under-18s, whether AI-generated or not, is illegal and causes very serious harm to real children across the world.’
