Anyone with access to the internet can produce convincing, lifelike images of almost anything using easily available apps and software. These ‘generative’ artificial intelligence tools can be downloaded or accessed online and used to turn a simple text prompt into new written, image and video content in a matter of moments.
Although AI tools carry benefits and can be used positively in daily life, they can also be used for harmful and illegal purposes, such as the creation of sexual imagery of children.
What is AI-generated child sexual abuse material?
Artificially generated child sexual abuse material describes images of child sexual abuse that are partially or entirely computer-generated. These images are usually produced using software that converts a text description into an image. The technology is developing rapidly: the images it creates can now be highly realistic, and recent examples are difficult to distinguish from unaltered photographs.
Many popular, publicly available artificial intelligence tools automatically block attempts to create abusive material, but the large number of child sexual abuse images made using these tools that have since been detected shows that individuals have found ways around these safeguards.
Some artificially generated child sexual abuse material depicts children who appear to be entirely artificial or fictional, and it is often assumed this means no child has been harmed or affected. But most tools rely on thousands of existing images to inform or ‘train’ them, so genuine images of individuals are likely to have been used as reference material.
There are tools and methods that allow users to deliberately make images or videos featuring the likenesses of real people. This means someone can create sexual abuse images and videos with children and young people as the subjects without those children being present, and without their knowledge or consent. Examples include the use of ‘deepfake’ tools, which can edit a child into existing explicit or abusive material, and apps which digitally remove the clothes of subjects in photographs.
Legality
It is illegal to create, view and share any sexual images of children under 18 produced using artificial intelligence. It is important to remember this applies to all material depicting child sexual abuse: it does not matter whether the material was created in a ‘conventional’ way using a camera or created using artificial intelligence tools.
How to respond
The skills professionals should use to respond are the same as in any case of child sexual abuse. You may not be certain how the images were created, or may worry that you lack specialist knowledge of harm in this context, but this shouldn’t change the overall protection and support response that you would follow for any concern about child sexual abuse.
Responding to children captured in AI-generated CSA material
Your response may differ depending on a child’s individual circumstances. But in all cases, you should refer concerns to children’s social care and/or the police if you’re concerned that a child or young person has been harmed or is at risk of immediate harm at any point.
The impact on victims of the circulation of child sexual abuse images can be as severe and varied as the abuse itself. In cases where the images are partially or entirely artificially generated, it’s important for professionals to be there to listen to children and young people, ask them what happened, believe them, and tell them how you can help.
Support for children and young people should always respond to their own individual situation and needs, which you can do regardless of the technology used to harm them.
Responding to adults who have created, viewed or shared AI-generated CSA material
Professionals should build their understanding of who has accessed this material and why, and of the risk they may pose to the children around them, and should consider what safety plans need to be put in place to keep everyone safe from harm.
Even where you suspect, or an adult has told you, that the images were created using artificial intelligence, it is still important to undertake an assessment of risk and, if this has not happened already, to report them to the police.
See the Centre of Expertise's guide on managing risk and trauma after online sexual offending for more information: https://www.csacentre.org.uk/research-resources/practice-resources/managing-risk-and-trauma-after-online-sexual-offending/
Responding to children who may have viewed or created artificially generated child sexual abuse material
Apply the same considerations as you would in other cases of harmful sexual behaviour.
First, refer them to children’s social care and/or the police if you’re concerned that a child has been or is at risk of immediate harm, and consider the impact on each child involved.
Many of the young people you will work with in these contexts will, with the right intervention at the right time, be unlikely to present a risk as adults. You may also meet some young people whose behaviour is particularly worrying, which may indicate that they have experienced harm or abuse of some form themselves. Their needs must be addressed in your professional response too, including, where relevant, with the family.
NSPCC (2024) Young people’s experiences of online sexual extortion or ‘sextortion’. London: NSPCC.
‘Sextortion’ is a type of online blackmail. It’s when criminals threaten to share sexual pictures, videos, or information about you unless you pay money or do something else you don’t want to do.
Anyone can be a victim of sextortion. However, young people aged between 15 and 17, and adults aged under 30, are often most at risk.
Criminals often target people through dating apps, social media, webcams, or pornography sites. They may use a fake identity to befriend you online. If a person you’ve just met online chats to you in a sexual way, or asks for sexual images, it might be an attempt at sextortion.
You should be wary if someone a young person has met online:
is trying to start a relationship with them very quickly (they may even send them a sexual image first)
chats to them in a sexual way, or asks for sexual images, soon after meeting them
has sent friend requests to lots of people, not just the young person
repeatedly asks them to do sexual things that they're not comfortable with
tells them they’ve hacked their account or have access to their contacts
Sextortion attempts can happen very quickly, or they can take place over a long period. A young person should not share sexual images or personal information if they feel uncomfortable. Young people can still be victims of sextortion even if they haven’t shared sexual images or information: criminals may have hacked one of their accounts, or created edited or fake images or videos of them, such as deepfakes, that appear real.
Even if blackmail isn't involved, sharing or threatening to share intimate photos or videos of someone without their permission is illegal. This is called 'revenge porn' or intimate image abuse.
You should support the young person to stop engaging with the individual if they feel uncomfortable, or if they are contacted online by someone they don’t know.
You could also help the young person to review their privacy settings. Criminals are less likely to target them if they can’t see who their friends and family are.
If you are working with a victim of sextortion, or are worried that a young person is being targeted, reassure them that it is never their fault: they are not to blame and have done nothing wrong.
Advise the young person to stop all communication with the offender immediately. They should also never pay the perpetrator, as there is no guarantee that this will stop the threats.
If the young person wants their account removed, they should deactivate it rather than delete it. This will make it easier for the police to find evidence.
The young person does not have to gather 'evidence' like screenshots, text messages, videos or photos before reporting to the police. All they need to do is to tell the police what happened.
If the young person has any screenshots, text messages, videos or photos, details like these can be useful to the police. It is also helpful if they can put these details in a timeline.
Even if the photo or video is no longer available, it should still be reported to the police, as they may be able to recover the image.
Useful information if the young person has it:
sender's name
contact details, including email addresses and phone numbers
usernames
bank account details
user IDs
which platform, app, social media site, or online space they were using
when it happened
details of what was sent, written, or spoken in a voice note or video
You should only screenshot an intimate image if you are sure the person is over 18. If you are unsure, don’t screenshot, record, or share it with anyone, even the police. It could be a criminal offence.
You can report these crimes online.
Young people under 18 can report sextortion, or any other form of online child sexual abuse, to the National Crime Agency’s Child Exploitation and Online Protection (CEOP) Safety Centre.
The report will be sent directly to the police control room, where it will be reviewed by the same team who answer calls.
Visit a police station
If the young person wants to speak to an officer in person, the police will provide a safe and comfortable environment at any police station.
Childline can help young people to get shared images removed from the internet.
Resources for children and young people
Report Remove allows you to confidentially report sexual images and videos of yourself and get them removed from the internet.
Childline gives free, confidential support for young people under the age of 19.
Young Minds is a mental health charity for children and young people, offering text support 24 hours a day, seven days a week.
Child Exploitation and Online Protection Command (CEOP) Education gives advice for parents, children, and young people on staying safe from sexual abuse and online grooming.
Young people’s experiences of online sexual extortion or ‘sextortion’
This briefing shares children and young people’s experiences of so-called ‘sextortion’, a form of online blackmail that involves the threat of sharing nude or semi-nude images or videos to extort money or force someone to do something against their will.
Available at: https://learning.nspcc.org.uk/media/ylbobz5i/young-people-experiences-sextortion.pdf
Adults may use online spaces to have sexual conversations with children; view, download or distribute sexual images of children; order someone to perform sexual abuse on a child in front of a webcam; communicate with a child with the intention of performing an offence in person later on; or incite a child to pose naked or perform sexual acts via photo, video or live webcam. Such abuse can occur both on the ‘dark web’ and, more commonly, on ‘open web’ platforms such as Snapchat, Instagram, Facebook, Facebook Messenger, X (formerly Twitter) and WhatsApp.
The dark web is a part of the internet that is not accessible via search engines and requires specific configuration, software, or authorisation to access, allowing users and website operators to remain anonymous or untraceable.
A study of individuals searching for child sexual abuse imagery on the dark web found that the self-reported likelihood of seeking direct contact with children after viewing that imagery was higher among those who viewed the imagery more frequently, were older when they started viewing it, viewed imagery depicting toddlers and infants, and/or had been in contact with other viewers of child sexual abuse imagery (Insoll et al, 2022).
The Internet Watch Foundation (IWF) has identified a significant and growing threat in which AI technology is being exploited to produce child sexual abuse material (CSAM). On a single dark web forum, the IWF identified 3,152 AI-generated CSAM images in a one-month period.
The IWF also identified an anonymous webpage containing links to optimised AI models featuring the likenesses of 128 different named victims of CSA.
Forums exist on the dark web where people discuss and share real child sexual abuse material. One forum user posted:
“Wanted to know if someone could point me in the right direction to learn, download the software, etc.”
Forum users go on to discuss how to train AI models to depict named victims or celebrities, share models they have trained and images they have generated, and request new ones.
For more information:
Internet Watch Foundation: What has changed in the AI child sexual abuse material landscape? Available at: https://www.iwf.org.uk/media/nadlcb1z/iwf-ai-csam-report_update-public-jul24v13.pdf
Centre of Expertise: Key messages from research on child sexual abuse by adults in online contexts. Available at: https://www.csacentre.org.uk/app/uploads/2023/10/Key-messages-from-research-on-child-sexual-abuse-by-adults-in-online-contexts-ENGLISH.pdf