The Internet Watch Foundation (IWF) has published a news story on the increase in artificial intelligence (AI) generated child sexual abuse material (CSAM) being found on publicly accessible areas of the internet. It discusses: the prevalence of AI-generated CSAM online; the difficulty for agencies in knowing whether a real child is being harmed; and the law relating to AI-generated CSAM.
In addition, Internet Matters has published a report on children's experiences of nude deepfakes: AI-generated or manipulated images or videos made to look real. Drawing on a survey of 2,000 parents and 1,000 children, the report estimates that 13% of UK teenagers have had an experience with a nude deepfake, including sending or receiving one, encountering a nude deepfake online, or using a 'nudifying' app. The report recommends a ban on 'nudifying' tools and calls for reforms to the school curriculum so that children are taught to use AI technology responsibly.