Artificial Intelligence Child Sexual Abuse Images: Challenges for Justice in the Face of a Growing Trend

2024-08-27 03:30:00

These are just two cases, but the trend continues to grow and has already become a problem on a different level. What happened in Roca, where a group of teenagers was reported to a Río Negro court for using artificial intelligence (AI) to generate child sexual abuse images with the faces of their classmates, exposed a reality: it is already here.

On August 8 the news first broke in Roca, then spread throughout the province and even reached the national press. At least four teenagers used artificial intelligence to alter images of classmates and produce child sexual abuse photos, the legal classification applicable to these situations.
The incident created tensions at the school among the students themselves, parents and education authorities.

The Roca court is investigating the case under strict secrecy because everyone involved, both the creators of the images and the victims whose photos were tampered with, is a minor. One line of investigation is whether older relatives of the accused teenagers also took part in the operation, given the type of editing software used.

The case under investigation also includes a striking allegation: that these images were sold via Telegram and WhatsApp. The lawyer who filed the complaint on behalf of one Roca family even said the money raised was used to purchase chips for a virtual casino.

In this context, Argentina has no regulations specifically governing artificial intelligence. On June 10, National Deputy Juan Brügge introduced a bill in the Chamber of Deputies entitled “Legal Regime Applicable to the Responsible Use of Artificial Intelligence in the Argentine Republic.” The initiative aims to establish a legal framework for the ethical use of artificial intelligence. The bill has not yet been debated.

Complaint against Roca teenagers who used AI to create child sexual abuse photos: raids carried out

According to judicial sources at the Superior Court, currently “there are two cases in the Río Negro judiciary on this issue. To date, the expert analysis by the judiciary’s forensic technicians has not been completed.”

The metadata of the photos used, they explained, is the key to trying to reveal the alterations made to the images by artificial intelligence.

Finding the “photograph’s digital fingerprint”: what experts call the key to detecting AI-modified images

“Finding the photo’s fingerprint” is how sources consulted in the Río Negro judiciary described the task.
The Río Negro judiciary has two specialist bodies that deal with these cases. One is the Information Technology and Telecommunications Office, which reports to the Public Prosecutor’s Office and is responsible for the traceability of communications and the extraction of data from mobile phones.

The other is the Forensic Computing Directorate, part of the province’s Superior Court, which carries out computer and digital expert work.

The two agencies complement each other in judicial investigations, and in this specific Roca case they will be even more relevant, given the complexity of how the images were produced, the number of people involved, and the need to trace the photos’ trail between phones.

Because so few such cases have been solved in the country, AI-modified images are drawing particular attention, with ongoing consultations among forensic computer experts from different branches of the justice system in Argentina.

Photos: one of the defendants in the case is suspected of distributing child abuse images in Bariloche

Detection techniques used to determine how, and by whom, artificial intelligence was applied to images are constantly improving as new cases emerge.

One topic discussed at the forensic computer technicians’ meeting has two aspects. The first is when a real minor’s face appears in a child sexual abuse image; in that case, the justice system seeks to protect the violated minor.

But now there is another trend: creating a child sexual abuse image or scene entirely from scratch, which has sparked a debate that is not about forensic computing but about the law.

IWF alert over the number of images of minors

These are the issues the Internet Watch Foundation (IWF), an international organization fighting online child sexual abuse, has been addressing for two years.

The IWF warns that the use of artificial intelligence to manipulate images of minors around the world has resulted in a 360% increase in sexual photos of boys and girls aged 7 to 10 posted online.

The risk is also greater for girls: 50% of the AI-manipulated images the IWF reported in 2022 featured minors aged 11 to 13.

Artificial intelligence child sexual abuse videos become new focus for experts

The Internet Watch Foundation’s October 2023 report found more than 20,000 AI-generated images on a dark web forum within a single month, more than 3,000 of which depicted criminal activity involving child sexual abuse. “Since then, the problem has intensified and continues to grow,” the organization warned.

“AI-generated images of child sexual abuse are advancing so rapidly that the IWF is now seeing the first real-world examples of AI video depicting the sexual abuse of children,” the Internet Watch Foundation warns.

“The first AI CSAM (child sexual abuse material) videos are already circulating. Most are partially synthetic videos (partly real, partly artificial), although there are also some fully synthetic (completely artificial) videos,” the group warned.

The IWF is encountering an increasing amount of AI-generated content, including AI CSAM, on the open, legitimate web.

With image generation nearly “solved,” resources are being poured into the next frontier: video generation. “We can glimpse the future capabilities of these models in the 2024 previews of OpenAI’s Sora and Google’s Veo,” the report said of these companies’ upcoming AI video generators.

Urgent need for child safety

Against this background, there is an urgent need to design child safety into all stages of model development and distribution, across all players in the artificial intelligence ecosystem, the organization noted in its latest report.

The AI CSAM videos discovered during the research for this report’s update may have been created using freely available open-source tools. “They are the canary in the coal mine,” the warning signal once used in mines to detect the presence of toxic gases.

