AFP, published on Thursday, June 08, 2023 at 10:05 p.m.
Instagram, a subsidiary of Meta, is the main platform used by pedophile networks to promote and sell content showing sexual assault on minors, according to a report by Stanford University and the Wall Street Journal (WSJ).
“Large networks of accounts, which give the appearance of being operated by minors, openly promote the sale of” child pornography content, researchers from the Cyber Policy Center at the prestigious Silicon Valley university said on Wednesday.
“Instagram is currently the most important platform for these networks, with features like content recommendation algorithms and messaging that help sellers connect with buyers,” they added.
Neither the pedophiles nor these networks need much ingenuity.
According to the WSJ, a simple search for keywords such as #pedowhore or #preteensex leads to accounts that use these terms to advertise content showing sexual assaults on minors.
Often, these profiles “claim to be run by the children themselves and use overtly sexual pseudonyms with words like ‘little slut for you,’” the article details.
The accounts don’t directly say they’re selling these images, but they feature menus of options, in some cases including requests for specific sex acts.
Stanford researchers also spotted offers for videos involving bestiality and self-harm. “For a certain price, children are available for in-person ‘meetings,’” the article continues.
The report highlights the role played by the popular social network’s algorithms: a test account created by the business daily was “inundated with content that sexualizes children” after clicking on just a few such recommendations.
– “Task force” –
“The exploitation of children is a horrifying crime,” said Meta, contacted by AFP. “We are working hard to fight this scourge and rid our platforms of it, and to help law enforcement arrest and prosecute the criminals responsible.”
The social media giant says it has created an “internal task force” to investigate the report’s findings and respond accordingly.
It also points to the many measures already in place, such as the recruitment of specialists, to fight against “predators who constantly change tactics”.
“Between 2020 and 2022, dedicated teams took down 27 networks that were committing abuse and in January 2023 we disabled more than 490,000 accounts that violated our child safety rules,” said a spokesperson.
The spokesperson added that technical issues with reporting problematic profiles and content have been fixed and that “thousands of additional terms and keywords have been restricted on Instagram.”
In all, by the end of 2022, more than 34 million pieces of content relating to the sexual exploitation of minors had been removed from Facebook and Instagram, of which more than 98% were detected before being reported by users, according to Meta.
Last March, pension and investment funds filed a complaint against the company for having “turned a blind eye” to human trafficking and child sexual abuse on its platforms.
Instagram is also regularly accused by advocacy groups and authorities of not doing enough to protect children against the risks of harassment, addiction and self-image problems.