Instagram’s algorithms connect a vast network of pedophiles, an investigation by The Wall Street Journal, Stanford University, and the University of Massachusetts Amherst discovered.
An investigation by the Stanford Internet Observatory’s chief technologist, David Thiel, research manager Renée DiResta, and director Alex Stamos found that a massive network of English-speaking social media accounts buys, sells, and shares child sexual abuse material (CSAM) across several social media networks.
Some child pornography is created by children or teenagers trying to earn money, impress a date, or appease a blackmailer. The report calls this content self-generated child sexual abuse material (SG-CSAM).
Pedophile accounts find seller accounts through hashtags referencing pedophilia, the Stanford report reads. Some of these references are obvious, like hashtags that use variations of words like “pedo.”
Other references use emoji, mentions of pedophile code words, or other sly references, the report added.
Social media algorithms effectively served as the pimps that connected pedophiles to children, the report’s findings suggest. Their recommendations and suggestions pushed pedophilic content into the hands of those who desired it.
Although Instagram is the most important platform for pedophile networks, other sites, including Twitter, Telegram, Discord, Snapchat, TikTok, and Mastodon, also connect pedophiles to their victims and to each other, the report states.
Social media algorithms have created a system wherein thousands of pedophiles buy pornographic images, pornographic videos, online video meetings, and even in-person meetings with children and teens, the report concludes.
“In recent years, the creation and distribution of SG-CSAM has increasingly become a commercial venture. This commercialization often replicates the pattern of legitimate independent adult content production,” the report says.
How It Works
According to the report, the social media accounts where children produce porn resemble porn sites like OnlyFans. Victims create “menus” of sex acts that customers can ask for. Customers pay for the sex acts through gift-card swaps, exchange platforms like G2G, and other websites that allow anonymous payment, the report concludes.
On social media, many of the “seller” accounts claim ages between 13 and 17 years old, the report notes. But these children often sell sexual pictures of themselves taken at far younger ages, it adds.
“Menus” can include sexually explicit videos, sexually explicit pictures, videos of self-harm, videos of sex acts with animals, videos from when the children were younger, and paid sexual acts, the report says.
Sellers use simple tricks to outsmart social media algorithms, the report notes. They reverse ages like “16” to “61,” partially blot out words about sexual content, and use code words like “menu,” the report adds.
The report estimates that there are between 500 and 1,000 seller accounts and thousands of followers.
Helpful Algorithms
Pedophiles and sellers didn’t have to look hard to find each other because algorithms did the work for them, the report suggests.
Searches for most keywords involving child porn return results on Instagram’s website, the report said.
For some terms, Instagram displays a warning message, the report reads.
“These results may contain images of child sexual abuse,” the message reads. “Child sexual abuse or viewing sexual imagery of children can lead to imprisonment and other severe consequences. This abuse causes extreme harm to children and searching and viewing such materials adds to that harm. To get confidential help or learn how to report any content as inappropriate, visit our Help Center.”
At the bottom, Instagram’s message offers users two options: “Get resources” is one. “See results anyway” is the other.
“Due to the widespread use of hashtags, relatively long life of seller accounts and, especially, the effective recommendation algorithm, Instagram serves as the key discovery mechanism for this specific community of buyers and sellers,” the report states.
Meta owns both Instagram and Facebook.
Companies Respond
A Meta representative told The Epoch Times that the company works “aggressively” to fight child porn on all its platforms.
“Predators constantly change their tactics in their pursuit to harm children, and that’s why we have strict policies and technology to prevent them from finding or interacting with teens on our apps, and hire specialist teams who focus on understanding their evolving behaviors so we can eliminate abusive networks,” the representative said.
From 2020 to 2022, Meta’s teams dismantled 27 abusive networks, the Meta representative said. In January 2023, Meta disabled over 490,000 accounts that violated child safety policies, the representative added.
“We fixed a technical issue that unexpectedly prevented certain user reports from reaching content reviewers, we provided updated guidance to our content reviewers to more easily identify and remove predatory accounts, and we restricted thousands of additional search terms and hashtags on Instagram. We’re committed to continuing our work to protect teens, obstruct criminals, and support law enforcement in bringing them to justice,” said the representative.
Children also create and sell child porn on Twitter, the report adds. Although Twitter does a better job of taking it down, users who view some seller accounts still get recommendations for other seller accounts, the report notes.
Furthermore, Twitter allows nudity, which makes it easier for child porn to appear on the platform, the report says. Sometimes, accounts posted multiple images known to be child porn before Twitter removed them, the report notes.
After being notified of the problem, Twitter largely fixed it, the report says.
When contacted, Twitter’s press line auto-replied with a poop emoji.
Commercialized child porn groups on communications apps Discord and Telegram had thousands of members, the report noted.
According to the report, Telegram’s official policies don’t address child porn in private chats, the sexualization of children, or grooming. Discord’s policies don’t address the sexualization of children, the report adds.
Telegram spokesman Remi Vaughn told The Epoch Times that the site has worked hard to moderate child abuse content on the public parts of the app.
“In the case of child abuse content, we publish daily reports about our efforts at t.me/stopca. At time of writing, nearly 10,000 groups, channels and bots have been removed this month,” Vaughn said.
“These Telegram and Discord groups had hundreds or thousands of users; some appeared to be managed by individual sellers, though there were also multi-seller groups,” the report added.
The Epoch Times reached out to Discord, but received no comment.
On Snapchat, pedophiles can connect directly to children through peer-to-peer communication features, the report notes.
The Epoch Times reached out to Snapchat, but received no comment.
TikTok did a better job of limiting pedophilic content, though even there problems existed, the report noted.
“The fact that TikTok is far more oriented around content recommendations instead of hashtag-based discovery or friend recommendations also makes it harder for users to discover specific types of material intentionally,” the report reads.
Mastodon and Facebook also played smaller roles in pedophile networks. Mastodon lacks direct messages for user communication, and Facebook has a “real name” policy that prevents pedophiles from remaining anonymous, the report notes.
Stop Abusers or Stop Business
According to Jon Uhler, a psychologist with 15 years of experience working with over 4,000 sexual predators, this story shows how creatively predators pursue children. “Their dark creativity knows no bounds,” he said.
He added that society ignores how sexually explicit content lowers children’s defenses and opens the door to sexual predation.
“Anybody dealing with child development understands if you introduce highly sexualized content that is above their developmental level to process and understand, then you set them up to be easy prey,” Uhler said. “Because they don’t have the intuitive sense of evil intent.”
Furthermore, men become predatory by watching large amounts of porn, Uhler said.
“Deviance starts with lust and then objectification. And then it gets into power and control and degradation, and eventually the desire to have a negative effect,” Uhler said.
He added that sexual deviance works differently in men than in women. Men are far more vulnerable to going down this path, he said.
“The female sex offenders and the male sex offenders are different by nature, in terms of the nature of their offense,” said Uhler. “You will never see any female who has been arrested for a sex offense that used objects on her victims.”
Uhler said that social media companies should make stopping predators their priority, and that they are smart enough to do so.
“You guys are capable, really capable. If you are not dedicated to the protection of children, then close your site down,” he said to social media companies. “If you’re going to build one of these things, you know predators are coming.”