Instagram helps connect and promote network of pedophile accounts and enables people to search for explicit hashtags, new research shows

  • An investigation found that Instagram guides those who express interest in child pornography to sellers
  • Hashtags like #pedobait and variations of #mnsfw (meaning ‘minor not safe for work’) had been used to tag thousands of posts
  • Other social media platforms did not seem to have that same problem 

Instagram is helping pedophiles connect with one another and buy child pornography, a damning new report suggests.

An investigation conducted by the Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst found that the popular social media site guides those who express interest in child pornography to sellers.

Some of the accounts claimed to be run by the children themselves, using overtly sexual handles incorporating the words ‘little slut for you.’

Several of these sellers even let users commission their own works featuring children of different ages engaged in specific sexual acts, while others listed ‘menu’ prices for videos of children harming themselves and ‘imagery of the minor performing sexual acts with animals,’ researchers at the Stanford Internet Observatory found.

At the right price, sellers would even allow users to ‘meet up’ with the children in question.

Meta, the parent company of Instagram, said it is now setting up an internal task force to address the issues, while acknowledging its failures to enforce its own policies.

The Stanford researchers set up test accounts that viewed a single account using terms suggesting it was involved in child sex abuse, and were immediately recommended accounts of purported child-sex content sellers and buyers, as well as accounts linking to off-platform trading sites.

Other accounts in the network would aggregate pro-pedophile memes or discuss their access to children.

Researchers also used hashtags associated with underage sex, like #pedowhore and #preteensex, to find 405 sellers of what they called ‘self-generated’ child sex materials or accounts apparently run by children themselves — some of which said they were as young as 12.

According to data gathered by Maltego, a network mapping software, 112 of these sellers collectively had 22,000 unique followers. 

Current and former Meta employees who worked on Instagram’s child-safety initiatives now estimate that the number of accounts that exist solely to follow child pornography sellers is in the hundreds of thousands, if not in the millions.

‘Instagram is an on-ramp to places on the Internet where there’s more explicit child sexual abuse,’ Brian Levine, the director of the University of Massachusetts Rescue Lab, which researches online child victimization and builds tools to combat it, told the Journal.

Other social media platforms did not seem to have that same problem, the Stanford team noted in its report Tuesday.

The team said it found 128 accounts offering to sell child sex abuse material on Twitter, less than one-third of the number it found on Instagram.

Twitter also did not recommend such sellers to the same degree as Instagram, and took them down more quickly.

TikTok’s platform, meanwhile, is one where ‘this type of content does not appear to proliferate’ and Snapchat is primarily used as a messaging system.

But executives at Instagram have said its internal statistics show that users see child exploitation in less than one in 10,000 posts.

The company uses a database from the National Center for Missing and Exploited Children to detect known child sex abuse images and videos.

In 2022, Meta accounted for 85 percent of the 31.9 million reports of child pornography the center received, including some 5 million found on Instagram alone.

But the company cannot detect new child pornography images or videos on its sites, or efforts to advertise their sale.

It also struggles more than other social media companies to prevent pedophile networks from spreading due to its weak enforcement and design features that promote content discovery, according to experts.

‘Instagram’s problem comes down to content-discovery features, the way topics are recommended and how much the platform relies on search and links between accounts,’ said David Thiel, the chief technologist at Stanford’s Internet Observatory.

‘You have to put guardrails in place for something that growth-intensive to still be nominally safe, and Instagram hasn’t.’

He noted that hashtags are a central part of content discovery on Instagram, and #pedobait and variations of #mnsfw (meaning ‘minor not safe for work’) had been used to tag thousands of posts dedicated to advertising sex content featuring children — making them easily accessible to prospective buyers.

And in many cases, Instagram allowed users to search for terms that its own algorithms knew might be associated with illicit and illegal material.

In those cases, a pop-up screen would appear, warning users that ‘These results may contain images of child sexual abuse’ and noting that the production and consumption of such materials causes ‘extreme harm’ to children.

But the screen would offer users two choices: ‘Get resources’ or ‘See results anyway.’

Instagram said it has now removed the option to see the results, but declined to say why it offered the option in the first place.

On Instagram, child pornography accounts brazenly identify themselves as ‘seller’ or ‘s3ller,’ with many stating their preferred form of payment in their bios.

The sellers would often convey the child’s purported age by saying they were on ‘chapter 14’ or ‘age 31’ followed by a reverse-arrow emoji.

Some would also show signs that they were engaged in child sex trafficking, such as one displaying a teen with the word ‘W****’ written across her face.

And many of the accounts show users with cutting scars on the inside of their arms or on their thighs, and a number mention past sexual abuse.

When users would try to report these accounts, the Journal reports, Instagram often would not take action.

Sarah Adams, a Canadian mother of two, told how she built an Instagram following discussing child exploitation and the dangers of social media.

Her followers would sometimes send her disturbing things they found on the site, she said, and in February, one messaged her an account branded with the term ‘incest toddlers.’

Adams said she accessed the account — which primarily posts pro-incest memes and has more than 10,000 followers — just long enough to report it to Instagram.

But soon her followers noticed that when they viewed her profile, they were being recommended the ‘incest toddlers’ page.

Earlier this year, an unnamed anti-pedophilia activist also found an account claiming to belong to a young girl selling underage sex content, with one post declaring: ‘This teen is ready for you pervs.’

After reporting the account, he said he received a message from Instagram reading: ‘Because of the high volume of reports we receive, our team hasn’t been able to review this post.’

He then decided to report another post on the account, this time showing a scantily-clad young girl with a sexually graphic caption. But Instagram responded by saying: ‘Our review team has found that [the account’s] post does not go against our Community Guidelines.’

In a statement to the Journal, a spokesperson for Meta said the company had reviewed how it handles reports of child sex abuse and found that a software glitch was preventing a large number of user reports from being processed, and that its moderation staff was not properly enforcing the rules.

The company said it has since fixed the bug, and is providing new training to its content moderators.

‘Child exploitation is a horrific crime,’ a spokesperson for Meta said in the statement, adding ‘We’re continuously investigating ways to actively defend against this behavior.’

The spokesperson also noted that over the past two years, the company has taken down 27 pedophile networks and is in the process of taking down more.

And, the spokesperson said, it has blocked thousands of hashtags that sexualize minors and restricted its system from recommending users search for terms known to be associated with sex abuse.

Despite these efforts, Stanford researchers found Instagram easily allows pedophiles to evade enforcement by simply creating another account and listing the handle of their ‘back-up account’ on their profiles.

They also found that after the company decided to crack down on links from a specific encrypted file-transfer service known for transmitting child-sex content, Instagram blocked searches for its name.

But the company’s autofill feature continued to recommend that users try variations of the name, including those with the words ‘boys’ or ‘CP’ (short for ‘child pornography’) added to the end.
