
Instagram under fire for allowing accounts that act as 'shop windows' for pedophiles

Despite claims of a zero-tolerance approach to child exploitation, the dodgy accounts are seen as 'acceptable' by Instagram's automated technology
PUBLISHED APR 18, 2022
If anyone reports the questionable content, Instagram sends an automated response stating their technology did not find it against guidelines (Justin Sullivan/Getty Images)

Instagram is in hot water after failing to remove accounts that serve as a source of sexualized content about children. These accounts routinely post pictures of children in swimwear or partial clothing. Even though Instagram users have flagged the inappropriate content many times via the app's in-app reporting tool, it remains on the photo-sharing platform.

Meta, Instagram's parent company, claims that it takes a zero-tolerance approach to child exploitation. Despite those claims, accounts reported by users are still judged 'acceptable' by the platform's automated technology. One such account, which posts photos of children in sexualized poses, was reported by a researcher, but Instagram merely sent an automated response stating that due to a high volume of reports, the content had not been reviewed. The message said Instagram's technology did not find the account in violation of its community guidelines.


The Online Safety Bill, intended to regulate social media firms, will be debated in Parliament on April 19. It states, "Self-regulation of online services has failed. Whilst the online world has revolutionized our lives and created benefits, underlying systems designed to service business models based on data harvesting and microtargeted advertising shape the way we experience it. Algorithms that are invisible to the public decide what we see, hear and experience. The Online Safety Bill is a key step forward for democratic societies to bring accountability and responsibility to the internet."

"Protecting children is a key objective of the draft bill and our report. Our children have grown up with the internet and it can bring them many benefits. Too often, though, services are not designed with them in mind. We want all online services likely to be accessed by children to take proportionate steps to protect them. Extreme pornography is particularly prevalent online and far too many children encounter it—often unwittingly," it reads.



Andy Burrows, head of online safety policy at the NSPCC, said the accounts act as a "shop window" for pedophiles. "Companies should be proactively identifying this content and then removing it themselves. But even when it is reported to them, they are judging that it's not a threat to children and should remain on the site," he said, as per The Guardian. Lyn Swanson Kennedy of Collective Shout said the platforms were relying on external organizations to do their content moderation for them. "We are calling on platforms to address these very concerning activities, which put underage girls particularly at serious risk of harassment, exploitation, and sexualization," she said.

Imran Ahmed, chief executive of the Center for Countering Digital Hate, stated, "Relying on automated detection, which we know cannot keep up with simple hate speech, let alone cunning, determined, child sex exploitation rings, is an abrogation of the fundamental duty to protect children." Meanwhile, Professor Sonia Livingstone, from the Department of Media and Communications at LSE, said pornography on sites that do not host user-to-user content, and are therefore not covered by the draft bill, is the "number one concern of children, and indeed many adults."



A spokesperson for Meta said it had strict rules against content that sexually exploits or endangers children and added, "We're also focused on preventing harm by banning suspicious profiles, restricting adults from messaging children they're not connected with, and defaulting under-18s to private accounts."
