Instagram under fire for allowing accounts that act as 'shop windows' for pedophiles
Instagram is in hot water after failing to remove accounts that serve as a source of sexualized content about children. These accounts routinely post pictures of children in swimwear or partial clothing. Even though Instagram users have flagged the inappropriate content many times via the app's in-built reporting tool, it remains on the photo-sharing platform.
Meta, Instagram's parent company, claims that it takes a zero-tolerance approach to child exploitation. Despite those claims, accounts reported by users are still deemed acceptable by the platform's automated technology. One such account, which posts photos of children in sexualized poses, was reported by a researcher, but Instagram merely sent an automated response stating that, due to a high volume of reports, the content had not been reviewed. The message added that Instagram's technology had not found the account to be in violation of its community guidelines.
Whether you want to hide your likes or turn off comments completely, your Instagram experience is what you make of it. ❤️
Learn more about our tools that help protect our community from abuse: https://t.co/09IrwsmFeh pic.twitter.com/0kj15DyNPJ
— Instagram (@instagram) August 12, 2021
The Online Safety Bill, intended to regulate social media firms and due to be debated in Parliament on April 19, states, "Self-regulation of online services has failed. Whilst the online world has revolutionized our lives and created benefits, underlying systems designed to service business models based on data harvesting and microtargeted advertising shape the way we experience it. Algorithms that are invisible to the public, decide what we see, hear and experience. The Online Safety Bill is a key step forward for democratic societies to bring accountability and responsibility to the internet."
"Protecting children is a key objective of the draft bill and our report. Our children have grown up with the internet and it can bring them many benefits. Too often, though, services are not designed with them in mind. We want all online services likely to be accessed by children to take proportionate steps to protect them. Extreme pornography is particularly prevalent online and far too many children encounter it—often unwittingly," it reads.
We’re defaulting people under 16 (18 in some countries) to private when they sign up, and encouraging people already on Instagram to switch to private. People can still change to public - but we think private is the best choice for most young people.
— Instagram Comms (@InstagramComms) July 27, 2021
Andy Burrows, head of online safety policy at the NSPCC, said the accounts acted as a "shop window" for pedophiles. "Companies should be proactively identifying this content and then removing it themselves. But even when it is reported to them, they are judging that it's not a threat to children and should remain on the site," he said, as per The Guardian. Lyn Swanson Kennedy of Collective Shout said the platforms were relying on external organizations to do their content moderation for them. "We are calling on platforms to address these very concerning activities which put underage girls particularly at serious risk of harassment, exploitation, and sexualization," she said.
Imran Ahmed, chief executive of the Center for Countering Digital Hate, stated, "Relying on automated detection, which we know cannot keep up with simple hate speech, let alone cunning, determined, child sex exploitation rings, is an abrogation of the fundamental duty to protect children." Meanwhile, Professor Sonia Livingstone, of the Department of Media and Communications at LSE, described pornography hosted on sites that do not carry user-to-user content, and are therefore not covered by the draft bill, as the "number one concern of children, and indeed many adults."
And if you see a post you don’t like in Search and Explore, you can choose to see fewer posts like it 👀
Tap the three dots above the post → tap “See Fewer Posts Like This”
— Instagram (@instagram) April 13, 2022
A spokesperson for Meta said the company has strict rules against content that sexually exploits or endangers children, adding, "We're also focused on preventing harm by banning suspicious profiles, restricting adults from messaging children they're not connected with, and defaulting under-18s to private accounts."