Zuckerberg's 'callous' response to child sex abuse material on FB infuriates shareholders: 'We can't fire him'
Mark Zuckerberg is more concerned with enhancing the privacy of Facebook and other social media users than with helping curb the growth of child sex abuse material (CSAM) on the networking platform. This has infuriated other shareholders of the company, but its shareholding structure has rendered them helpless.
Michael Passoff, founder and CEO of Proxy Impact, a shareholder advocacy service, told MEA WorldWide that although Facebook has become the world’s number one hub of reported CSAM, with the social media giant accounting for 94% of such reported content in 2019 alone (15.8 million cases), Zuckerberg, who controls around 60 percent of voting shares in the company he founded, has put little thought or effort into bringing down the alarming statistics.
In a recent shareholders meeting, a proposal urging the company to "assess the risk of increased child sexual exploitation from the company's plans for end-to-end encryption" was shot down, Passoff said, mainly because Zuckerberg is "Chairman and CEO so there is no independent board oversight of his performance."
"Facebook has duel stock classes and Zuckerberg's Class B stock is worth 10 votes per share while everyone else's shares are one share one vote. This gives him control of 57% of the shares which means that he can never lose a shareholder vote. The result is that neither the board or shareholders can override his decisions or fire him," Passoff said.
He added: "For example, our shareholder proposal got 12.6% of the vote. I have been filing shareholder resolutions for 25 years and 12% is usually enough to prompt most companies to take some action on the shareholder's area of concern. But in this case our 12.6% is actually equal to 43% of votes not controlled by Zuckerberg and it is very rare for a company to ignore 43% of shareholders... yet the company has failed to even speak with the resolution co-filers despite our requests for dialogue."
Meanwhile, Zuckerberg remains fixated on the idea that complete privacy is the future of social media, despite the legitimate concern that Facebook’s rush toward end-to-end encryption in Messenger and Instagram "seems to ignore the overwhelming threat to children’s privacy and safety" and the fact that it "will provide child predators cover that will exponentially expand their outreach and the number of victims."
Despite being aware of the risks to child safety inherent in his bid to increase privacy on Facebook and its related platforms, Zuckerberg's response has been "callous" at best. “Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things. When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion," the tech mogul wrote in a March 2019 blog post.
Answering an employee's question on the issue in October last year, he acknowledged that losing access to the content of messages means "you’re fighting that battle with at least a hand tied behind your back" when it comes to child abuse. His lackluster interest in fighting child predators who use his platform to prey on children has "only fueled the criticism of Zuckerberg further and led many to believe that nothing will change as long as he has complete control."
Passoff said that the increase in reported CSAM on Facebook and the company's other platforms was directly proportional to the growth in the apps' popularity. "Facebook’s other platforms include WhatsApp with 2 billion users, Facebook Messenger with 1.3 billion users, and Instagram topping 1 billion users. These four social media platforms alone account for nearly half of the world’s monthly social media use... Facebook comprises the bulk of reported images because it scans more actively for them than any other company," he said.
But even then, the social media giant has largely been ineffective in countering the growing problem. "Facebook has not seemed able to keep pace with the sheer volume of content on its various platforms to be scrutinized. Facebook searches rely on artificial intelligence that generally detects previously identified images but has trouble detecting new images, videos and live streaming. Human confirmation is typically needed and most of Facebook’s reports go to non-profit groups such as the National Center for Missing and Exploited Children that are overwhelmed with the avalanche of material they are sent," Passoff said.
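To illustrate the detection gap Passoff describes, the sketch below shows the general shape of known-image matching: a fingerprint of each upload is checked against a database of hashes of previously identified material, so a brand-new image has nothing to match against. This is a simplified, assumption-laden sketch, not Facebook's actual pipeline; real systems use perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, whereas the cryptographic hash used here for brevity only matches byte-identical files.

```python
import hashlib

# Hypothetical database of fingerprints of previously identified images.
# Real deployments hold millions of perceptual hashes, not SHA-256 digests.
KNOWN_IMAGE_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    # SHA-256 stands in for a perceptual hash; it only matches
    # byte-identical copies, which exaggerates the weakness but
    # keeps the sketch self-contained and runnable.
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_image(image_bytes: bytes) -> bool:
    # Lookup succeeds only for previously identified material;
    # new images, edited copies, video, and live streams all fall
    # through to (overwhelmed) human review, as described above.
    return fingerprint(image_bytes) in KNOWN_IMAGE_HASHES
```

End-to-end encryption would remove the plaintext that this kind of scanning depends on, which is why critics argue the encryption plans and CSAM detection are in direct tension.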
The global pandemic has only aggravated the issue. With much of the company's largely outsourced content moderation workforce told to stop working during stay-at-home orders around the world, CSAM went undetected on the platform.
Compounding the problem, schools in over 180 countries sent children home to continue their education on Internet-connected devices, leading to kids "flocking to social media to connect with friends and strangers during social isolation. As parents struggle with juggling work or lack thereof and risks and stress from the pandemic, it left many children unsupervised online. Many of those children use WhatsApp and Facebook Messenger to connect, likely increasing the chances that Facebook’s many platforms are facilitating the connections between child pedophiles and unsuspecting children."
As a result, Passoff stressed that it is more important than ever for Zuckerberg to take accountability and focus on making social media safe for children. "Governments, law enforcement agencies and child protection organizations have harshly criticized Facebook’s planned encryption claiming that it will cloak the actions of child predators and make children more vulnerable to sexual abuse. Pending legislation in Congress could make Facebook legally liable for CSAM. The company is facing increasing regulatory, reputational and legal risk due to this issue," he said.