How social media fell out of love with QAnon: YouTube follows Twitter and FB in cracking down on far-right movement

YouTube said it would prohibit content that targets an individual or group with conspiracy theories. However, experts say too little was done, too late



YouTube on Thursday, October 15, became the latest social media platform to crack down on the QAnon conspiracy theory ahead of the November 3 election, though it stopped short of a full ban on the fast-spreading pro-Donald Trump movement.

In a blog post issued on Thursday, the video platform said it would “prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence,” citing QAnon and Pizzagate, a related conspiracy theory. YouTube also confirmed that it had removed “tens of thousands” of videos and “hundreds of channels” related to QAnon, whose adherents are convinced that President Trump is under threat from a satanic “deep state” cabal of Democrats and Hollywood celebrities involved in child trafficking.

A woman holds up a QAnon sign to the media as attendees wait for President Donald Trump to speak at a campaign rally at Atlantic Aviation on September 22, 2020, in Moon Township, Pennsylvania (Getty Images)

 

YouTube’s recommendation algorithms have often been criticized for drawing users toward radical content and conspiracy theories. In 2018, the platform faced accusations of pushing audiences “down the rabbit hole” of baseless content, and in response it revised its systems to limit the reach of harmful misinformation. The latest move follows similar steps by Facebook and Twitter to purge QAnon theories from their own platforms in recent months. In July, Twitter suspended thousands of QAnon-related accounts and said it would stop recommending content linked to the movement; Facebook announced plans to remove it from its platform soon after.

Even platforms like TikTok and Reddit have taken action against QAnon, Reuters reported. While a spokesperson for TikTok said QAnon content “frequently contains disinformation and hate speech” and that it has blocked dozens of its hashtags, a Reddit spokesperson said it has removed QAnon communities that have repeatedly violated its rules since 2018.

President Donald Trump (Getty Images)

What is QAnon?

Supporters of QAnon, a far-right conspiracy theory backing Trump, promote an interconnected set of beliefs based on anonymous web postings from “Q,” who claims to have inside knowledge of the current administration. A core tenet is that Trump is secretly combating a cabal of child-sex predators that includes, among others, prominent Democrats, Hollywood elites and “deep state” allies. QAnon was preceded by similar viral conspiracy theories such as Pizzagate, and it rose to prominence with an October 2017 post on the anonymous imageboard 4chan by “Q,” presumed to be an American individual; it is now thought likely that “Q” has become a group of people. QAnon borrows elements from the Pizzagate theory about a pedophile ring run out of a Washington DC restaurant and has grown into a “big tent” conspiracy theory that folds in misinformation on topics ranging from alien landings to vaccine safety. Pizzagate, which went viral during the 2016 presidential election, has since been debunked. QAnon followers, for their part, believe a so-called Great Awakening is coming to bring salvation.

How did QAnon spread so rapidly online?

The “Q” posts that started in 2017 now appear on 8kun, a rebranded version of the shuttered web board 8chan. QAnon has also been amplified on Twitter and Facebook, as well as Instagram and YouTube. Reuters also reported that the Institute for Strategic Dialogue (ISD) recently found that the number of users discussing QAnon on platforms such as Twitter and Facebook has risen this year, with membership of QAnon groups on Facebook shooting up by 120 percent in March.

According to researchers, organizations backed by the Russian government are playing a small but key role in amplifying such theories. QAnon supporters helped organize real-life demonstrations against child trafficking in August and took part in pro-police protests in Portland, a city that has seen widespread unrest in the months since George Floyd’s brutal death in police custody in Minneapolis, the report added. QAnon may also gain a foothold in the House: at least one GOP candidate poised to win on November 3 backs its beliefs, and Media Matters, a left-leaning nonprofit, has reported that at least 27 congressional candidates have endorsed QAnon or promoted content related to it.

On the issue of QAnon gaining a wide circulation on social media platforms, ISD researchers have claimed that about 20 percent of all QAnon-related Facebook posts featured YouTube links.

But can a ban alone stop QAnon’s run?

Experts believe too little was done, too late, to curb QAnon’s spread. In a July article, MIT Technology Review cited Brian Friedberg, a senior researcher at the Harvard Shorenstein Center’s Technology and Social Change Project, as saying that after studying the far-right movement closely, he believes it is “absolutely” too late for the mainstream social media platforms to stop QAnon. They could have done more, he said, but did not do enough over the last three years.

“They’ve had three years of almost unfettered access outside of certain platforms to develop and expand,” Friedberg was quoted as saying. The MIT article also noted that dismantling QAnon would require breaking the trust between the anonymous “Q” and their followers. But given that Q’s inaccurate predictions have not weakened that trust, the mission is difficult to accomplish. “...critical media coverage or deplatforming have yet to really do much on that front. If anything, they only fuel QAnon believers to assume they’re on to something,” the piece says.

According to the MIT Technology Review piece, the best ideas for limiting QAnon would require radical change and soul-searching from the people who run the companies that have facilitated it on their platforms. But after Twitter said in July that it would not automatically apply its new policies to political leaders who promote QAnon content, many of whom are running for office this fall, it is fair to say the problem is far from over.

Trump's soft stance on QAnon

President Trump has given every indication that he will not condemn the QAnon movement, even as some of its supporters found reason to cheer his positive coronavirus test. In August, he declined to disavow the movement, saying its followers opposed the violent protests and that he had heard “these are people who love our country.” When a reporter pointed out that the movement’s followers believe the president is fighting a satanic cult of pedophiles and cannibals, Trump shot back: “Is that supposed to be a bad thing?” He also conceded that he did not know much about QAnon’s followers other than that they like him very much, and that he appreciates them for that.

At a town hall he attended on Thursday, Trump was asked about QAnon by the moderator. He refused to denounce the group or debunk its theories, saying: “I know nothing about QAnon. … I know nothing about it. I do know they are very much against pedophilia.” He then asked moderator Savannah Guthrie why she had not pressed his presidential rival, Joe Biden, on his failure to denounce Antifa and “the Radical Left.”
