Technology

YouTube tightens rules on conspiracy videos, but stops short of banning QAnon

YouTube CEO Susan Wojcicki speaks during the opening keynote address at the Google I/O 2017 Conference at Shoreline Amphitheater on May 17, 2017 in Mountain View, California.

Justin Sullivan | Getty Images

Google’s YouTube is updating its hate speech policy to ban videos that target individuals or groups with conspiracy theories that have been used to incite violence, such as QAnon.

“Today we’re further expanding both our hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence,” a spokesperson said in a statement to CNBC. “One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate.”

For example, YouTube would take down videos falsely accusing Democratic presidential candidate Joe Biden of being involved in pedophilia, a baseless claim that’s related to the QAnon theory and that’s been promoted by prominent Republican leaders, including Donald Trump Jr.

The change stops short of a total ban on QAnon, which echoes YouTube CEO Susan Wojcicki’s stance in a recent CNN interview. YouTube told CNBC that it doesn’t consider QAnon a single entity that can be banned outright, but a sprawling movement whose content is sometimes interwoven with truth, falls into gray areas or circulates under other names.

Called a potential source of domestic terrorism by the FBI, QAnon is a baseless conspiracy theory that claims President Donald Trump is engaged in a secret battle to stop a global pedophile ring involving many prominent celebrities and Democrats. Recent QAnon posts have also spread false information about voting and Covid-19, including claims that the president faked his diagnosis in order to orchestrate secret arrests.

The policy update comes as social media companies face pressure to contain misinformation — especially misinformation amplified by the rise of QAnon ahead of the U.S. elections and during the Covid-19 pandemic. Facebook earlier this week said it would ban all QAnon groups as dangerous, and Twitter has also cracked down on QAnon-related content.

YouTube added that it has removed tens of thousands of QAnon videos and terminated hundreds of channels under existing policies, including those that threaten violence.

Earlier this year, the company updated its “harmful and dangerous” policy to begin removing content that contains Covid-19 misinformation, such as claims that 5G causes the coronavirus or that masks “activate” the virus.
