In the aftermath of Election Day, YouTube has been under fire for spreading misinformation as videos falsely claiming victory for President Donald Trump make their way onto the platform.
The Google-owned website has been criticized for refusing to take down videos by One America News, a far-right news organization. On Wednesday, the outlet posted a video in which anchor Christina Bobb baselessly alleges Democrats are “tossing Republican ballots, harvesting fake ballots, and delaying the results to create confusion.”
On Thursday, OAN doubled down with another video, falsely claiming Democrats are “trying to steal the battleground states.” YouTube said it will no longer show ads on either video, taking away their ability to generate revenue. Both videos carry information panels reading, “Results may not be final,” which appear with all election-related videos and search results.
Days after the election, ballots are still being counted in states including Arizona and Pennsylvania. Despite the false claims in the OAN videos, YouTube said they don’t violate the platform’s rules.
“Our Community Guidelines prohibit content misleading viewers about voting, for example content aiming to mislead voters about the time, place, means or eligibility requirements for voting, or false claims that could materially discourage voting. The content of this video doesn’t rise to that level,” said Ivy Choi, a YouTube spokesperson.
OAN didn’t respond to a request for comment.
YouTube’s stance underscores the limits of its policing of misinformation. While the world’s largest video site has rules against promoting voter suppression, it has no comparable protections against false claims of victory in what has been the most contentious US election in recent history. The platform’s only safeguard on that front is its information panels noting that results may not be final, which link to real-time tabulations by the Associated Press.
Tech giants have spent months game-planning how they’d handle election misinformation. Silicon Valley companies including Google, Facebook and Twitter have been eager to prove they could avoid the mistakes of 2016, when the election was marred by Russian agents who exploited big tech platforms.
In September, YouTube said it would show people information panels on mail-in voting when they watch videos that discuss the subject. The ballot-casting method has become fraught with misinformation as Trump has tried to discredit the process, while providing no evidence of security flaws in the time-tested system. Last month, the company banned some videos pushing false conspiracies such as QAnon, which claims Satan-worshipping cannibals and pedophiles are trying to take down the president. YouTube pledged to remove content that “targets an individual or group with conspiracy theories that have been used to justify real-world violence.”
YouTube faced challenges right from Election Day. The company took down multiple livestreams hours before the polls closed anywhere in the country, but not before the videos were already viewed by thousands of people.