Facebook said Sunday that it removed 1.5 million videos of the New Zealand attacks around the globe. It blocked 1.2 million of the videos at upload, meaning they would not have been seen by users. Facebook did not say how many people had seen the remaining 300,000 videos.
New Zealand police alerted Facebook to the livestream, and Facebook said it quickly removed the shooter’s Facebook and Instagram accounts and the video. Facebook also said it was “removing all edited versions of the video that do not show graphic content,” as well as praise or support for the shooting.
In response, Facebook said it has hired tens of thousands of human moderators and is investing in artificial intelligence to help police its platforms.
“We continue to work around the clock to remove violating content using a combination of technology and people,” said Mia Garlick, spokesperson for Facebook New Zealand.
US Senator Mark Warner, who sits on a committee that has questioned the social media companies, said on Friday that it wasn’t just Facebook that needed to be held accountable.
“The rapid and wide-scale dissemination of this hateful content – live-streamed on Facebook, uploaded on YouTube and amplified on Reddit – shows how easily the largest platforms can still be misused. It is ever clearer that YouTube, in particular, has yet to grapple with the role it has played in facilitating radicalization and recruitment,” Warner said in a statement provided to CNN.
Other companies pledged to closely monitor their platforms in the wake of the attacks.