
Facebook already removed 1.5 million copies of New Zealand attack video – CNN


Facebook said Sunday that it had removed 1.5 million videos of the New Zealand attacks worldwide. It blocked 1.2 million of those videos at upload, meaning they were never seen by users. Facebook did not say how many people had seen the remaining 300,000 videos.

New Zealand police alerted Facebook to the livestream, and Facebook said it quickly removed the shooter’s Facebook and Instagram accounts and the video. Facebook also said it was “removing all edited versions of the video that do not show graphic content,” as well as praise or support for the shooting.

Facebook’s failure to catch the video before police flagged it comes amid repeated pledges from the company to better moderate content on its platform. Facebook (FB), Twitter (TWTR) and other tech giants are under intense scrutiny over how their platforms are used to spread misinformation, hate speech and graphic content.

In response, Facebook says it has hired tens of thousands of human moderators and is investing in artificial intelligence to help police its sites.

“We continue to work around the clock to remove violating content using a combination of technology and people,” said Mia Garlick, a spokesperson for Facebook New Zealand.

US Senator Mark Warner, who sits on a committee that has questioned the social media companies, said Friday that it wasn’t just Facebook that needed to be held accountable.


“The rapid and wide-scale dissemination of this hateful content – live-streamed on Facebook, uploaded on YouTube and amplified on Reddit – shows how easily the largest platforms can still be misused. It is ever clearer that YouTube, in particular, has yet to grapple with the role it has played in facilitating radicalization and recruitment,” Warner said in a statement provided to CNN.

Other companies also pledged to closely monitor their platforms in the wake of the attacks.

YouTube did not say how many videos of the attacks it had removed, but said on Twitter Thursday that it was “working vigilantly to remove any violent footage.”

Twitter posted a statement online Friday: “We are continuously monitoring and removing any content that depicts the tragedy, and will continue to do so in line with the Twitter Rules.”
