LEBANON, N.H./CHRISTCHURCH (Reuters) – Global leaders criticized social media giants Facebook, Twitter, and Google over their handling of extremist content on their platforms on Friday, after video footage of mass shootings at two mosques in New Zealand was live streamed and widely shared online.
An injured person is loaded into an ambulance following a shooting at the Al Noor mosque in Christchurch, New Zealand, March 15, 2019. REUTERS/SNPA/Martin Hunter
Footage of the attacks, which left 49 dead in New Zealand’s worst-ever mass shooting, was broadcast live on Facebook and then reshared by users on other platforms.
Hours after the attack, copies of the video were still available on Facebook, Twitter and Alphabet Inc’s YouTube, as well as Facebook-owned Instagram and WhatsApp.
Democratic U.S. Senators Cory Booker and Mark Warner criticized the companies as being too slow to take down the footage.
“Tech companies have a responsibility to do the morally right thing. I don’t care about your profits,” Booker, who is running for president, said at a campaign event in New Hampshire. “This is a case where you’re giving a platform for hate. That’s unacceptable, it should have never happened, and it should have been taken down a lot more swiftly.”
Warner highlighted the speed and scope of how the material was shared. “The rapid and wide-scale dissemination of this hateful content – live-streamed on Facebook, uploaded on YouTube and amplified on Reddit – shows how easily the largest platforms can still be misused,” Warner said in a statement. “It is ever clearer that YouTube, in particular, has yet to grapple with the role it has played in facilitating radicalization.”

Facebook, Twitter and YouTube all said they were taking action to remove the videos.
“Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video,” Facebook tweeted. “We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.”
Twitter said it had “rigorous processes and a dedicated team in place for managing exigent and emergency situations” such as this. “We also cooperate with law enforcement to facilitate their investigations as required,” it said.
YouTube said: “Please know we are working vigilantly to remove any violent footage.”
The videos show the gunman driving to one mosque, entering and shooting randomly at people inside. Reuters was unable to confirm the authenticity of the footage.
Former New Zealand Prime Minister Helen Clark said in televised remarks that social media platforms had been slow to shut down hate speech. “What’s going on here?” she said, referring to the shooter’s ability to livestream for 17 minutes. “I think this will add to all the calls around the world for more effective regulation of social media platforms,” she added.
Britain’s interior minister also spoke out.
“You really need to do more @YouTube @Google @facebook @Twitter to stop violent extremism being promoted on your platforms,” Interior Minister Sajid Javid wrote on Twitter. “Take some ownership. Enough is enough.”
Live-streaming services have become a central component of social media companies’ growth strategies in recent years, but they are also increasingly exploited by some users to broadcast offensive and violent content.
In 2017, a father in Thailand broadcast himself killing his daughter on Facebook Live. After more than a day, and 370,000 views, Facebook removed the video. That same year, a video of a man shooting and killing another in Cleveland, Ohio, also shocked viewers.
Reporting by Joseph Ax in New Hampshire and Charlotte Greenfield in Christchurch, New Zealand; Additional reporting by Diane Bartz in Washington; Writing by Peter Henderson, Miyoung Kim and Jack Stubbs; Editing by Nick Macfie, Toby Chopra and Leslie Adler