Tech giant Facebook took to Twitter late on Saturday, 16 March, to say that it had removed 1.5 million videos globally of the New Zealand mosque attack in the first 24 hours after the incident.
"In the first 24 hours, we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload...," Facebook said.
The company also said that it was removing all edited versions of the video, even those that do not show graphic content, out of respect for the people affected by the mosque shooting and the concerns of local authorities.
The suspect in the shootings, 28-year-old white supremacist Brenton Harrison Tarrant, had posted a rambling 74-page anti-immigrant manifesto online before the attacks and had apparently used a helmet-mounted camera to livestream the slaughter.
Several more people were killed in an attack on a second mosque in the city a short time later, taking the total death toll to 50.
Facebook, Twitter and Google scrambled to take down the video, which remained widely available on social media for hours after the bloodbath.
Five Indians were among the 50 killed in the terror attacks.
(With inputs from AP)