New Zealand mosque shooting: Why couldn't tech companies stop the video from going viral?

Silicon Valley was unable to completely remove the content from its platforms. But why?


On 15 March 2019, the world witnessed the darkest side of social media when a shooter opened fire in a mosque in Christchurch, New Zealand, and live-streamed his entire 17-minute rampage on Facebook. The shooter had apparently made his intentions known on 8Chan and Twitter before he went on the killing spree. Yet the tech companies watched helplessly as video of the massacre spread across social media like wildfire.


A sign is seen after Friday's mosque attacks outside a community center near Masjid Al Noor in Christchurch. Reuters

Silicon Valley was unsuccessful in removing the content completely from its platforms, even as it touts next-gen AI and machine learning designed to stop this very thing from happening. How did the tech companies, in which we have entrusted so much of our personal information and on whom we count to check the spread of extremist content, fail so miserably when the time for action came?

To recall, this isn't the first time social media has been used to live-stream a mass shooting. The news reporters who were shot and killed on camera in Virginia and the mass shooting in Dallas both found their way onto social media platforms, and as was the case on 15 March, the tech companies were a day late and a dollar short.

YouTube, which publicises itself as being at the forefront of taking down copyrighted content on its platform, was unable to remove videos that contained at least a part of the massacre's footage.

As per The Verge, although exact re-uploads of the massacre video are taken down, videos which contain only a part of the clip are sent to human moderators, so that news videos containing the clips aren't removed in the process.

The report also states that YouTube's algorithms, which instantly remove terrorism-related and child pornography content, are not being used in this case because of the content's possible 'newsworthiness'.
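To make that distinction concrete, here is a minimal, hypothetical sketch of exact-match detection, assuming a simple blocklist of file fingerprints (this is illustrative, not YouTube's actual pipeline): a byte-identical re-upload is caught automatically, but any trim, crop or re-encode produces an entirely different hash, which is why partial clips end up in front of human moderators instead.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Cryptographic fingerprint of the exact bytes of an upload."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for the binary contents of a known banned video (assumption).
original = b"\x00\x01 binary contents of the banned video \x02"
banned_hashes = {file_hash(original)}

def check_upload(data: bytes) -> str:
    if file_hash(data) in banned_hashes:
        return "auto-remove"        # byte-identical re-upload
    return "send to human review"   # everything else, including partial clips

print(check_upload(original))       # auto-remove
print(check_upload(original[10:]))  # send to human review: a trimmed clip
                                    # hashes to a completely different value
```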

Facebook uses a tool called PhotoDNA, originally developed by Microsoft, to remove child pornography content, with Google having developed its own API-based version of similar technology for its own usage. However, as with YouTube, the rules change when an extremist event is live-streamed to an audience or parts of the clip are used by news websites.
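Tools like PhotoDNA take a different approach from exact file hashing: they compute a perceptual fingerprint from the content itself, so small edits flip only a few bits, and a match is declared when a new fingerprint is close enough to a known one. The toy average-hash below is a loose stand-in for the proprietary PhotoDNA algorithm; the pixel values and the distance threshold are illustrative assumptions.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean. Real robust hashes (PhotoDNA,
    YouTube's Content ID) are far more sophisticated."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# Fingerprint of a known banned image, kept on file (illustrative values).
known_bad = average_hash([10, 200, 30, 220, 15, 210, 25, 230])

# A re-encoded copy arrives with slightly shifted pixel values...
candidate = average_hash([12, 198, 33, 217, 14, 215, 22, 228])

THRESHOLD = 2  # illustrative; real systems tune this against false positives
if hamming(known_bad, candidate) <= THRESHOLD:
    print("match: flag for removal")  # ...and still matches the fingerprint
else:
    print("no match")
```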

Motherboard has found that once a live video has been flagged on Facebook, moderators have the ability to “ignore it, delete it, check back in on it again in five minutes”. These moderators are told to look for specific actions such as “crying, pleading, begging” and also “display or sound of guns or other weapons (knives, swords) in any context.” Keeping this in mind, it is unclear how the Christchurch shooter was allowed to stream for a full 17 minutes before the footage was cut.

A Facebook spokesperson told Wired: “Since the attack happened, teams from across Facebook have been working around the clock to respond to reports and block content, proactively identifying content which violates our standards and to support first responders and law enforcement.”

Google's New Zealand spokesperson, apart from offering condolences to the victims of the shooting, also said that clips of the massacre which are of newsworthy interest will continue to be shown on Google's platforms. That leaves Google with the additional task of deciding which videos should be flagged and which can stay up.

Rasty Turek, CEO of Pex, a video analytics platform that is working with YouTube on a tool to identify re-uploaded or stolen content, told The Verge that it is nearly impossible to stop live streams as they happen, since the content is always changing. “You can blame YouTube for many things, but no one on this planet can fix live-streaming right now,” he said.
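A rough sketch of the problem Turek describes, under the assumption that a platform screens streams by fingerprinting frames against a database of known content (the function names and workflow here are hypothetical, not Pex's or YouTube's implementation): a never-before-seen broadcast matches nothing, so frame-level matching alone can never cut it off, and the system has to fall back on slower signals like user reports and human review.

```python
# Fingerprints of previously identified violent content. For a brand-new
# atrocity this set is, by definition, empty.
known_fingerprints = set()

def fingerprint(frame: bytes) -> str:
    # Stand-in for a perceptual hash of a single video frame (assumption).
    return frame.decode(errors="ignore")[:16]

def moderate_live_stream(frames):
    for i, frame in enumerate(frames):
        if fingerprint(frame) in known_fingerprints:
            return f"cut stream at frame {i}"
        # Novel content matches nothing; the only remaining signals are
        # user reports, classifiers and human review, which take minutes.
    return "stream ended unflagged"

live_frames = (f"frame-{i}".encode() for i in range(1000))
print(moderate_live_stream(live_frames))  # stream ended unflagged
```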

“The pressure [of removing content] never used to be as high,” Turek said in his interview with The Verge. “There is no harm to a society when copyrighted things or leaks aren’t taken down immediately. There is harm to society here though.”

So what is the solution to the problem? As the world engages more and more on social media, it becomes pertinent for the big tech giants to find a way to stop horrific incidents like the Christchurch mass shooting from spreading through their channels. While it may be comparatively easy to take down uploaded videos, live streams still pose a problem, and the tech giants will need to find a solution sooner rather than later.

 
