New Zealand terror attack: 200 watched live-stream of massacre, video was viewed 4,000 times before it was taken down
In the week since 50 people died in the New Zealand mosque terror attacks, social media giants like Facebook and YouTube have faced global criticism for regulations on content that are not strict enough, and a slow response to their platforms being used by perpetrators of violence.
The tech companies faced backlash after the gunman who opened fire at two mosques in Christchurch used a GoPro camera to live-stream 17 minutes of the massacre on Facebook, and it was hours before the video completely vanished from cyberspace. The three social media giants, Facebook, Twitter and Google, scrambled to take down the video, but not before several copies had been made and circulated on the internet.
On Monday, Facebook released an "update" on the incident and the technical causes behind the delay in action on the company's part.
In a statement, the Mark Zuckerberg-led company said it was making efforts to counter hate speech and "the threat of terrorism". "We remain shocked and saddened by this tragedy and are committed to working with leaders in New Zealand, other governments, and across the technology industry to help counter hate speech and the threat of terrorism."
Facebook also said it was "working around the clock" to prevent hateful content from finding space on the platform, "using a combination of technology and people". The company outlined the steps it had taken in cooperation with the New Zealand Police to "support their investigation":
- The company claimed that the video was viewed fewer than 200 times during the live broadcast, and that the stream went unnoticed because no user reported it while the broadcast was underway. In total, the video was viewed about 4,000 times before it was removed from Facebook, including views during the live broadcast.
- The first report against the video was lodged 29 minutes after the stream had started, and 12 minutes after the live broadcast had ended. A copy of the video was also posted on 8chan, an anonymous message board, by an unidentified user before Facebook took the original down.
- The accounts of the "named suspect" on Facebook and Instagram were taken down, and the company said it was "actively" trying to identify "imposter accounts".
- "In the first 24 hours, we removed about 1.5 million videos of the attack globally. More than 1.2 million of those videos were blocked at upload, and were therefore prevented from being seen on our services," the statement said.
- The company also said that it was working in collaboration with member organisations of the Global Internet Forum to Counter Terrorism (GIFCT).
- Facebook also said that some variants of the video, like screen recordings, were harder to detect, so additional detection systems like the use of audio technology were used to tackle the problem.
- Facebook said it was also tracking how such content might "migrate" from other platforms in a bid to tackle it.
The statement also said, "We removed the attacker’s video within minutes of their outreach to us, and in the aftermath, we have been providing an on-the-ground resource for law enforcement authorities. We will continue to support them in every way we can. In light of the active investigation, police have asked us not to share certain details of what happened on Facebook."
However, publications like TechCrunch have pointed out that, despite the claim that 1.5 million videos of the attack were removed within 24 hours, Facebook's own figures imply a failure rate of at least 20 percent in detecting such content at upload.
According to the report, Facebook did not explain why 300,000 videos were not caught at upload. Moreover, the number shared by Facebook reportedly covers only the videos the company is aware of; many copies were reportedly still being found on Facebook more than 12 hours after the attack.
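For reference, the 20 percent figure follows directly from the numbers in Facebook's statement; a minimal sketch of the arithmetic (the variable names are ours):

```python
# Figures from Facebook's statement: 1.5 million videos removed in the
# first 24 hours, of which 1.2 million were blocked at upload.
total_removed = 1_500_000
blocked_at_upload = 1_200_000

# Videos that slipped past upload detection and had to be removed later.
missed_at_upload = total_removed - blocked_at_upload  # 300,000

# Share of known uploads that were not caught at upload time.
failure_rate = missed_at_upload / total_removed  # 0.2, i.e. 20 percent

print(missed_at_upload, f"{failure_rate:.0%}")
```

Since the 1.5 million total counts only uploads Facebook detected at all, 20 percent is a floor, not the full failure rate.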
New Zealand Prime Minister Jacinda Ardern said Facebook's chief operating officer Sheryl Sandberg had sent condolences over the shootings at the two mosques that killed 50 people, part of which was live-streamed on the social media platform.
With inputs from agencies