YouTube executives ignored warnings of 'toxic' content, wanted high user engagement

YouTube employees warned higher-ups about a "mass of false, incendiary and toxic content" over the last few years.

As the world still reels from the horrifying massacre at a Christchurch mosque last month, questions are being raised as to how such an event found its way onto social media channels. The shooting was live-streamed for 17 minutes before being shut down, and the clip instantly went viral.


YouTube, Facebook, and other tech giants have said that they are trying their best to control the spread of extremism on their respective platforms. However, a new report has some troubling insight.


As per Bloomberg, top YouTube executives ignored warnings of toxic videos on the platform and let them run rampant. The report claims that the executives were more concerned with viewer engagement than with stopping hateful, extremist, or conspiracy-related content.

The report notes that scores of YouTube employees warned higher-ups about a "mass of false, incendiary and toxic content" over the last few years. They also suggested ways to curb the spread of such content, but their suggestions were dismissed for the sake of viewer engagement.

Susan Wojcicki, CEO of YouTube, is said to be "inattentive to these issues," and the company is said to "prioritize engagement above all else," according to Bloomberg. YouTube's recommendation AI also allows such fake news to spread and flourish, the report said.

However, a YouTube spokesperson has said that the company has spent the last two years working on solutions to its content problems.

In an emailed statement to Bloomberg, the YouTube spokesperson said: "We’ve taken a number of significant steps, including updating our recommendations system to prevent the spread of harmful misinformation, improving the news experience on YouTube, bringing the number of people focused on content issues across Google to 10,000, investing in machine learning to be able to more quickly find and remove violative content, and reviewing and updating our policies — we made more than 30 policy updates in 2018 alone. And this is not the end: responsibility remains our number one priority."

Even so, an Engadget report has said that, a few years ago, a privacy engineer's suggestion not to recommend videos that were on the edge of YouTube's policies was rejected. It was, however, implemented this January.

With the Indian General Elections fast approaching, it is imperative for platforms such as YouTube to keep fake news in check.
