tech2 News Staff | Feb 27, 2019 14:43:51 IST
In a shocking revelation, YouTube Kids, the video app that curates select content for children, is now being questioned over how safe the 'child-friendly' version of the online video platform really is for little ones.
While the app is supposed to be a safe hub for kids, parents have reported disturbing videos on it that are said to introduce children to the concept of suicide.

As reported by The Washington Post, Free Hess, a mom and Florida-based paediatrician, raised the alarm about disturbing videos after a friend noticed one on the YouTube Kids app. Hess claimed that she found clips on YouTube and YouTube Kids that showed inappropriate content and gave children instructions on how to commit suicide.
She specifically pointed to a video that began as an 'innocent cartoon' but was interrupted by a nine-second clip of a man giving advice on how to slit one's wrists. The video has since been taken down from the platform.
The suicide clip is on #youtubekids as well. @YouTube you need to do better. #YouTubeWakeUp #ProtectOurKids #ParentsDemandAction #thisisnotok pic.twitter.com/YdNqsEcxtD
— PediMom, Dr. Free N. Hess (@thepedimom) February 24, 2019
"My research has led me into a horrifying world where people create cartoons glorifying dangerous topics and scenarios such as self-harm, suicide, sexual exploitation, trafficking, domestic violence, sexual abuse and gun violence which includes a simulated school shooting. All of these videos were found on YouTube Kids, a platform that advertises itself to be a safe place for children 8 years old and under," Hess wrote on her PediMom blog.
Besides the disturbing video on suicide, Hess shared a few other videos from YouTube Kids that glorified human trafficking, domestic violence and 'simulated school shootings.' Notably, seven months earlier Hess had shared a complaint from another mother who had found a similar pattern in videos hosted on the YouTube Kids app.
YouTube's inability to police its own content is now as bad as Facebook's inability to manage user privacy. Just last week, a damning report revealed that YouTube was home to a massive paedophile network involving hundreds of channels and, potentially, thousands of members. The horrifying report resulted in advertisers pulling their ads from the platform and in YouTube shutting down over 400 channels.
YouTube Kids was launched as a platform specifically for children, and Google promised that it would feature only "hand-picked", "whitelisted" content that was kid-friendly. Clearly, Google has simply left this job to its algorithms. As far back as 2017, there were reports of beloved characters like Peppa Pig and Spider-Man being portrayed in violent and lewd videos on the YouTube Kids platform.
YouTube's response to these reports is downright insulting: the company attempts to disclaim responsibility by pointing to the sheer volume of content that needs policing.
Speaking to The Washington Post, YouTube said it is working to ensure that the platform is “not used to encourage dangerous behaviour and we have strict policies that prohibit videos which promote self-harm.” The spokesperson also added that “Every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views. We are always working to improve our systems and to remove violative content more quickly, which is why we report our progress in a quarterly report [transparencyreport.google.com] and give users a dashboard showing the status of videos they’ve flagged to us.”
While it is certainly true that the sheer volume of content on the platform poses a nigh-insurmountable challenge, a platform that is specifically designed for children must be held to a higher standard. It is either a safe place for kids or it isn't. Would you send your kids to a school where there's even a 1 percent chance of it being frequented by paedophiles? When it comes to the safety of children, there is no middle ground, and if Google can't appreciate that, it has no right to be hosting or promoting a platform that targets children.