Has TikTok been recommending porn and highly sexualised content to children? A new report by a human rights group says the popular social media app did so on accounts the group created while posing as minors.
The app, which is owned by China’s ByteDance, has long been under the scanner from rights groups, parents and authorities over its possible effects on children. Now, the group claims that TikTok’s algorithm recommended explicit material to accounts it set up as minors earlier this year.
A US consortium is currently finalising a purchase of TikTok’s American operations from its Chinese parent ByteDance. The consortium, led by, among others, Oracle co-founder Larry Ellison, is set to gain ownership of the app. There is speculation that America will get a TikTok app separate from the rest of the world’s.
But what happened? What do we know?
Let’s take a closer look:
What happened?
The research was carried out by human rights campaign group Global Witness. Members of the group, which describes itself as an investigative, campaigning organisation, created accounts on TikTok between July and August.
Global Witness on its website said it usually investigates how Big Tech companies’ business models threaten democratic discourse, human rights and the climate crisis. While it does not usually research online harm to children, it claimed TikTok’s search bar kept offering up sexually explicit search terms during a previous investigation.
Global Witness said it immediately reported this to the company. TikTok responded, “We have reviewed the content you shared and taken action to remove several search recommendations globally.”
However, Global Witness later claimed that TikTok had not resolved the issue, so it launched a follow-up investigation. It did so after the new Online Safety Act (OSA) came into effect in the UK. The law aims to protect children from seeing pornography online, as well as content promoting other harms such as suicide, self-harm and eating disorders.
The law mandates that websites hosting pornographic content implement age-verification checks by July 25 to stop children from accessing them. Websites were told to verify users’ ages via a number of mechanisms, including credit card checks, photo IDs and even selfies.
The law calls for massive fines for websites failing to do so: $24 million (Rs 212 crore) or 10 per cent of their global revenue, whichever is higher. Executives could even be jailed if their websites do not comply with these terms.
How did they do it?
Global Witness researchers, posing as 13-year-olds, set up four accounts using false dates of birth. The group said the app did not ask for any additional information, nor did it seek to verify their details.
All the accounts were set up using factory-reset phones with no search histories. The researchers then activated the safety settings including ‘restricted mode’. TikTok says that this mode stops its users from seeing “mature or complex themes, such as… sexually suggestive content”.
However, the researchers still saw overtly sexualised search terms in the app’s ‘You May Like’ section. Following some of these terms brought up videos of women simulating masturbation.
Other content showed women flashing their underwear in public places or exposing their breasts. Some results were explicitly pornographic, including videos of penetrative sex. Researchers said these clips were embedded within otherwise innocuous videos in an apparent attempt to dodge content moderation.
Ava Lee of Global Witness told the BBC the findings came as a “huge shock” to researchers.
“TikTok isn’t just failing to prevent children from accessing inappropriate content – it’s suggesting it to them as soon as they create an account,” Lee claimed. “Everyone agrees that we should keep children safe online… Now it’s time for regulators to step in.”
Britain’s media regulator Ofcom is in charge of enforcing the law. It has previously warned that it could take non-compliant websites to court and block them from being accessed in the UK. Ofcom has opened probes into more than two dozen websites for potentially violating the OSA.
TikTok is one of the most popular social media apps in the world, particularly with young people. Around half of Gen Z users reportedly prefer TikTok to Google as a search tool.
Ofcom says its data shows that children aged eight to 17 spend between two and five hours online every day. The data also shows that almost all children over the age of 12 have mobile phones, and that nearly all of them visit platforms such as YouTube and TikTok to watch videos.
It said TikTok was most popular among eight to 11-year-olds who used social media, even though TikTok’s own rules say users must be at least 13 to use the platform. TikTok has said it has over 50 features designed to protect teens, and that nine out of 10 videos that violate its guidelines are removed before they are ever seen.
With inputs from agencies