Video-sharing app TikTok is becoming a breeding ground for online predators who have been sending explicit and sexual messages to young children on the platform, according to a BBC Trending investigation.
Hundreds of such explicit comments have been found on videos posted on the platform by children as young as nine.
The company deleted most of these comments after they were reported. However, many of the users who had posted them remained on the platform, despite TikTok having strict rules against sexual messages directed at children.
The investigation was carried out in the UK.
How Serious Is the Problem?
TikTok is one of the most downloaded video-sharing apps and has more than 500 million monthly active users around the world.
The app allows users to create and post short videos of themselves lip-syncing to songs and movie dialogues while completing small challenges.
During the three-month investigation, BBC Trending came across many sexual comments and even used the in-app tool to report them to TikTok.
The app’s community guidelines state that it does not allow “public posts or private messages to harass underage users”. It adds that if the company becomes “aware of content that sexually exploits, targets, or endangers children” it may choose to “alert law enforcement or report cases”.
In India, the Madras High Court has also asked the Centre to ban TikTok, saying the app spoils the future of the youth and the minds of children.
While a majority of the sexual comments were taken off the platform within 24 hours of being reported, the app failed to remove the accounts that had posted content considered inappropriate for children.
Further into the investigation, many other accounts were found on TikTok sending similar messages to young children and posting explicit comments on their videos.
The report also mentions that many users prefer to hide behind anonymous profiles while posting comments. However, many others use their real names and pictures, and even upload their own videos, while posting replies and comments.
The investigation also found instances where some of the children received threatening or violent messages.
Some Kids on the Platform Too Young
The investigation found that some of the children on the platform are too young, some even under the age of 13.
Earlier this year, TikTok was hit with a heavy fine of 5.7 million dollars (approximately Rs 39 crore) in the US for illegally collecting data of children under the age of 13.
This happened despite the app's age restriction, which requires users to be at least 13 years old.
The BBC Trending investigation found many users under the age of 13 still using the platform. TikTok has not yet been able to verify users' ages. However, it is looking at using AI and facial recognition technology to help curb the issue.
How TikTok Responded
In response, TikTok released a statement saying:
“We have a dedicated and growing team of human moderators to manually cross review tens of thousands of videos and accounts, and we constantly roll out internal trainings and processes to improve moderation accuracy and efficiency. While these protections won’t catch all instances of misuse, we’re committed to improving and enhancing our protective measures, and we use learnings like these to continually hone our moderation efforts.”
TikTok Statement
Additionally, it said it welcomed media and third-party suggestions to improve the platform. The company also mentioned that it uses a combination of technology and human moderation to remove content. However, it declined to comment on how many moderators it employs.
— with inputs from BBC Trending