TikTok will soon be safer for minors to use.
The popular social media platform recently announced that it is working on a new system to organize content based on thematic maturity and protect minors from consuming content they shouldn't see in the first place.
TikTok's efforts to make its platform safer for minors follow numerous lawsuits against the company for promoting dangerous content linked to the deaths of several minors, as well as calls from children's rights and digital rights advocacy organizations for better protection of minors.
TikTok 'Content Levels' System Details
TikTok mentioned in its announcement that it is currently working on a new system to organize the content on its platform based on TikTok's unique approach to thematic maturity.
Thematic maturity is a concept that will be familiar to people in the film, television, and gaming industries. Think of it as the ratings films and games receive to advise parents or users about the content they can expect from such media.
As part of its efforts to make its platform safer for minors to use, TikTok will gradually introduce an early version of its "Content Levels" system to help prevent content with "overtly mature themes" from reaching audiences between the ages of 13 and 17.
Keep in mind that minors must be at least 13 years old to use TikTok, according to TikTok's Safety Resources for Parents, Guardians, and Caregivers.
This new Content Levels system works like the rating systems used in the film and video game industries. When TikTok's algorithm detects a video containing mature or complex themes, a maturity score will be assigned to the video to help prevent TikTok users under the age of 18 from viewing it.
These mature or complex themes include fictional scenes too frightening or intense for younger audiences, events that may reflect personal experiences, or real-world events that are intended for older audiences, per TechCrunch.
Meanwhile, maturity scores will be assigned by Trust and Safety moderators to videos that are gaining popularity or that have been reported on the app, per TikTok's correspondence with TechCrunch.
Unfortunately, TikTok has not disclosed how it would determine these maturity scores, or what criteria it would use to classify its videos, per Engadget.
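Since TikTok has not published its scoring criteria, the gating logic described above can only be illustrated hypothetically. The sketch below is purely an assumption: the `Video` class, the `maturity_score` field, the threshold value, and the age cutoff of 18 are all invented for illustration and are not TikTok's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical threshold: any score at or above this is treated as
# containing "overtly mature themes" (the real scale is undisclosed).
MATURITY_THRESHOLD = 1

@dataclass
class Video:
    video_id: str
    maturity_score: int  # assumed to be assigned by Trust and Safety moderators

def is_viewable(video: Video, viewer_age: int) -> bool:
    """Hide videos flagged with mature themes from viewers under 18."""
    if viewer_age < 18 and video.maturity_score >= MATURITY_THRESHOLD:
        return False
    return True

clip = Video("abc123", maturity_score=2)
print(is_viewable(clip, 15))  # False: filtered for a 15-year-old
print(is_viewable(clip, 21))  # True: visible to an adult
```

The key design point the announcement implies is that filtering happens per viewer at recommendation time, based on the viewer's registered age, rather than by removing the video outright.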
The Content Levels system is expected to be expanded over time to offer filtering options for the entire TikTok community, not just teens.

Meanwhile, TikTok acknowledges that what it's trying to achieve with its new Content Levels system is complex and that it may make some mistakes in its execution.
However, it assures users that its focus remains on creating "the safest and most enjoyable experience for [its] community."
TikTok's Inspiration For Its Content Levels System
TikTok was singled out in a recent Fairplay study that found its platform to have "design discrimination" against children outside Europe. Thirty-nine children's rights and digital rights advocacy organizations have signed a letter to TikTok's CEO, Shou Zi Chew, urging him to address the discrepancies and inconsistencies highlighted in Fairplay's report.
Another probable reason for TikTok's move is the increasing number of lawsuits from parents who sued the company for recommending videos featuring deadly challenges to their children, who died after participating in them.
Furthermore, a former TikTok content moderator is also suing the social media platform for emotional distress caused by inappropriate content featured on the platform, such as child pornography, rape, beheadings, and animal cruelty.