Newly revealed court documents have highlighted Meta's "historical reluctance" to provide safety measures for children using its platforms, particularly on Instagram.
The unredacted documents show that the social media giant took too long to fix its "people you may know" feature, which allowed adult strangers to contact minors.
Meta added safety protections for children on Instagram only in 2021, after an Apple executive reported that their 12-year-old child had been solicited on the platform.
Another court document revealed that Meta "underinvested in minor sexualization," categorizing protective measures as an "explicit non-goal" in 2020.
The lack of prioritization also affected initiatives against child grooming and efforts to prevent underage users from posting sexualized comments from their accounts.
More court documents are expected to be unredacted in the following months as CEO Mark Zuckerberg is scheduled to testify before the US Congress regarding child safety online.
Social Media Platforms Face Criticism Over Sexualization of Minors
Child exposure to adult content and sexual harassment is not a problem exclusive to Meta's platforms, as other social media companies are also set to face Congress by the end of January.
Snapchat, Discord, TikTok, and X (formerly Twitter) have also been noted for putting their underage users at risk of sexual harassment, unsolicited messages, and sexualization by adult strangers.
According to a study by the Child Crime Prevention & Safety Center, about 500,000 online predators are active every day, targeting children, mostly female, aged 12 to 15.
As of 2022, cases of grooming and sexual solicitation had grown more than 80% over the previous four years as social media became more readily available to young users.
Instagram has been identified as the leading platform for child sexual abuse.
US Government Pushes for Further Regulations for Online Child Safety
In response to the growing number of parental complaints, the US government has been taking steps over the past couple of years to finally address the issue.
Just last month, the Federal Trade Commission proposed changes to the Children's Online Privacy Protection Act to further protect minors from similar incidents.
The State of California has already passed the California Consumer Privacy Act, which prevents social media companies from collecting data from users under the age of 16, limiting the information available to potential predators.
There is also the Kids Online Safety Act, which would directly address child safety on social media. KOSA is still awaiting approval before it can be implemented.
Related Article: FTC Pushes for New Regulations to Protect Kid's Data Privacy