Apple Accused of 'Underreporting' Child Abuse, UK Authorities Reveal

Apple is facing accusations of "underreporting" child sexual abuse materials (CSAM) from the United Kingdom's National Society for the Prevention of Cruelty to Children (NSPCC).

The data shows that the tech giant is failing to flag CSAM content on its flagship services, including iCloud, iMessage, and FaceTime.

Apple Allegedly Undercounts CSAM Cases From Services

The NSPCC shared U.K. police data revealing that Apple is "vastly undercounting" instances of CSAM found on its services globally. Police logged more CSAM-related cases in a single year than Apple reported worldwide for all of 2023.

The NSPCC found that police in England and Wales recorded 337 CSAM cases involving Apple's services between April 2022 and March 2023. Over the same general period, the tech giant disclosed only 267 cases globally to the National Center for Missing & Exploited Children (NCMEC), representing its total worldwide reports for 2023.

Child safety experts expressed concern over Apple's undercounting of CSAM incidents on its platforms. Large tech companies are required to report CSAM found on their services to NCMEC, yet Apple submits only a few hundred cases annually.

Child Safety Experts Call for Major Policy Improvements at Apple

The NSPCC's head of child safety, Richard Collard, said that Apple must implement major changes to its child safety efforts. He added that the tech giant is lagging behind other big companies in addressing child sexual abuse.

"There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple's services and the almost negligible number of global reports of abuse content they make to authorities," Collard said.

Another child safety expert and advocate, Sarah Gardner, stated that Apple's current push into AI could further intensify its CSAM problems. Gardner, CEO of the US-based child protection organization Heat Initiative, said that generative AI would only worsen the situation.

As of writing, Apple has yet to comment on the NSPCC's report.

© 2024 iTech Post All rights reserved. Do not reproduce without permission.
