UK watchdog accuses Apple of failing to effectively monitor its platforms for child sexual abuse images

The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) says Apple has been undercounting how often child sexual abuse material (CSAM) appears on its platforms. According to a report in The Guardian, Cupertino isn’t disclosing the actual number of CSAM detections on its products.

US tech companies are required to report suspected child sexual abuse material found on their platforms to the National Center for Missing & Exploited Children (NCMEC). Under that obligation, Apple made 267 reports of suspected CSAM worldwide between April 2022 and March 2023. However, the NSPCC found that over the same period, Apple was implicated in 337 recorded offenses of child abuse images in England and Wales alone.

In other words, Cupertino’s worldwide reports fell short of the cases recorded in just one part of the UK. Other big tech companies reported far larger numbers: Meta flagged more than 30.6 million pieces of suspected CSAM across its platforms, including 1.4 million from WhatsApp, while Google reported more than 1.47 million.

It’s unclear why Apple is undercounting possible CSAM cases. However, in late 2022, the company abandoned its plans to roll out an iCloud photo-scanning tool after backlash over concerns that it could be used to surveil users rather than only detect child sexual abuse material.

At the time, Apple said it was prioritizing users’ security and privacy and would instead roll out other features to protect children. Still, the NSPCC points out that even though WhatsApp has end-to-end encryption just like iMessage, the platform reported vastly more cases than Apple.

Speaking to The Guardian, Sarah Gardner, chief executive officer of Heat Initiative, a Los Angeles nonprofit focused on child protection, said: “Apple does not detect CSAM in the majority of its environments at scale, at all. They are clearly underreporting and have not invested in trust and safety teams to be able to handle this.”

Apple didn’t comment on the story, and it’s unclear whether the company will have to respond to the UK watchdog or whether it plans to change its approach to CSAM detection.

BGR will keep following this story as it develops.
