In 2022, Apple abandoned plans to detect Child Sexual Abuse Material (CSAM) following allegations that the system could be abused to monitor users.
Instead, Apple turned to a set of features called Communication Safety, which blurs nude photos sent to children. However, critics say these measures fall short of fully detecting and reporting CSAM incidents.
According to The Guardian, the UK's National Society for the Prevention of Cruelty to Children (NSPCC) said Apple vastly undercounts CSAM incidents on its services, such as iCloud, FaceTime and iMessage.
All US technology companies are required to report detected cases of CSAM to the National Center for Missing & Exploited Children (NCMEC), and in 2023, Apple made 267 reports.
However, the NSPCC found that in England and Wales alone, Apple's services were implicated in 337 recorded child abuse image offences between April 2022 and March 2023.
Richard Collard, head of online child safety policy at the NSPCC, expressed concern: “There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple's services and the almost negligible number of global reports of abuse content they make to authorities. Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the rollout of the Online Safety Act in the UK.”
Other technology companies have reported far higher numbers of CSAM cases than Apple. Google reported 1,470,958 cases in 2023. During the same period, Meta reported 17,838,422 cases on Facebook and 11,430,007 cases on Instagram. Notably, Meta's WhatsApp, which is end-to-end encrypted like Apple's iMessage, reported approximately 1,389,618 suspected CSAM cases in 2023.
This disparity has raised questions about Apple's commitment to protecting children on its platforms. Some child safety experts worry that Apple is not responding effectively to the problem, and that its lack of transparency in reporting could make the situation worse.
Meanwhile, Apple merely referred The Guardian to its previous statements about overall user privacy, without explaining why its number of reports is so much lower than other companies'.
Another concern raised is AI-generated CSAM imagery: as the technology grows more powerful, so does its capacity to produce realistic images. However, Apple has said that Apple Intelligence, which is expected to launch soon, will not generate photorealistic images, alleviating some of these concerns.
Amid growing pressure from the public and child protection organizations, Apple will need to take more concrete and transparent measures to address child sexual abuse on its services.
A lack of transparency and adequate reporting could have serious consequences, not only for the company but also for the millions of children who use Apple services around the world. Investing in effective protection and monitoring measures will be a necessary step to keep young users safe, while upholding Apple's reputation and responsibility in the technology industry.