Apple is no stranger to publicity, but recent months have brought more negative press than the company is used to. The announcement of new methods for scanning devices for child abuse images came under intense scrutiny due to the technology’s potential for misuse; the plan was well-intentioned but required a great deal of faith in Apple to use its power wisely.
It has since come to light that Apple has been scanning iCloud Mail for child sexual abuse material (CSAM) since 2019, yet some worrying findings reveal that Apple is doing nowhere near enough to prevent children from accessing adult content themselves.
iCloud Mail scanning
An archived version of Apple’s child safety page stated that ‘Apple uses image matching technology to help find and report child exploitation’, working much like an email spam filter. Ben Lovejoy of 9to5Mac contacted Apple about the email scanning and reported that:
Apple ‘has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task. Apple also indicated that it was doing some limited scanning of other data, but would not tell me what that was, except to suggest that it was on a tiny scale. It did tell me that the “other data” does not include iCloud backups.’
Whether or not Apple should be scanning users’ images is a hotly debated topic, and the heat is unlikely to die down any time soon. Some users quite rightly ‘believe that Apple does have a moral and ethical responsibility to prevent CSAM from touching its servers’, while others feel this is an infringement on user privacy that has the potential to be misused in future. This is a slippery slope from whatever angle you look at it.
Potential for CSAM system misuse
Researchers from Princeton University say Apple’s CSAM system is dangerous – and they know because they built one. The research project, led by Assistant Professor Jonathan Mayer and graduate researcher Anunay Kulshrestha, explored whether it was possible to preserve encryption and privacy whilst also tackling the proliferation of CSAM on encrypted messaging platforms.
‘We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption. The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted. If a person shared innocent content, the service would learn nothing.’
The project rapidly ground to a halt when the researchers saw how easy it would be to repurpose their program for surveillance and censorship. Writing in The Washington Post, they stated that ‘the design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.’
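To make the researchers’ point concrete, here is a minimal, hypothetical Python sketch of that generic matching pattern – not Apple’s or the Princeton team’s actual code. Real deployments use perceptual hashing (PhotoDNA-style fingerprints) so that matches survive re-encoding; a cryptographic hash keeps the illustration simple. Notice that nothing in the code itself constrains what the database contains:

```python
import hashlib

def scan_content(data: bytes, match_database: set[str]) -> bool:
    """Flag content whose hash appears in the supplied database."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in match_database

# The danger is visible in the signature itself: the operator, not the
# code, decides what counts as 'harmful'. Swap in a database of dissident
# imagery and the identical code becomes a censorship tool.
known_hashes = {"<hash of a known abusive image>"}  # placeholder value
flagged = scan_content(b"attachment bytes", known_hashes)
```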
Apple has reportedly stated that it would not allow this kind of use of its system, but if governments were to demand access, the tech giant would have little choice. Despite Apple’s efforts to downplay the criticisms of privacy campaigners, an open letter urging the company to roll back the decision has been signed by thousands.
Child protection in the App Store
Against the backdrop of all these efforts to protect children, Apple has come under fire for practices that do the polar opposite. Apple promises that the App Store is safe for kids, but an investigation by the Tech Transparency Project (TTP) has exposed flaws in the child protection measures used by the tech giant:
‘The investigation reveals major holes in the App Store’s child safety measures, showing how easy it is for young teens to access adult apps that offer dating, random chats, casual sex and gambling, even when Apple knows the user is a minor.’
Apple has made efforts to police the content available to minors through the App Store, with mixed success. Apple reportedly rejects apps that are ‘over the line – especially when it puts children at risk’, but it has failed to implement measures that truly protect children from the adult apps that remain available. The TTP found it worryingly easy to circumvent the age restrictions on apps:
‘Using an Apple ID for a simulated 14-year-old, TTP examined nearly 80 apps in the App Store that are limited to people 17 and older—and found that the underage user could easily evade age restrictions in the vast majority of cases.’
Despite the account being registered to a user with a self-declared age of 14, 17+ apps could be downloaded simply by tapping ‘OK’ on a prompt asking the user to confirm they are 17 or over. The full TTP report details exactly what content the researchers were able to access; in brief, the troubling findings included: ‘a dating app that opened directly to pornography before asking a user’s age; adult chat apps filled with explicit images that never asked the user’s age; and a gambling app that let the minor account deposit and withdraw money.’
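For illustration, here is a hypothetical sketch of the gap TTP describes – this is not Apple’s actual code, and the function names and logic are assumptions for the example. The store already holds the account’s self-declared birthdate, yet the gate observed in the investigation accepts a self-attested tap instead of consulting it:

```python
from datetime import date

def self_attested_gate(tapped_ok: bool) -> bool:
    # The pattern TTP bypassed: any user who taps 'OK' passes.
    return tapped_ok

def account_aware_gate(declared_birthdate: date, min_age: int = 17) -> bool:
    # Enforce the restriction using the age the account already declared.
    today = date.today()
    age = today.year - declared_birthdate.year - (
        (today.month, today.day)
        < (declared_birthdate.month, declared_birthdate.day)
    )
    return age >= min_age

# A simulated 14-year-old account passes the first gate but not the second.
minor = date(date.today().year - 14, 6, 1)
print(self_attested_gate(True))    # True  -- the gate TTP bypassed
print(account_aware_gate(minor))   # False -- enforcement using the known age
```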
So, while Apple may be making serious efforts to police user content, it is doing little to protect children from the apps available to download from its own store. Allowing users to access age-restricted content with a single tap seems like quite the error, especially when the user has previously provided an age that indicates they are not old enough to view adult material. While Apple polices existing images, it should also limit access to platforms that may lead to the creation of the very material it is attempting to eradicate.