Did Apple send its controversial CSAM scanning back to the lab?

news analysis
Dec 15, 2021 | 4 mins
Apple, Mobile, Security

Apple appears to have stepped back on its least popular innovation since the Butterfly Keyboard, stealthily slicing mentions of its controversial CSAM scanning/surveillance tech from its website following widespread criticism of the idea.

Child protection tools

In August, the company announced plans to introduce ‘surveillance as a service’ on iPhones.

At that time, it revealed the communication safety features now available in iOS 15.2, along with a second tool: the capacity to scan images on a user’s device against a database of known child sexual abuse material (CSAM). If such material was discovered, the system flagged the user for investigation.
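
For readers who want a feel for the mechanics, here is a minimal, purely illustrative Python sketch of threshold-based image matching. It is an assumption-laden toy, not Apple’s implementation: the real design used a perceptual “NeuralHash,” a blinded hash database, and cryptographic safety vouchers, none of which are modeled here.

```python
# Toy sketch of threshold-based image matching, loosely modeled on public
# descriptions of Apple's proposal. Everything here is illustrative:
# Apple's design used a perceptual "NeuralHash," a blinded database, and
# cryptographic safety vouchers, none of which appear in this sketch.
from hashlib import sha256

# Hashes of known CSAM, as supplied by a child-safety organization.
# Left empty here; in Apple's design the device could not even read
# this set directly.
KNOWN_BAD_HASHES: set[str] = set()

# Apple publicly described a review threshold of roughly 30 matches.
MATCH_THRESHOLD = 30

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real system must survive
    # resizing and re-encoding, which an exact hash like SHA-256 cannot.
    return sha256(image_bytes).hexdigest()

def should_flag_for_review(images: list[bytes]) -> bool:
    # Count matches against the database; flag only once the count
    # crosses the threshold, to reduce false positives.
    matches = sum(1 for img in images if image_hash(img) in KNOWN_BAD_HASHES)
    return matches >= MATCH_THRESHOLD
```

The critics’ core objection follows directly from this shape: nothing in the matching logic knows or cares what the database contains, so whoever controls the hash list controls what gets flagged.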

The response was immediate. Privacy advocates across the planet quickly realized that if your iPhone could scan your system for one thing, it could easily be asked to scan for another. They warned such technology would become a Pandora’s box, open to abuse by authoritarian governments. Researchers also warned that the tech might not work particularly well and could be abused or manipulated to implicate innocent people.

Apple tried a charm offensive, but it failed. While some industry watchers attempted to normalize the scheme on the basis that everything that happens on the Internet can already be tracked, most people remained utterly unconvinced.

A consensus emerged that by introducing such a system, Apple was deliberately or accidentally ushering in a new era of on-device universal warrantless surveillance that sat poorly beside its privacy promise.

Susan Landau, professor of cybersecurity and policy at Tufts University, said: “It’s extraordinarily dangerous. It’s dangerous for business, national security, for public safety and for privacy.”

While all critics agreed that CSAM is an evil, the fear of such tools being abused against the wider population proved hard to shift.

In September, Apple postponed the plan, saying: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

MacRumors reports that all mention of CSAM scanning has now been removed from Apple’s Child Safety page, which now covers only the communication safety tools in Messages and its search protections. These tools use on-device machine learning to identify sexually explicit images and block such material, and they offer children guidance if they search for such content. One thing has changed: the tool no longer alerts parents if their child chooses to view such items, in part because critics had pointed out that doing so could put some children at risk.
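
As a rough sketch of that flow (my assumption of the logic based on public descriptions, not Apple’s code; the classifier below is a hypothetical stand-in for the on-device model):

```python
# Illustrative sketch of the Messages communication-safety flow as
# described publicly; not Apple's code. The classifier is a hypothetical
# stand-in for the on-device machine-learning model.

def looks_sexually_explicit(image_bytes: bytes) -> bool:
    # Placeholder for the on-device image classifier; all analysis
    # happens locally, so nothing leaves the phone.
    raise NotImplementedError

def present_incoming_image(image_bytes: bytes, is_child_account: bool) -> str:
    if is_child_account and looks_sexually_explicit(image_bytes):
        # The image is blurred and the child is shown guidance plus the
        # choice to view it anyway. Per the iOS 15.2 change, parents are
        # no longer notified if the child chooses to view it.
        return "blurred_with_guidance"
    return "shown_normally"
```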

It is good that Apple has sought to protect children against such material, but it is also good that it seems to have abandoned this component, at least for now.

I don’t believe the company has completely abandoned the idea. It would not have come this far if it had not been fully committed to finding ways to protect children against such material. I imagine what it now seeks is a system that provides effective protection but cannot be abused to harm the innocent or extended by authoritarian regimes.

The danger is that, having invented such a technology in the first place, Apple will likely still face some governmental pressure to make use of it.

In the meantime, it already scans images stored in iCloud for such material, much as the rest of the industry does.

Of course, considering the recent NSO Group attacks, high-profile security scares, the weaponization and balkanization of content-driven “tribes” on social media, and the tsunami of ransomware attacks plaguing digital business, it’s easy to think that technology innovation has reached a zenith of unexpected and deeply negative consequences. Perhaps we should now explore the extent to which tech undermines the basic freedoms the geeks in the Homebrew Computer Club originally sought to foster and protect?

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

Hello, and thanks for dropping in. I'm pleased to meet you. I'm Jonny Evans, and I've been writing (mainly about Apple) since 1999. These days I write my daily AppleHolic blog at Computerworld.com, where I explore Apple's growing identity in the enterprise. You can also keep up with my work at AppleMust, and follow me on Mastodon, LinkedIn and (maybe) Twitter.