Apple backs off controversial child-safety plans

news analysis
Sep 03, 2021 · 4 mins
Apple, Mobile, Privacy

The company now plans to take a few months more to collect input and make improvements before releasing the features, which drew fire from privacy advocates.

In a surprise Friday announcement, Apple said it will take more time to improve its controversial child safety tools before it introduces them.

More feedback sought

The company says it plans to collect more feedback and improve the system, which has three key components: scanning iCloud photos for known CSAM (child sexual abuse material), on-device scanning of Messages to protect kids, and search suggestions designed to steer children away from harm.
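
To make the photo-scanning component concrete: images were to be matched on-device against a database of hashes of known CSAM, with an account flagged for human review only after a threshold number of matches. What follows is a minimal Python sketch of that threshold-matching idea, not Apple’s actual protocol; the real design used a perceptual NeuralHash and private set intersection rather than plain digests, and every name and value below is illustrative.

    import hashlib
    from pathlib import Path

    # Hypothetical set of known-image digests. Apple's real system shipped a
    # blinded database of NeuralHash values; plain SHA-256 digests are used
    # here only to keep the sketch self-contained.
    KNOWN_HASHES: set[str] = {
        "0123456789abcdef" * 4,  # placeholder entry, not a real digest
    }

    # Apple said on the order of 30 matches would be needed before human review.
    MATCH_THRESHOLD = 30

    def image_digest(path: Path) -> str:
        # SHA-256 of the raw bytes. NeuralHash was a perceptual hash designed
        # to survive resizing and re-encoding; a cryptographic hash is not.
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def account_exceeds_threshold(photos: list[Path]) -> bool:
        # Count photos whose digest appears in the known-image set; flag the
        # account only once the match count crosses the threshold.
        matches = sum(image_digest(p) in KNOWN_HASHES for p in photos)
        return matches >= MATCH_THRESHOLD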

Ever since Apple announced the tools, it has faced a barrage of criticism from concerned individuals and rights groups around the world. The argument the company struggled most to address was the potential for repressive governments to force Apple to monitor for more than CSAM.

Who watches the watchmen?

Edward Snowden, accused of leaking US intelligence and now a privacy advocate, warned on Twitter, “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”

Critics said these tools could be exploited or extended to support censorship of ideas or otherwise threaten free thought. Apple’s response, a promise that it would not extend the system, struck many as naïve.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it,” the company said.

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content,” countered the Electronic Frontier Foundation.

Apple listens to its users (in a good way)

In a statement about the suspension, widely released to the media on the Friday before a US holiday (a slot when bad news is sometimes buried), Apple said:

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

It’s a move the company had to take. In mid-August, more than 90 NGOs contacted the company in an open letter urging it to reconsider. That letter was signed by Liberty, Big Brother Watch, the ACLU, the Center for Democracy & Technology, the Centre for Free Expression, the EFF, ISOC, Privacy International, and many more.

The devil in the details

The organizations warned of several weaknesses in the company’s proposals. One in particular cut through: that the system itself could be abused by abusive adults.

“LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk,” they wrote. “As a result of this change, iMessages will no longer provide confidentiality and privacy to those users.”

Concerns that Apple’s proposed system could be extended also remain. Sharon Bradford Franklin, co-director of the CDT Security & Surveillance Project, warned that governments “will demand that Apple scan for and block images of human rights abuses, political protests, and other content that should be protected as free expression, which forms the backbone of a free and democratic society.”

Apple’s defenders argued that the company was trying to maintain overall privacy for user data while building a system that could flag only illegal content. They also pointed to the various failsafes built into the system.

Those arguments did not work, and Apple execs surely picked up on the same kind of social media feedback I saw, which reflected deep distrust of the proposals.

What happens next?

Apple’s statement didn’t say. But given that the company has spent the weeks since the announcement meeting with media and concerned groups across all its markets, it seems logical that the second iteration of its child protection tools will address some of the concerns.

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
