Apple delays iPhone photo-scanning plan amid fierce backlash

FILE – In this Wednesday, Dec. 16, 2020 file photo, the logo of Apple is illuminated at a store in the city center in Munich, Germany. Apple said Friday, Sept. 3, 2021 it’s delaying its plan to scan U.S. iPhones for images of child sexual abuse, saying it needs more time to refine the system before releasing it. The company had revealed last month that it was working on a tool to detect known images of child sexual abuse, which would work by scanning files before they’re uploaded to iCloud. (AP Photo/Matthias Schrader, file)

This is an archived article and the information in the article may be outdated. Please look at the time stamp on the story to see when it was last updated.

BERKELEY, Calif. (AP) — Apple is indefinitely delaying plans to scan iPhones in the U.S. for images of child sexual abuse following an outcry from security and privacy experts who warned the technology could be exploited for other surveillance purposes by hackers and intrusive governments.

The postponement announced Friday comes a month after Apple revealed it was getting ready to roll out a tool to detect known images of child sexual abuse. The tool would work by scanning files before they’re uploaded to its iCloud backup storage system. It had also planned to introduce a separate tool to scan users’ encrypted messages for sexually explicit content.

Apple insisted its technology had been developed in a way that would protect the privacy of iPhone owners in the U.S. But the Cupertino, California, company was swamped with criticism from security experts, human rights groups and customers worried that the scanning technology would open a peephole exposing personal and sensitive information.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in an update posted above its original photo-scanning plans.

Apple never set a specific date for when the scanning technology would roll out, beyond saying it would occur some time this year. The company is expected to unveil its next iPhone later this month, but it’s unclear if it will use that event to further discuss its change in plans for scanning the devices in the U.S.

The intense backlash to the scanning technology was particularly bruising for a company that has made personal privacy a marketing mantra. Apple contends it is more trustworthy than other major technology companies such as Google and Facebook that vacuum up information about people’s interests and location to help sell digital ads. Apple CEO Tim Cook is known to repeat the catchphrase “Privacy is a fundamental human right.”

The photo scanning technology was “a really big about-face for Apple,” said Cindy Cohn, executive director for the Electronic Frontier Foundation, one of the most vocal critics of the company’s plans. “If you are going to take a stand for people’s privacy, you can’t be scanning their phones.”

Cohn applauded Apple for taking more time to reassess its plans and urged the company to talk to a broader range of experts than it apparently did while drawing up its scanning blueprint in its typically secretive fashion.

Matthew Green, a top cryptography researcher at Johns Hopkins University and another outspoken critic of Apple, also supported the delay. He suggested the company talk to technical and policy communities and the general public before making such a big change that threatens the privacy of everyone’s photo library.

“You need to build support before you launch something like this,” Green said. “This was a big escalation from scanning almost nothing to scanning private files.”

When Apple announced the scanning technology last month, Green warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement.

Not long after Green and privacy advocates sounded warnings, a developer claimed to have found a way to reverse-engineer the matching tool, which works by recognizing the mathematical “fingerprints” that represent an image.

Apple traditionally has rejected government demands for data and access to devices that it believes are fishing expeditions or risk compromising the security of its customers or devices.

In a highly publicized act of defiance, Apple resisted an FBI demand in 2016 that the company crack the code protecting an iPhone used by one of the killers during a mass shooting in San Bernardino, California. It argued at the time that it would be opening a digital backdoor that could be exploited by hackers and other unauthorized parties to break into devices. In that instance, Apple was widely praised by civil rights and privacy groups.

——

O’Brien reported from Providence, Rhode Island. AP Business Writer Kelvin Chan contributed to this story from London.

Copyright 2024 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
