

If you live in the US, Apple reportedly plans to scan your iPhone for child sexual abuse images

Aug 6, 2021, 00:01 IST
Business Insider
Apple will install software on American iPhones that will look for child abuse imagery, the Financial Times reported. MLADEN ANTONOV/Getty Images

Apple is planning to roll out software that will scan US iPhone photos for images of child sexual abuse, the Financial Times reported on Thursday.

Apple could announce more about the software in the coming week, according to the report, which cited security researchers familiar with Apple's plans.

The software, reportedly called neuralMatch, is designed to look through images that have been stored on iPhones and uploaded to iCloud storage. According to the Financial Times, if the software detects child sexual abuse in a photo, it will then pass the material on to human reviewers who will alert law enforcement if they think the images are illegal.

However, security experts warned that this could snowball beyond looking for child sexual abuse images.

"Whether they turn out to be right or wrong on that point hardly matters. This will break the dam - governments will demand it from everyone," Matthew Green, a cryptographer at Johns Hopkins University, said on Twitter.


An Apple spokesperson did not immediately respond to Insider's request for comment, and the company declined to comment to the Financial Times.

Apple makes privacy a selling point, at times frustrating law enforcement

This new software, if implemented, would likely please law enforcement and government agencies, but risks backlash from privacy activists. Apple has made privacy features a cornerstone of its marketing in recent years, advertising that "what happens on your iPhone stays on your iPhone."

But there are limits to this promise, and tradeoffs. Apple already monitors images sent from Apple devices for child abuse imagery, using a technique called "hashing," and alerts law enforcement when the algorithm and an Apple employee detect suspected child abuse material. It also cooperates with law enforcement on lawful requests for information.
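The hashing approach mentioned above can be illustrated in simplified form: compute a digest of an image's bytes and compare it against a database of digests of known abuse material. This sketch is illustrative only, assuming a hypothetical hash set and using exact cryptographic hashing; real systems such as Apple's reported neuralMatch rely on perceptual hashes designed to survive resizing and recompression, which a plain cryptographic hash does not.

```python
import hashlib

# Hypothetical database of hashes of known flagged images
# (illustrative value only: this is sha256(b"example")).
KNOWN_HASHES = {
    "50d858e0985ecc7f60418aaf0cc5ab587f42c2570a884095a9e8ccacd0f6545c",
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

In the workflow the Financial Times describes, a match would not trigger an automatic report; it would route the material to human reviewers who decide whether to alert law enforcement.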

"Our legal team reviews requests to ensure that the requests have a valid legal basis," Apple writes on its website. "If they do, we comply by providing data responsive to the request. If a request does not have a valid legal basis, or if we consider it to be unclear, inappropriate, or overly broad, we challenge or reject the request. We report on the requests every six months."

In the past, Apple has resisted government agencies' requests for the company to install a back door that would allow law enforcement to access encrypted messages. New York City police and prosecutors have criticized Apple's encryption technology for aiding criminals in hiding information from law enforcement.


Other tech companies like Facebook have also been caught between protecting users' privacy and requests from law enforcement and government agencies. Government officials in multiple countries have criticized Facebook's encryption of its Messenger service for making it more difficult to detect content depicting child sexual exploitation.

Researchers told the Financial Times that Apple's decision could pressure other companies into implementing similar kinds of monitoring and could later expand into monitoring of images beyond child sexual abuse, like anti-government signs held at protests.

Read more about Apple's reported plans for the software over at the Financial Times.
