
Apple Will Bring A Tool To Scan Your iPhone Photos

iPhone will add new tools to warn children and their parents when receiving or sending sexually explicit photos.
Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos. Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. 

Siri and Search will also intervene when users try to search for CSAM-related topics. These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.

Apple will reportedly scan your iPhone photos for child sexual abuse material (CSAM), including media related to child pornography. The new system, which is expected to be announced soon, would be implemented on the client side (on the user's device) to look for specific perceptual hashes and report them to Apple's servers if they appear in large quantities. The idea is that carrying out the checks on the user's device protects privacy, though it is not clear whether the system could be misused in some way.
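The client-side matching described above can be sketched roughly as follows. Apple has not published its actual algorithm, so this is only an illustration using a simple average-hash scheme with Hamming-distance comparison; the function names and the threshold are assumptions, not Apple's design:

```python
# Illustrative sketch of client-side perceptual-hash matching.
# NOTE: Apple has not published its algorithm; the average-hash
# scheme, function names, and threshold below are all assumptions.

def average_hash(pixels):
    """64-bit average hash of an 8x8 grayscale grid (values 0-255):
    each bit is 1 if that pixel is at or above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of bit positions in which two hashes differ."""
    return bin(a ^ b).count("1")

def matches_known_hashes(photo_hash, known_hashes, threshold=5):
    """On-device check: a photo is flagged only if its hash lies
    within `threshold` bits of a hash on a known-content list."""
    return any(hamming_distance(photo_hash, h) <= threshold
               for h in known_hashes)
```

The appeal of perceptual hashing is that a lightly edited or recompressed copy of an image produces a hash only a few bits away from the original, so it still matches, while unrelated images usually differ in many bits. The flip side is that unrelated images can occasionally land within the threshold, which is the source of the false-positive concern.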

Cybersecurity expert Matthew Daniel Green, an Associate Professor at the Johns Hopkins Information Security Institute in the US, tweeted about Apple's plans to launch the client-side system to detect child abuse images on the iPhone. He said that the tool, which is still under development, could eventually be a “key ingredient” in adding surveillance to encrypted messaging systems.

“The way Apple is doing this launch, they're going to start with non-E2E [non-end-to-end] photos that people have already shared with the cloud. So it doesn't ‘hurt' anyone's privacy. But you have to ask why anyone would develop a system like this if scanning E2E photos wasn't the goal,” Green said in a detailed thread on Twitter.

Apple's new tool may nonetheless raise user concerns: even with multiple layers of protection against misuse, it may turn up false positives. Governments could also abuse the system to go beyond illegal child content and search for media aimed at influencing public attitudes on political matters.

Science Techniz has reached out to Apple for comment on the reported tool and will update this space when the company responds. In the past, Apple was found to have deployed similar hashing techniques to look for child abuse content in the emails of its iPhone users. Last year, the Cupertino company was also reported to have dropped plans for encrypted iCloud backups, quietly preserving a point of access for law enforcement and intelligence agencies.

However, the new move seems to have been designed with privacy in mind, as the checks will run on the user's device without images needing to be sent to the cloud. The exact scope of the tool is yet to be determined, as Apple has not shared any official details, but Green tweeted that an announcement could come this week.