Apple will start locally scanning iPhone photos for child abuse material, FT reports

The feature will first be introduced in the United States, sources say.

Apple's plans to scan the local media libraries of iPhone users were reported by the Financial Times and by cryptography specialist Matthew Green.
The neuralMatch function will scan users' photos locally on the iPhone, without transferring data to cloud storage. If the system detects that an iPhone owner is storing illegal images, it will alert Apple employees.

Apple trained the neuralMatch neural network on 200,000 images provided by the US National Center for Missing and Exploited Children. The function will compare users' photos against this database of known images using hashing algorithms.
Each photo uploaded to iCloud will receive a label indicating whether it is suspicious. Once a certain number of photos are flagged as suspicious, Apple will decrypt the images in question and, if necessary, contact the authorities.
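The matching scheme described above can be sketched in simplified form. This is a hypothetical illustration only, not Apple's actual implementation: the hash function, the Hamming-distance comparison, and the constants `MATCH_DISTANCE` and `FLAG_THRESHOLD` are all assumptions chosen for clarity.

```python
# Hypothetical sketch of threshold-based perceptual-hash matching.
# Photo hashes are modeled as 64-bit integers; a photo "matches" the
# database if its Hamming distance to any known hash is small enough,
# and an account is flagged for review once enough photos match.
# All constants and names here are illustrative assumptions.

MATCH_DISTANCE = 4   # max differing bits to count as a match (assumed)
FLAG_THRESHOLD = 3   # matches needed before human review (assumed)

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two integer hashes."""
    return bin(a ^ b).count("1")

def matches_database(photo_hash: int, database: list[int]) -> bool:
    """True if the hash is within MATCH_DISTANCE of any database entry."""
    return any(hamming(photo_hash, h) <= MATCH_DISTANCE for h in database)

def should_flag(photo_hashes: list[int], database: list[int]) -> bool:
    """True once FLAG_THRESHOLD or more photos match the database."""
    suspicious = sum(matches_database(h, database) for h in photo_hashes)
    return suspicious >= FLAG_THRESHOLD
```

Allowing a small Hamming distance rather than requiring exact equality is what lets perceptual hashes survive resizing or recompression, and it is also why, as Green notes below, such schemes can produce false positives.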
Matthew Green, a cryptographer and professor at Johns Hopkins University, wrote that several of his sources had confirmed Apple's plans. In his view, the tool could help find illegal images, but if it fell into the wrong hands it could be abused. He also stressed that hashing algorithms can produce false positives.
Apple already scans photos in iCloud for child abuse material, as the company disclosed in January 2020, without specifying what technology it uses. Many companies, including Facebook, Twitter and Google, use the PhotoDNA system for this purpose, which checks images against a database of previously identified images using hashing technology.
Apple may reveal details about the neuralMatch technology in the coming days, sources told the Financial Times.
Earlier, iPhone users discovered an unannounced feature in the iOS 15 beta that automatically removes glare from photos after shooting.
