Apple will strengthen child protection: blurring explicit pictures in Messages, scanning iCloud and adding warnings to Siri

The company will check content directly on devices, using neural networks trained to recognize sexually explicit images.

Apple has announced expanded child protection features across its operating systems: iOS, iPadOS, macOS and watchOS. The company will begin to blur explicit pictures in Messages, warn parents about sexually explicit photos being sent or received, scan content uploaded to iCloud and add safety guidance to Siri.
The company explained that it wants to protect children from "predators who use communication tools to recruit and exploit them." To do this, Apple is introducing new features in three areas: Messages, iCloud and Siri. The capabilities were developed together with child safety experts.
Blurring explicit pictures and warning parents in Messages
With the release of iOS 15, iPadOS 15, watchOS 8 and macOS Monterey, the Messages app will start using on-device machine learning to recognize explicit content. The algorithms will analyze photos locally, and Apple itself will not get access to the data: everything happens on the device.
If explicit content is received, Messages will automatically blur the picture and tell children that they do not have to view such photos. Children will also be warned before sending sexually explicit photos, and parents can be notified if the picture is still sent or viewed.
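Apple has not published its model or the exact decision logic, but the flow it describes can be sketched roughly as follows. Everything in this sketch is an illustrative assumption: the SensitiveImageClassifier protocol, the score threshold and the policy names are hypothetical, not Apple's API.

```swift
import Foundation

// Hypothetical on-device classifier; stands in for a local ML model.
protocol SensitiveImageClassifier {
    /// Returns a confidence in [0, 1] that the image is sexually explicit.
    func explicitScore(for imageData: Data) -> Double
}

struct IncomingImagePolicy {
    let classifier: SensitiveImageClassifier
    let threshold: Double            // assumed cutoff; the real value is not public
    let childAccount: Bool           // in Apple's design, configured via Family Sharing
    let parentalNotificationsEnabled: Bool

    enum Action {
        case showNormally
        case blurWithWarning(notifyParentsIfViewed: Bool)
    }

    /// All decisions happen locally; no image data leaves the device.
    func action(for imageData: Data) -> Action {
        guard childAccount, classifier.explicitScore(for: imageData) >= threshold else {
            return .showNormally
        }
        return .blurWithWarning(notifyParentsIfViewed: parentalNotificationsEnabled)
    }
}
```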
Recognizing photos in iCloud and reporting child sexual abuse material to the authorities
The new iOS and iPadOS will also begin to fight the spread of Child Sexual Abuse Material (CSAM). Apple will recognize explicit photos in iCloud and pass the data on to the authorities. In the US, the technology will allow the company to report cases to the National Center for Missing and Exploited Children (NCMEC), which works together with law enforcement agencies across the states.
Scanning of iCloud images will also happen privately: Apple will use on-device machine learning for this. Before a photo is uploaded, the device will match it against hashes of known images provided by NCMEC and other child safety organizations. Apple says this database is transformed into an unreadable set of hashes that is stored securely on the device.
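The matching step can be pictured with a simplified sketch. Apple's actual system reportedly uses a perceptual hash (NeuralHash) and a blinded database queried through private set intersection; the version below substitutes a plain SHA-256 digest and an ordinary Set purely to show where the on-device check sits, so treat every name and type in it as hypothetical.

```swift
import Foundation
import CryptoKit

// Simplified stand-in for the on-device hash database. In Apple's design the
// database is an unreadable, blinded set of hashes derived from NCMEC data;
// here it is just a set of SHA-256 digests.
struct KnownImageHashDatabase {
    private let hashes: Set<Data>

    init(hashes: Set<Data>) {
        self.hashes = hashes
    }

    func contains(imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        return hashes.contains(Data(digest))
    }
}

/// Hypothetical check run before an image is uploaded to cloud storage; in the
/// described system only an encrypted safety voucher (not the image or a plain
/// match flag) accompanies the upload.
func shouldAttachMatchVoucher(for imageData: Data, database: KnownImageHashDatabase) -> Bool {
    database.contains(imageData: imageData)
}
```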
Apple will also use threshold secret sharing technology, which prevents safety vouchers from being decrypted on the server side until a user's account crosses a threshold of matches. The company claims that this guarantees a high level of accuracy: the system should incorrectly flag fewer than one in a trillion accounts per year.
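To see why shares below the threshold reveal nothing, here is a toy Shamir-style threshold secret sharing sketch. It is not Apple's construction: the tiny prime field, the parameters and the function names are illustrative assumptions. It only demonstrates the core property that any set of shares smaller than the threshold is useless, while any set that reaches the threshold reconstructs the secret.

```swift
import Foundation

// Toy Shamir secret sharing over a small prime field (real systems use far larger fields).
let p = 2_147_483_647 // prime modulus, 2^31 - 1

func mod(_ x: Int) -> Int { ((x % p) + p) % p }

func modPow(_ base: Int, _ exp: Int) -> Int {
    var result = 1, b = mod(base), e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % p }
        b = b * b % p
        e >>= 1
    }
    return result
}

func modInverse(_ x: Int) -> Int { modPow(x, p - 2) } // Fermat's little theorem

struct Share { let x: Int; let y: Int }

/// Splits `secret` into `count` shares, any `threshold` of which reconstruct it.
func split(secret: Int, threshold: Int, count: Int) -> [Share] {
    // Random polynomial of degree threshold - 1 with the secret as constant term.
    let coefficients = [mod(secret)] + (1..<threshold).map { _ in Int.random(in: 0..<p) }
    return (1...count).map { x -> Share in
        var y = 0
        for coefficient in coefficients.reversed() { y = (y * x + coefficient) % p } // Horner's rule
        return Share(x: x, y: y)
    }
}

/// Lagrange interpolation at x = 0; needs at least `threshold` distinct shares.
func reconstruct(from shares: [Share]) -> Int {
    var secret = 0
    for i in shares.indices {
        var numerator = 1, denominator = 1
        for j in shares.indices where j != i {
            numerator = numerator * mod(-shares[j].x) % p
            denominator = denominator * mod(shares[i].x - shares[j].x) % p
        }
        secret = mod(secret + shares[i].y * (numerator * modInverse(denominator) % p))
    }
    return secret
}

// With a threshold of 3, any 2 shares reveal nothing, but any 3 recover the secret.
let shares = split(secret: 123_456_789, threshold: 3, count: 5)
print(reconstruct(from: Array(shares.prefix(3)))) // 123456789
```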
Only if violations are detected by both technologies at once will Apple be able to interpret the contents of the safety vouchers and review the images itself. The company promises to manually review each case to confirm the match, disable the user's account and file a report with NCMEC. Users will be able to appeal if they believe the complaint is erroneous.
New safety guidance in Siri and tips in Search
Apple will also do more to warn and educate young users and their parents about how to act in unsafe situations. Siri and Search will intervene when users try to search for CSAM-related material: the system will explain that these are harmful and painful topics and offer resources to read more about them.
Siri will also start offering additional guidance to children and parents on how to respond to child exploitation and sexual abuse. For example, users who ask the voice assistant how to report CSAM will be pointed to the appropriate resources.
Apple promises to expand its child protection resources over time. The new features will become available with the release of the new operating system versions in the fall of 2021. The company has also published detailed documentation about the changes, including a summary of how the technologies are used, descriptions of the security protocols and independent technical assessments from experts.
