Apple has postponed the launch of its iCloud photo scanning feature for child sexual abuse material after user criticism

The company will spend additional time gathering feedback and improving the feature.

“After studying feedback from customers, human rights advocates, researchers and others, we decided to take more time over the coming months to collect input and make improvements,” the company said in a statement quoted by CNBC.
On August 6, Apple announced that it would expand child protection features on its devices. The company planned changes in three areas: Messages, iCloud Photos, and Siri.
Starting in fall 2021, the systems were supposed to blur explicit images in iMessage on the devices of children under 12, scan photos uploaded to iCloud for child sexual abuse material, and offer safety guidance to children and parents through Siri.
After the announcement, the company was criticized by cybersecurity experts, journalists, and ordinary users, who accused Apple of violating privacy.
In response to the criticism, Apple published answers to frequently asked questions about the iCloud photo scanning system. The company promised not to use the system for surveillance and clarified that the feature would initially launch only in the United States.
