Tech giant Apple has delayed its plans to scan its users’ iPhones and cloud storage for child sexual abuse material. On 6 August, the company announced that it had built automated tools that would scan photos uploaded to its cloud storage service, iCloud, and images shared over its messaging platform, iMessage. The announcement drew widespread criticism from privacy advocates, academics, politicians and others.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” an Apple spokesperson said in a statement to the media yesterday.
After Apple’s announcement, experts said the new scanning systems would violate users’ privacy. “To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again,” digital rights group the Electronic Frontier Foundation said in a blog post at the time.
The group said that Apple’s child sexual abuse material (CSAM) detection system would compromise privacy-first technologies such as end-to-end encryption and “may appease government agencies in the US and abroad”. It also described the move as an “about-face for users” who have so far trusted the company for privacy and security. WhatsApp chief executive Will Cathcart also alleged that the system could be abused by governments around the world.
In response to the criticism, Apple defended its move at the time. In an FAQ document, the company said that scanning on iMessage would be done using “on-device” machine learning software, meaning it would not connect to any cloud server. It also said that CSAM detection on iCloud would scan images only when they are uploaded to the cloud service, and that Apple would verify the images the system flags before informing the authorities.
The company also said that images would be flagged only if they matched CSAM images present in two or more global CSAM databases, and that it would refuse any government requests to make changes to these systems.
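To illustrate the matching rule described above, here is a minimal, hypothetical sketch in Swift of a "flag only if an image hash appears in two or more independent databases, and escalate for review only after a threshold of matches" check. It is not Apple's NeuralHash or CSAM-detection code; the type names, hash strings and threshold value are invented for illustration only.

```swift
// Illustrative sketch only: a simplified, hypothetical model of the
// "match in two or more independent databases" rule described in the
// article, not Apple's actual CSAM-detection implementation.

import Foundation

struct CSAMMatcher {
    // Each database is modeled as a set of known image hashes.
    let databases: [Set<String>]
    // Number of matched images required before a manual review is triggered.
    let reviewThreshold: Int

    // An image hash counts as a match only if it appears in at least
    // two of the independent databases.
    func isMatch(_ imageHash: String) -> Bool {
        let hits = databases.filter { $0.contains(imageHash) }.count
        return hits >= 2
    }

    // Returns true if enough uploaded images match to warrant review.
    func shouldTriggerReview(for uploadedHashes: [String]) -> Bool {
        uploadedHashes.filter(isMatch).count >= reviewThreshold
    }
}

// Example usage with made-up hash values.
let matcher = CSAMMatcher(
    databases: [["hashA", "hashB"], ["hashB", "hashC"]],
    reviewThreshold: 1
)
print(matcher.shouldTriggerReview(for: ["hashB"]))  // true: "hashB" is in both databases
print(matcher.shouldTriggerReview(for: ["hashA"]))  // false: "hashA" is in only one database
```

In this sketch, requiring agreement between independent databases means no single database entry can cause an image to be flagged on its own.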