Apple has delayed plans to roll out detection technology which would have scanned US users' iPhones in search of child sexual abuse material.
The delay follows widespread criticism from privacy groups and others, who worried that on-device scanning of this kind set a dangerous precedent.
Apple said that it had listened to the negative feedback and was reconsidering.
There were concerns the system could be abused by authoritarian states.
The so-called NeuralHash technology would have scanned images just before they were uploaded to iCloud Photos, computing a fingerprint, or "hash", of each one. Those hashes would then have been matched against a database of known child sexual abuse material maintained by the National Center for Missing and Exploited Children.
If a match was found, it would have been reviewed by a human and, if confirmed, steps taken to disable the user's account and report it to law enforcement.
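As a rough illustration of that matching step, the Python sketch below is a hypothetical simplification, not Apple's implementation: it stands in for NeuralHash with an ordinary SHA-256 digest, and the function names, the KNOWN_HASHES placeholder set and the MATCH_THRESHOLD constant are all illustrative. Apple did say that roughly 30 matches would be required before any human review took place.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for NeuralHash. The real system used a perceptual
# hash, designed so that visually identical images (resized, re-encoded,
# lightly edited) map to the same value; a SHA-256 digest only matches
# exact byte-for-byte copies, so this is purely illustrative.
def image_hash(photo: Path) -> str:
    return hashlib.sha256(photo.read_bytes()).hexdigest()

# Placeholder for the database of known-material hashes maintained by the
# National Center for Missing and Exploited Children (illustrative values;
# the real database is never shipped to devices in readable form).
KNOWN_HASHES: set[str] = {"a3f1...", "9c0d..."}

# Apple said roughly 30 matches would be required before an account was
# escalated for human review, to keep false positives rare.
MATCH_THRESHOLD = 30

def should_escalate(photos: list[Path]) -> bool:
    """Count matches against the known-hash set just before upload and
    report whether the account crosses the review threshold."""
    matches = sum(1 for p in photos if image_hash(p) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD
```

The key difference in the real design is that a perceptual hash tolerates minor changes to an image, which is what made critics fear the same machinery could be repurposed to match other kinds of content.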
In a statement, Apple said: "Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material."
The features had been due to launch later this year.
"Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the company added.
Privacy campaigners expressed concern that the technology could be expanded and used by authoritarian governments to spy on citizens.
The Electronic Frontier Foundation said that while child exploitation was a serious problem, Apple's attempt to "build a backdoor" into its data storage and messaging systems was fraught with problems.
"To say that we are disappointed by Apple's plan is an understatement," it said at the time. It went on to gather 25,000 signatures from consumers opposed to the move.
Apple has long championed privacy and end-to-end encryption.