The company last month announced features aimed at flagging child sexual abuse images that users store on its iCloud servers. Apple did not say how long it will delay the program.
“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material,” according to a company statement.
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
The system was built to look for images that match those from libraries assembled by law enforcement to find and track the dissemination of child abuse material on the internet.
Some child safety advocates were disappointed by Apple’s announcement.
“We absolutely value privacy and want to avoid mass surveillance in any form from government, but to fail children and fail the survivors of child sexual abuse by saying we’re not going to look for known rape videos and images of children because of some extreme future that may never happen just seems wholly wrong to me,” said Glenn Pounder, chief operating officer of Child Rescue Coalition, a nonprofit that develops software to help law enforcement identify people downloading child sexual abuse material.
“It means that there are dangerous criminals who will never be reported to NCMEC (the National Center for Missing & Exploited Children) for investigation by law enforcement because Apple has now rolled back on what they said they were going to do.”