Apple delays rollout of controversial photo-scanning security feature
- Apple is delaying the CSAM photo-scanning feature it announced a month ago.
- Apple says it needs “to take additional time over the coming months” to polish the feature.
- The policy would see user photos algorithmically scanned for known child sexual abuse material.
At the beginning of August, Apple announced a highly controversial new policy. In an effort to curb child exploitation, the company said it would start scanning every single photo people upload to iCloud. Although the scanning would be done algorithmically, any images flagged by the algorithm would then be reviewed by a human.
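For context on what “scanned algorithmically” meant in practice: Apple’s technical summary described comparing a perceptual hash of each photo (its “NeuralHash”) against a database of known CSAM hashes, with human review triggered only once an account crossed a match threshold. The Swift sketch below illustrates that general flow in toy form; the type names, the string-based hashes, and the threshold value are all hypothetical stand-ins, not Apple’s actual implementation.

```swift
import Foundation

// Minimal sketch of threshold-gated hash matching, loosely modeled on
// Apple's public description. All names and values here are hypothetical.

struct ScanResult {
    let matchCount: Int
    let flaggedForHumanReview: Bool
}

final class PhotoScanner {
    // Hypothetical database of known-bad perceptual hashes.
    private let knownHashes: Set<String>
    // Matches must reach a threshold before anything reaches a reviewer.
    private let reviewThreshold: Int

    init(knownHashes: Set<String>, reviewThreshold: Int) {
        self.knownHashes = knownHashes
        self.reviewThreshold = reviewThreshold
    }

    // `photoHashes` stands in for perceptual hashes computed per photo.
    // Apple's system used a neural perceptual hash, not plain strings.
    func scan(photoHashes: [String]) -> ScanResult {
        let matches = photoHashes.filter { knownHashes.contains($0) }.count
        return ScanResult(
            matchCount: matches,
            flaggedForHumanReview: matches >= reviewThreshold
        )
    }
}

// Example usage with toy data.
let scanner = PhotoScanner(knownHashes: ["abc123", "def456"], reviewThreshold: 2)
let result = scanner.scan(photoHashes: ["abc123", "zzz999", "def456"])
print("Matches: \(result.matchCount), flagged: \(result.flaggedForHumanReview)")
```

The threshold is the key design point, and it is what the “one in one trillion” accuracy claim quoted below leans on: in the system as Apple described it, a stray false match on an account would never reach a human reviewer on its own.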
Obviously, Child Sexual Abuse Material (CSAM) is a huge problem that almost everyone wants to fight. However, the Apple CSAM policy made plenty of people uneasy because of how privacy-invasive it seemed. Now, the company is delaying the feature’s rollout (via 9to5Mac).
Apple promised that its algorithm for scanning user material was incredibly accurate, claiming there is a “one in one trillion chance per year of incorrectly flagging a given account.” That promise didn’t stop the unease, though. Apple’s statement on the delay makes that quite clear:
Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
The statement suggests Apple won’t be rolling this out anytime soon. “Over the coming months” could mean late this year or well into 2022. The feature could even be delayed indefinitely.
Interestingly, a recent poll from SellCell found that very few Android users plan to switch to this year’s iPhone. When asked why, a significant share of respondents cited the Apple CSAM policy as a factor.