
Apple is delaying the launch of its child protection features, bowing to criticism that they could threaten user privacy. The outcry centered on one feature announced last month that would scan users’ photos for child sexual abuse material (CSAM); it had been scheduled to roll out by the end of this year.


The tech giant, in a statement to The Verge, said:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material.”

The company added that, based on feedback from customers, advocacy groups, researchers, and others, it has decided to take additional time over the coming months to collect input and make improvements before releasing these child safety features.

Apple’s own press release carried a similar statement and described features intended to limit the spread of child sexual abuse material (CSAM). It covered three changes in the works. The first would update Search and Siri so that the digital assistant points users toward safety resources if they try to search for CSAM-related material.


The second would alert parents if their children receive or send sexually explicit photos in Messages, and would blur those images for the child. The third would scan images stored in a user’s iCloud Photos for CSAM and report any matches to Apple’s moderators, who could then refer the reports to the National Center for Missing and Exploited Children (NCMEC).


Apple published further details of the iCloud Photos scanning system in an effort to show that it would not weaken user privacy, but the change remained controversial among privacy experts. The system would scan images in a user’s iCloud Photos and compare them against a database of known CSAM image hashes supplied by multiple child safety organizations, including NCMEC.
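To illustrate the general idea of checking photos against a database of known hashes, here is a minimal Python sketch. It uses an ordinary cryptographic hash (SHA-256) purely for illustration; Apple’s actual system relies on a perceptual hashing scheme (NeuralHash) and on-device cryptographic protocols that are not reproduced here, and the folder name and hash list below are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-bad image hashes (hex-encoded SHA-256 digests).
# Apple's real database holds perceptual (NeuralHash) values provided by
# child safety organizations, not plain SHA-256 digests of file bytes.
KNOWN_HASHES = {
    "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
}

def image_hash(path: Path) -> str:
    """Return the hex-encoded SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return the photos whose hashes appear in the known-hash database."""
    return [
        photo for photo in photo_dir.glob("*.jpg")
        if image_hash(photo) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    for match in flag_matches(Path("./photos")):  # hypothetical local folder
        print(f"Match found: {match}")
```

Note that exact byte-level hashing like this would miss even trivially edited copies of an image, which is one reason the real design uses perceptual hashes, and Apple also described a match threshold that must be crossed before any report is generated.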


However, many safety and privacy experts criticized the system, arguing that it amounted to building on-device surveillance and would seriously undermine users’ trust in Apple’s protection of on-device privacy.


In a statement on August 5th, the Electronic Frontier Foundation said that however well-intentioned the feature may be, it would “break key promises of the messenger’s encryption itself and open the door to broader abuses.”


While there is a clear need for features like these, we can’t help but ask: are the privacy advocates right? Will the feature really compromise user privacy on iOS devices?


That said, even if the tech giant does end up hurting user trust and privacy, it wouldn’t be the first time: earlier this year, Apple was caught tracking user telemetry data without consent. Until it’s confirmed that the child protection features pose no privacy threat, we suggest adding an extra layer of security to your iOS device using the best VPN for iPhone.