The Plunge Daily

Apple employees believe new child safety features will tarnish brand’s reputation


Apple employees have raised concerns that the new child safety features, set to debut with iOS in September, could tarnish the company's reputation as a bastion of user privacy. Apple's suite of child protection tools includes on-device processes designed to detect and report child sexual abuse material (CSAM) uploaded to iCloud Photos.

According to Reuters, Apple employees have taken to Slack to express their worries. The employees believe that the feature could be exploited by repressive governments looking to find other material for censorship or arrests. The pushback marks a shift for a company where a strict code of secrecy around new products colors other aspects of the corporate culture.

The report highlights that core security employees did not appear to be major complainants in the posts, and some of them said they thought Apple's solution was a reasonable response to pressure to crack down on illegal material. Other employees said they hoped the scanning is a step toward fully encrypting iCloud for customers who want it, which would reverse Apple's direction on the issue a second time.

Apple is facing criticism from privacy advocates who say the child safety protocols raise a number of red flags. While some of the pushback can be written off as misinformation stemming from a basic misunderstanding of Apple's CSAM technology, other critics raise legitimate concerns about mission creep and violations of user privacy that the company did not initially address.

Emma Llansó of the Center for Democracy and Technology (CDT) said in an interview that what Apple is showing with its announcement last week is that there are technical weaknesses it is willing to build in. "It seems so out of step from everything that they had previously been saying and doing."


Apple has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material. However, critics say a fundamental problem with Apple's plan to scan for child abuse images is that the company is making cautious policy decisions that it can later be forced to change, now that the capability exists, in exactly the same way it warned would happen if it were forced to break into a terrorism suspect's phone.

For its part, Apple has said it will scan only in the US at first, with other countries added one by one; only when images are set to be uploaded to iCloud; and only for images that have been flagged by the National Center for Missing & Exploited Children (NCMEC) and a small number of other groups.
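
That description maps to a simple flow: fingerprint each image queued for iCloud upload and compare the fingerprint against a database of hashes supplied by child-safety groups. The Python sketch below is only an illustration of that gating logic under stated assumptions; the function names, the use of SHA-256, and the match threshold are placeholders, not Apple's actual system, which relies on a perceptual hash it calls NeuralHash.

import hashlib

# Hypothetical blocklist of fingerprints for known abuse images, standing in
# for the hash database that NCMEC and other groups would supply.
# The entry below is a placeholder, not a real value.
KNOWN_CSAM_HASHES = {
    "placeholder-fingerprint-of-a-known-image",
}

# A single match is not enough; a threshold of matches must be crossed
# before an account is flagged for review. The value 3 is arbitrary,
# chosen only for this sketch.
MATCH_THRESHOLD = 3

def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint. Apple uses a perceptual hash (NeuralHash) so
    resized or recompressed copies of an image still match; plain SHA-256
    is used here purely to keep the sketch self-contained."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_upload_queue(icloud_bound_images: list[bytes]) -> bool:
    """Scan only images queued for iCloud upload, mirroring the gating Apple
    describes: local-only photos are never checked. Returns True when the
    number of matches reaches the reporting threshold."""
    matches = sum(
        1
        for image in icloud_bound_images
        if image_fingerprint(image) in KNOWN_CSAM_HASHES
    )
    return matches >= MATCH_THRESHOLD

Per Apple's published technical summary, the real comparison uses private set intersection and threshold secret sharing, so the device never learns whether an individual image matched and Apple's servers can read the match vouchers only after the threshold is crossed; the sketch above compresses all of that into a single local check for readability.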

