Apple will check iCloud users' pictures for child abuse imagery
CES 2020 marked Apple's unofficial return to the show, its first appearance since former CEO John Sculley introduced the Newton personal digital assistant there in 1992. This time, however, the technology giant came to Las Vegas not to demonstrate new products but to draw attention to the issue of personal data security.
The check runs automatically and does not scan images in the traditional sense. Instead, the detection technique matches special digital hash signatures, fingerprints that identify pictures containing known prohibited content. In this way users' privacy is preserved, and their photos are not subjected to wholesale viewing.
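The matching described above can be illustrated with a minimal sketch. This is not Apple's actual algorithm: real systems such as PhotoDNA use perceptual hashes that survive resizing and recompression, and the hash database would come from child-safety organisations. Here a plain cryptographic digest and a toy database stand in for both.

```python
import hashlib

# Hypothetical database of hash "fingerprints" of known prohibited images.
# In a real deployment these would be perceptual hashes supplied by a
# child-safety organisation, not plain SHA-256 digests of the file bytes.
KNOWN_HASHES = {
    # sha256(b"test"), standing in for a known image's fingerprint
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digest of the raw bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """Compare the picture's fingerprint against the known-hash database.

    The picture itself is never opened or viewed; only its digest is checked,
    which is the privacy property the article describes.
    """
    return fingerprint(image_bytes) in KNOWN_HASHES

print(is_flagged(b"test"))       # matches the toy database -> True
print(is_flagged(b"vacation"))   # unknown fingerprint -> False
```

The design point is that the comparison happens entirely in hash space: the service learns only whether a fingerprint is on the list, not what an unmatched picture contains.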
During the presentation of the new privacy terms, it emerged that Apple's approach resembles Microsoft's PhotoDNA technology, which Facebook, Twitter and Google already use to prevent the spread of child sexual abuse material. Apple, however, did not mention PhotoDNA.
Meanwhile, Microsoft continues to develop its own system for flagging prohibited content. The new free tool, created under Project Artemis, monitors not images but user conversations, in order to identify grooming and other violations related to the sexual exploitation of children.
Under Apple's new privacy terms, it was also announced that any illegal content found will be automatically deleted from iCloud. Notably, in 2014 Google informed law enforcement that some Gmail users were storing illegal content.
Source: nakedsecurity