Apple's plan to scan US iPhones raises privacy flags

Apple has announced plans to scan iPhones for images of child abuse, raising immediate concerns about user privacy and surveillance.

Has Apple's iPhone become an iSpy?

Apple says its system is automated, doesn't scan the actual images themselves, uses some form of hash data system to identify known instances of child sexual abuse material (CSAM), and says it has safeguards in place to protect privacy.

Privacy advocates warn that now that it has created such a system, Apple is on a difficult path toward an inexorable expansion of on-device content analysis and reporting that could be, and quite possibly will be, abused by some nations.

What does the Apple system do?

There are three main elements to the system, which will be lurking in iOS 15, iPadOS 15, and macOS Monterey when they launch later this year.
  • Scanning your images

  • Apple's system checks all images stored in iCloud Photos to see whether they match the CSAM database held by the National Center for Missing & Exploited Children (NCMEC). Images are checked on the device using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is stored securely on users' devices. When an image is stored in iCloud Photos, a matching process takes place. Should an account cross a threshold of multiple instances of known CSAM content, Apple is alerted. If alerted, the data is manually reviewed, the account is disabled, and NCMEC is informed. The system is not perfect, however. The company claims there is less than a one-in-one-trillion chance per year of incorrectly flagging a given account; with well over a billion users, that still works out to something like a one-in-a-thousand chance each year that somebody is misidentified. Users who believe they have been flagged in error can appeal. (A conceptual sketch of this matching flow appears after this list.)
  • Analyzing your Messages

  • Apple's system uses on-device machine learning to examine images in Messages sent or received by minors for sexually explicit content, warning parents if such images are identified. Parents can enable or disable the system, and any such content received by a child will be blurred. If a child attempts to send sexually explicit content, they will be warned and their parents can be notified. Apple says it does not gain access to the images, which are analyzed on the device.
  • Watching what you search for

  • The third part consists of updates to Siri and Search. Apple says these will now provide parents and children with expanded information and help if they run into unsafe situations. Siri and Search will also intervene when people make search queries deemed to be related to CSAM, explaining that interest in the topic is problematic. Apple helpfully tells us that its program is "ambitious" and that these efforts "will evolve and expand over time."
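To make the first of those elements more concrete, here is a minimal, hypothetical Swift sketch of threshold-based on-device matching. None of these names are Apple APIs, NeuralHash is not public, and the real design uses a blinded hash database and private-set-intersection cryptography so that, unlike this toy version, neither the device nor Apple learns anything about individual matches below the threshold.

```swift
import Foundation

// Hypothetical illustration only; not Apple's implementation.
// A simple string stands in for a NeuralHash-style perceptual hash,
// and a plain Set stands in for the blinded on-device hash database.

typealias PerceptualHash = String

struct SafetyVoucher {
    let imageID: UUID
    let encryptedPayload: Data   // metadata that could only be opened past the threshold
}

final class OnDeviceMatcher {
    private let knownHashes: Set<PerceptualHash>  // known-CSAM hashes shipped to the device
    private let reportingThreshold: Int           // multiple matches required before review
    private var pendingVouchers: [SafetyVoucher] = []

    init(knownHashes: Set<PerceptualHash>, reportingThreshold: Int) {
        self.knownHashes = knownHashes
        self.reportingThreshold = reportingThreshold
    }

    /// Called as a photo is uploaded to iCloud Photos. Returns true once the
    /// account has crossed the threshold and would be surfaced for human review.
    func process(imageID: UUID, hash: PerceptualHash, payload: Data) -> Bool {
        guard knownHashes.contains(hash) else {
            return false          // no match: nothing is recorded about this image
        }
        pendingVouchers.append(SafetyVoucher(imageID: imageID, encryptedPayload: payload))
        return pendingVouchers.count >= reportingThreshold
    }
}

// Usage: a non-matching image produces no voucher; only repeated matches trip the threshold.
let matcher = OnDeviceMatcher(knownHashes: ["a1b2c3", "d4e5f6"], reportingThreshold: 3)
print(matcher.process(imageID: UUID(), hash: "000000", payload: Data()))  // false
```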

Some technical data

The company has published a detailed technical paper that explains its system in a little more depth. In the paper, it tries to reassure users that it learns nothing about images that do not match the database.

Apple's technology, called NeuralHash, analyzes known CSAM images and converts them into a unique number specific to each image. Only another image that appears nearly identical will produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value.

As images are added to iCloud Photos, they are compared against that database to identify a match. If a match is found, a cryptographic safety voucher is created which, as I understand it, will also allow an Apple reviewer to decrypt and access the offending image if the threshold for such content is reached and action is required. “Apple can learn the relevant image information only once the account has more than a threshold number of CSAM matches, and even then only for matching images,” the document concludes.
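That "learn only once the threshold is crossed" property rests on a threshold cryptographic scheme; Apple's technical summary describes it in terms of threshold secret sharing. The toy Swift sketch below uses a Shamir-style scheme over a small prime field to show the underlying principle: with fewer shares (vouchers) than the threshold, reconstruction yields nothing useful. It illustrates the idea only, is not Apple's construction, and every name in it is invented here.

```swift
import Foundation

// Toy Shamir-style threshold secret sharing over a small prime field.
// Apple's safety-voucher design is far more involved (blinded hashes,
// private set intersection, per-image keys), but the core property is
// the same: below the threshold, the secret cannot be recovered.

let prime: Int64 = 2_147_483_647   // 2^31 - 1, a Mersenne prime

func mod(_ x: Int64) -> Int64 { ((x % prime) + prime) % prime }

// Operands stay below 2^31, so the product fits in Int64.
func mulmod(_ a: Int64, _ b: Int64) -> Int64 { mod(mod(a) * mod(b)) }

// Modular exponentiation, used for inverses via Fermat's little theorem.
func powmod(_ base: Int64, _ exp: Int64) -> Int64 {
    var result: Int64 = 1, b = mod(base), e = exp
    while e > 0 {
        if e & 1 == 1 { result = mulmod(result, b) }
        b = mulmod(b, b)
        e >>= 1
    }
    return result
}

func invmod(_ a: Int64) -> Int64 { powmod(a, prime - 2) }

struct Share { let x: Int64; let y: Int64 }

/// Split `secret` into `count` shares, any `threshold` of which reconstruct it.
func split(secret: Int64, threshold: Int, count: Int) -> [Share] {
    // Random polynomial of degree threshold - 1 with constant term = secret.
    var coeffs = [mod(secret)]
    for _ in 1..<threshold { coeffs.append(Int64.random(in: 1..<prime)) }
    return (1...count).map { i -> Share in
        let x = Int64(i)
        var y: Int64 = 0
        var xPow: Int64 = 1
        for c in coeffs {
            y = mod(y + mulmod(c, xPow))
            xPow = mulmod(xPow, x)
        }
        return Share(x: x, y: y)
    }
}

/// Lagrange interpolation at x = 0 recovers the secret, but only with enough shares.
func reconstruct(_ shares: [Share]) -> Int64 {
    var secret: Int64 = 0
    for (i, si) in shares.enumerated() {
        var num: Int64 = 1, den: Int64 = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = mulmod(num, mod(-sj.x))
            den = mulmod(den, mod(si.x - sj.x))
        }
        secret = mod(secret + mulmod(si.y, mulmod(num, invmod(den))))
    }
    return secret
}

// With a threshold of 3, two "vouchers" reveal nothing useful; three recover the key.
let shares = split(secret: 42, threshold: 3, count: 5)
print(reconstruct(Array(shares.prefix(2))))   // below the threshold: effectively a random value
print(reconstruct(Array(shares.prefix(3))))   // 42: the threshold is met
```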

Apple isn't unique, but on-device scanning may be

Apple isn't the only company required to share CSAM images with the authorities. By law, any US company that finds such material on its servers must work with law enforcement to investigate it. Facebook, Microsoft, and Google already have technologies that scan for such material shared on email or messaging platforms.

The difference between those systems and this one is that the scanning takes place on the device, not on the company's servers. Apple has always claimed its messaging platforms are end-to-end encrypted, but that becomes something of a semantic claim if the contents of a person's device are scanned before encryption even takes place.

Child protection is, unsurprisingly, something most rational people support. But what worries privacy advocates is that some governments may now attempt to force Apple to search for other material on people's devices. A government that outlaws homosexuality might demand that such content also be monitored, for example. What happens if a teenager in a nation that prohibits non-binary gender asks Siri for help coming out? And what about smart listening devices such as the HomePod? It isn't obvious that the search component of this system is being deployed there, but it is conceivable that it could be. Nor is it yet clear how Apple will be able to guard against such mission creep.

Privacy advocates are extremely alarmed

Most privacy advocates feel there is a significant danger of mission creep inherent in this plan, which does nothing to sustain belief in Apple's commitment to user privacy. How can any user feel their privacy is protected if the device itself is spying on them, and they have no control over how?

The Electronic Frontier Foundation (EFF) warns that this plan effectively creates a security backdoor. All it would take to widen the narrow backdoor Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration settings to scan not just children's accounts but anyone's. It's not a slippery slope; it is a fully built system just waiting for external pressure to make the change.

“When Apple develops technology that can scan encrypted content, you can't just say, 'Well, I wonder what the Chinese government would do with this technology.' It's not theoretical,” warned Johns Hopkins professor Matthew Green.

Alternative arguments

There are other arguments. One of the most compelling is that the servers of ISPs and email providers are already scanned for such content, and that Apple has built a system that minimizes human involvement and only flags a problem if it identifies multiple matches between the CSAM database and content on the device.

There is no question that children are at risk. Of the tens of thousands of runaways reported to NCMEC last year, around one in six was likely a victim of child sex trafficking. The organization's CyberTipline (to which I imagine Apple's system will report in such cases) received millions of reports relating to some form of CSAM last year.

John Clark, president and CEO of NCMEC, said: "With so many people using Apple products, these new safety measures have the potential to save the lives of children who are lured online and whose horrific images are circulated around the world. At the National Center for Missing & Exploited Children, we know that this crime can only be combated if we remain committed to protecting children. We can only do this because technology partners like Apple are stepping up and making their commitment known."

Others say that by creating a system to protect children from such egregious crimes, Apple is removing an argument some might use to justify device backdoors in a broader sense. Most of us agree that children should be protected, and in doing so Apple has eroded an argument certain repressive governments might use to force the issue. Now it must stand firm against any mission creep on the part of such governments.

That last challenge is the biggest problem, given that Apple, when pushed, will always follow the laws of the governments in the countries it operates in.

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," warned privacy advocate Edward Snowden. If they can scan for CSAM today, "they can scan for anything tomorrow."

Follow me on Twitter, or join me in the AppleHolic's bar & grill and Apple Discussions groups on MeWe.
Copyright © 2021 IDG Communications, Inc.