Apple says it won't expand controversial CSAM technology

Apple has tried to deflect criticism of its controversial CSAM protection system, but in doing so it has illustrated exactly what is at stake.

The big conversation

Apple announced last week that it will introduce a set of child-protection measures in iOS 15, iPadOS 15, and macOS Monterey when those operating systems ship this fall.
Among other protections, the on-device system scans images bound for your iCloud Photo Library for matches against known CSAM (child sexual abuse material). Protecting children is, of course, entirely appropriate, but privacy advocates remain concerned that Apple's system could grow into full-fledged surveillance. In an attempt to blunt the criticism, Apple has released new information that explains a little more about how the technology works. As described in an Apple white paper, the technology converts images on your device into a digital hash that can be compared against a database of hashes of known CSAM images when you upload them to iCloud Photos.
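To make the matching idea concrete, here is a minimal Python sketch of hash-based image matching. It is not Apple's actual system: Apple uses NeuralHash, a perceptual hash designed so visually similar images produce matching hashes, wrapped in cryptographic protections; the sketch below substitutes an exact SHA-256 digest and a placeholder hash database purely to illustrate the upload-time check.

```python
# Minimal sketch of upload-time hash matching -- illustrative only.
# Apple's real system uses NeuralHash (a perceptual hash, so near-duplicate
# images still match) plus cryptographic blinding; this sketch uses an
# exact SHA-256 digest just to show the flow.
import hashlib
from pathlib import Path

# Hypothetical database of digests of known images (placeholder values).
KNOWN_HASHES = {
    "3a1f9d0c5b7e",  # not a real entry
}

def image_digest(path: Path) -> str:
    """Hash the raw image bytes and return a hex digest."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def should_flag(path: Path, known: set) -> bool:
    """Simulated check run when an image is uploaded:
    flag it only if its digest appears in the known database."""
    return image_digest(path) in known

# Example: nothing about the image is learned unless its digest matches.
# should_flag(Path("photo.jpg"), KNOWN_HASHES)
```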

Make it a hash

Although the image analysis is performed on the device using Apple's hashing technology, images are not flagged or inspected unless they match known CSAM. Apple argues this is actually an improvement, in that the company never scans an entire library at once. "Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risk for all users," says the company's new FAQ. "CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM."

Despite those assurances, major concerns remain about how far the system could be extended to monitor other forms of content. After all, if you can turn a collection of CSAM images into identifiable data, you can turn anything into data against which personal content can be matched. Privacy advocate Edward Snowden warns: "Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow."
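That warning follows directly from the mechanics. Nothing in a hash-matching check knows what the reference hashes depict; continuing the hypothetical sketch above, repointing the same function at a different database is the only change needed.

```python
# Continuing the sketch above: the matcher is content-agnostic, so swapping
# the reference database is the only change needed to search for entirely
# different material. (Hypothetical placeholder values.)
OTHER_CONTENT_HASHES = {
    "77e0c2ab19f4",  # not a real entry
}

# Identical call, entirely different surveillance target:
# should_flag(Path("photo.jpg"), OTHER_CONTENT_HASHES)
```

That symmetry, rather than any flaw in the CSAM match itself, is what the EFF and Snowden are pointing at.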

Take it on trust?

Apple says it has no plans to extend the system into other areas. In its FAQ, it writes: "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it."

On the face of it that seems reassuring. But it stands to reason that now this technology exists, nations that want to force Apple to extend on-device surveillance to matters beyond CSAM will use whatever weapons they have to force the issue. "All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content," warned the Electronic Frontier Foundation. Preventing that will be a struggle. Which means Apple has a fight ahead.

The privacy war has begun

This may be a fight Apple wants to have. After all, the company has taken many significant steps to protect user privacy across its ecosystems, and it also backs changes in law to protect privacy online. "It is certainly time, not only for a comprehensive privacy law here in the United States, but also for worldwide laws and new international agreements that enshrine the principles of data minimization, user knowledge, user access, and data security across the globe," CEO Tim Cook said this year.

It could be argued that Apple's high-profile introduction of its child-protection measures has sparked a wider conversation about rights and privacy in an online, connected world. The only way to prevent the system from extending beyond CSAM is to help Apple resist the pressure to do so. Absent that support, Apple is unlikely to prevail against every government alone; if the company is not backed up, the question is when, not if, it will be forced to cave in. And yet governments could still reach an agreement on online privacy.

The stakes are high. The risk is that the bricks Cook has long tried to lay along the sunny path to justice could instead become bricks in a wall that stops that journey from taking place. The opportunity is that a determined effort could build frameworks that let the path reach its end. The controversy reflects just how bumpy that road appears to have become.

Follow me on Twitter, or join me in the AppleHolic's bar & grill and Apple Discussions groups on MeWe.
Copyright © 2021 IDG Communications, Inc.