Apple's Overly Broad Child Porn Crackdown: Good Intent, Bad Execution

Ah, Apple. Can't you weigh in on anything without making a mess of it?
The latest: Apple wants to use its considerable power to combat child pornography. As usual, the company means well and wants to advance a big goal, but it reaches so far that it hands people dozens of reasons to object. To paraphrase the old saying, the road to hell in this case begins at One Apple Park Way. Alternatively, think of Cupertino as the place where great ideas turn into monstrous executions.

It started last week, when Apple announced its intention to do something to curb child pornography and child exploitation. Good so far. Its tactic is to tell parents when their children view or send nude or sexually explicit images.

Before we get into the technological aspects of all this, let's take a brief look at the almost infinite number of ways this could go wrong. (This may be where Apple's former headquarters got its name, Infinite Loop.) Think of young adolescents exploring their feelings, trying to understand their desires and thoughts, only to have that exploration immediately shared with their parents. Doesn't a child have the right to discuss those feelings with whomever they want, whenever they want? As others have pointed out, in some homes these children could be severely punished, all because of a search on their own phone as they tried to make sense of their own minds.

As a parent, I have serious doubts about whether this is necessarily the right call for the child. But whether it is or not, I know I don't want Apple's engineers, and certainly not Apple's algorithms, making that decision. For more arguments about the privacy implications, here's a great open letter. And remember that, as a matter of policy, Apple conducts business in a manner that complies with local laws and regulations. Then think about how some countries view these issues, and let that sink in.

As Apple put it, the changes "will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content…" And: "As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it."

But there is a potentially worse problem for enterprise IT, and like all bad things, it involves circumventing encryption. Let's start with Apple's announcement. Here's a longer part of the statement, to provide more context:

"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image. Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user's account, and sends a report to NCMEC."
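To make "threshold secret sharing" concrete, here is a minimal sketch of a Shamir-style scheme in Python. It is not Apple's implementation (the field size, the share handling, and how shares would be tied to safety vouchers are all simplified for illustration), but it shows the property Apple is describing: a secret, such as a decryption key, can only be reconstructed once at least a threshold number of shares is available.

    import random

    PRIME = 2**127 - 1  # a large prime field, big enough for a toy 127-bit secret

    def make_shares(secret, threshold, total):
        # Random polynomial of degree threshold-1 whose constant term is the secret.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        shares = []
        for x in range(1, total + 1):
            y = 0
            for c in reversed(coeffs):      # Horner's rule: evaluate f(x) mod PRIME
                y = (y * x + c) % PRIME
            shares.append((x, y))
        return shares

    def recover_secret(shares):
        # Lagrange interpolation at x = 0 recovers the constant term (the secret).
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    key = random.randrange(PRIME)                 # stand-in for an account's decryption key
    shares = make_shares(key, threshold=5, total=30)
    print(recover_secret(shares[:5]) == key)      # True: five shares reconstruct the key
    print(recover_secret(shares[:4]) == key)      # False (overwhelmingly likely): four do not

In Apple's description, each matching image effectively contributes one more share; once enough matches accumulate, the vouchers become readable by Apple. Keep that gate in mind for what follows.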
The statement adds that if users feel their account has been mistakenly flagged, they can file an appeal to have it reinstated.

Before we get to the technology issues, let's try to be realistic about how fast, easy, and convenient Apple will undoubtedly make that appeals process. I think it's safe to say that many of these kids will be collecting Social Security long before they see an appeal decision and a full explanation.

Pay close attention to this part: "Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images."

Two things are going on here that should scare the crap out of any CISO or cybersecurity staffer. For all the cryptographers out there, this will probably blow your mind.

First, Apple's system examines images before they are encrypted. It doesn't break encryption so much as sidestep it; from a cyberthief's point of view, that's not much of a difference. Second, consider this from the last quoted line: "Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the content…" That is asking for a nightmare. If Apple's encryption controls can be opened "when the threshold is exceeded," then all a criminal needs to do is trick the system into thinking the threshold has been crossed. Forget the porn; this could become a handy back door for viewing all kinds of content on the phone.

The whole premise of phone cryptography is that it is as close to absolute as possible. If you allow access to anything before encryption, or allow that encryption to be overridden whenever an algorithm concludes certain criteria are met, it is no longer secure. You are simply drawing a roadmap that lets attackers get at all of the data, one way or another. Is it a back door?
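Whatever you call it, the shape of the concern is easy to sketch. Here is a deliberately simplified illustration in Python, using ordinary SHA-256 in place of Apple's perceptual-hash and private set intersection machinery, a toy XOR placeholder in place of real encryption, and hypothetical names throughout (blocklist_hashes, encrypt_for_upload, and prepare_photo_for_cloud are mine, not Apple's). The point it illustrates: the plaintext image is inspected on the device first, and a match signal travels alongside the ciphertext.

    import hashlib
    import os

    # Hypothetical stand-in for the on-device database of known-bad image hashes.
    blocklist_hashes = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def encrypt_for_upload(data, key):
        # Placeholder "encryption" (XOR with a repeating key), purely for illustration.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def prepare_photo_for_cloud(image_bytes, device_key):
        # 1. The plaintext image is hashed and checked BEFORE any encryption happens.
        digest = hashlib.sha256(image_bytes).hexdigest()
        matched = digest in blocklist_hashes

        # 2. Only then is the image encrypted for upload. Here the match result rides
        #    along as plain metadata; in Apple's real design that role is played by the
        #    encrypted safety voucher, gated by the threshold scheme sketched earlier.
        return {"ciphertext": encrypt_for_upload(image_bytes, device_key),
                "match_flag": matched}

    voucher = prepare_photo_for_cloud(b"fake image bytes", os.urandom(32))
    print("uploaded with match flag:", voucher["match_flag"])

However the real mechanism is dressed up, the structure is the same: the decision about what gets flagged, and ultimately what Apple can read, is made by code running against unencrypted data. That is exactly the surface an attacker would target.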
Copyright © 2021 IDG Communications, Inc.