eXtensions - Monday 16 August 2021





Apple Bets the Family Silver


By Graham K. Rogers



[Image: Family silver]



Apple has been trying hard to minimize the damage from the announcement of CSAM image detection, and now admits that staggering the releases might have been better. In an effort at damage control, Craig Federighi was interviewed by Joanna Stern (WSJ), but many still have questions about future privacy implications.



Every once in a while, a story appears that will not go away. For example, in 2011 it was found that Apple was tracking users with what it called predictive locations. If the iPhone had been backed up, the files could be examined on a Mac. I looked at this in April that year and found locations marked for a trip to the north of Thailand: places I had been, and some within 50 km, closer to the border with Myanmar, that I might have visited.


[Image: locations]


The data was shared with service providers to help with advertising. Apple made changes and those map files could no longer be accessed on the Mac, but the story lingered online for a long while. A good overview by Jacqui Cheng (Ars Technica) is also available.

The last 10 days have seen Apple trying to walk back some of the criticisms leveled at it over its CSAM detection system, particularly by privacy advocates. As was evident early in the online comments, some missteps had clearly been made. For example, last week John Gruber (Daring Fireball) wondered why the three announcements had been made together, which made the situation appear worse than it perhaps was. Apple now seems to agree with this assessment.

There was no real lessening of the criticisms online, but significantly the week ended with Apple bringing out some big guns and releasing a further document that falls somewhere between the initial technical outline and the simpler FAQ. The three documents themselves are significant. Later articles have been more measured, some comparing how others handle similar detection of CSAM images, as well as the pressures from regulators, but there is still the sense that, for a general good (child protection), Apple may have opened a door that could be used for political purposes in the future. While Apple denies this possibility, a number of informed commentators insist that the door is open, and even writers normally positive towards Apple are less sure.


On Wednesday I looked at comments from Apple's privacy chief, Erik Neuenschwander, who tried to explain the built-in privacy protections of the new system (Hartley Charlton, MacRumors, et al). He claimed that Apple now has the "technology that can balance strong child safety and user privacy", adding that the device is still encrypted. Asked about pressure from foreign governments, he said that the system only works with hashes that match images in a single database in the USA, specifically used for identifying child abuse images, and it does not work if iCloud Photos is not in use. In recent days it has been reported that several users have expressed the intention to downgrade their iCloud accounts to the free 5GB tier and remove their photos (Stephen Warwick, iMore).
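
As a rough illustration of that claim, the logic amounts to something like the sketch below. The names and values are hypothetical, and the real system relies on NeuralHash, blinded hashes and private set intersection, none of which is reproduced here; this only shows the two conditions Neuenschwander describes: a single fixed database, and matching tied to iCloud Photos being in use.

```python
import hashlib

# Stand-in for the single, fixed database of known-image hashes.
# The strings here are placeholders, not real values.
KNOWN_HASHES = {"a3f1c2-placeholder", "9b07de-placeholder"}

def toy_hash(image_bytes: bytes) -> str:
    # Stand-in only: Apple's NeuralHash is a perceptual hash, tolerant of
    # resizing and recompression; a cryptographic digest like this is not.
    return hashlib.sha256(image_bytes).hexdigest()

def check_before_upload(image_bytes: bytes, icloud_photos_enabled: bool) -> bool:
    # Matching is tied to the iCloud Photos upload path: with iCloud Photos
    # turned off, no comparison against the database takes place at all.
    if not icloud_photos_enabled:
        return False
    return toy_hash(image_bytes) in KNOWN_HASHES
```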

Jason Snell (Macworld), normally one of the more partisan commentators on Apple, expressed his discomfort, not with the CSAM image scanning itself, but with the way this has all been rolled out, noting that Apple "doesn't seem to have anticipated all the pushback its announcement received". Snell's explanation of the way the technology is to be used is easy to absorb, but he notes that there appears to be a missing piece: "there's another shoe to drop here, one that will allow Apple to make its cloud services more secure and private". Commenting further on the implementation, he notes that the discomfort that others are feeling is valid. Although this tool has a specific design, there is always the "what if?" at some time in the future.


Slate tends to be more neutral when dealing with Apple, so the comments from Jake Dean, and the support he brings in, are worth considering, especially when the fears about expansion of the iCloud image scanning extend to messaging, expressly mentioning dissidents and families with gay kids. Both of these groups might use messaging to discuss problems.

Dean explains the different parts of the image identification systems clearly, but when examining the iCloud Photos aspect and the use of hashes, he outlines the fear many have that Apple may not be able to stop the expansion to other areas, as it has claimed: refusing certain governments may not be enough. We have seen the way that China has forced Apple (and Google) to make changes, and recently Apple had to ensure that specific apps were offered on iPhones sold in Russia.


On Friday, a report from Reuters showed that there was also dissent within Apple. There has been increased activism within the corporation in recent months, with a number of leaks about ongoing situations (the hiring of Antonio Garcia Martinez; the adjustment of work-from-home conditions).

Now staff are unhappy with the CSAM image implementation, particularly the worry that the "feature could be exploited by repressive governments looking to find other material for censorship or arrests". Other opinions expressed match some of the comments in outside reports: the worry that Apple has opened a door that cannot be shut.


It was clear from a number of reports, and from Apple's reactions, including the FAQ and the follow-up document, that there was considerable surprise within Cupertino that the announcements concerning CSAM detection went down like a lead Zeppelin. Ben Lovejoy (9to5Mac) opens with this point: "It took Apple completely by surprise. Which is a surprise" [my italics]. Most of what follows is similar to what had already been written, summarizing the ideas, but then Lovejoy comes to a section headed "Blindness to its own brand image".

The paragraph ends with, "I am frankly stunned that Apple didn't understand that any reduction of privacy, however minor it may be, and however good the reason, was going to create massive waves." He looks further at the missteps and notes, like others, that the announcement was severely mishandled. Also like others, he raises the question of whether this could be related to a future announcement on encryption; if so, he comments, it was done in the wrong order.

Lovejoy had amended his original content to reflect the change in the Force that occurred when The Wall Street Journal put out an interview by Joanna Stern with Craig Federighi on the confusion (my original link - Benjamin Mayo, 9to5Mac). This is a recognition by Apple that much more work needs to be done to assuage the fears of users (and critics) over the announcement. By rolling out Federighi, who has a good public image, Apple shows that it knows it has stumbled. He acknowledges this with, "we wish that this had come out a little more clearly, because we feel very positively and strongly about what we are doing, and we can see that it has been widely misunderstood".


A link to the video interview with Joanna Stern, which begins with this mea culpa, is available on YouTube. The video is just under 12 minutes long. A good point early in the video is that other cloud services currently seek out these images by looking at the photos themselves, but Apple wanted to scan without looking at them. From information I have seen elsewhere, such visual examination puts considerable stress on those employed to sift through the thousands of images and other disturbing content that are online (e.g. Casey Newton, The Verge). At some time in the near future, Apple and others may be forced (by legislation) to look for such images, and this non-intrusive method seems a reasonable solution.

An interesting point was the threshold that Apple intends to use before there is human intervention. The original technical document had hypothesized that the algorithm would escalate the examination at 10 images, and other speculation suggested 5, but Federighi states that the threshold is 30 images. To me that clears away any suggestion that this was an accidental download: with 30 images or more, the user is a collector. Stern points out that Facebook, Google and Microsoft already access the NCMEC image database and scan images directly, but Apple wants to do this in software, for privacy reasons.
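
The threshold itself is simple to picture. What follows is only a sketch with hypothetical names: in the real system the count is kept cryptographically (via threshold secret sharing), so that Apple learns nothing about any account until the threshold is actually crossed, but the decision it encodes is essentially this.

```python
REVIEW_THRESHOLD = 30  # the figure Federighi gives in the WSJ interview

def needs_human_review(library_hashes: list[str], known_hashes: set[str]) -> bool:
    # Count how many images in a library match the known database. One or two
    # matches trigger nothing; only a collection at or above the threshold is
    # escalated for manual review.
    matches = sum(1 for h in library_hashes if h in known_hashes)
    return matches >= REVIEW_THRESHOLD
```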

When Stern tackles Federighi about the software "coming on to my device", he responds that this is a common misunderstanding and explains that what happens is part of a complex process, with separate actions in "the pipeline" from Library to iCloud. She then asks if this is a back door, repeating the fears that so many have, but Federighi is adamant: in no way is this a back door, adding that he really does not understand this characterization. It sounds almost naive when he says it with some exasperation. However, he insists that the concern many have about a future request from a government to identify other messages or images is rejected, because the multiple levels in the process pipeline protect the user (and the device) from any such incursions.


While the previous weekend had been filled with news items and comments on the CSAM rollout, this weekend, following the Federighi interview, was quiet; but I do not think that Apple has slain this dragon just yet.


Graham K. Rogers teaches at the Faculty of Engineering, Mahidol University in Thailand. He wrote on IT subjects in the Bangkok Post's Database supplement, and for the last seven years of Database wrote a column on Apple and Macs. After three years writing a column in the Life supplement, he is no longer associated with the Bangkok Post. He can be followed on Twitter (@extensions_th).




All content copyright © G. K. Rogers 2021