eXtensions - Saturday 7 August 2021





Saturday Comment: Apple Initiative on Child Safety: Pass/Fail


By Graham K. Rogers



Cassandra



Despite the best intentions, Apple's outline of its 3-part children's safety initiative disturbed several commentators with its apparently cavalier attitude to privacy. Some early reactions may not have been based on full information, and while Apple will need to work to reassure users, the long-term view may be less worrisome. However, perception is all.



There are usually three ideas that politicians and law enforcement roll out as being hampered by both legislation and encryption: terrorism, organized crime and child pornography. The last of these is also known as Child Sexual Abuse Material (CSAM). All existed long before the internet was thought of, although some societies tolerated what we now think of as child abuse: for example, the depiction on the Warren Cup in the British Museum. Society these days no longer accepts this behavior, although some countries have traditions involving children (bacha bazi in Afghanistan, and similar practices in Morocco) that may still be tolerated. The Sacred Band of Thebes is another example that could never happen these days, particularly after Don't Ask, Don't Tell.

In the early 1970s I took a call at a police station from a concerned member of the public asking about the showing of pornographic movies: that ended with a raid and several arrests. The 16mm film arrived later and was run the next evening by the officer in the case to provide evidence: I have seen the film and it is pornographic - likely to deprave and corrupt. For some odd reason, cars from all over the county were in the area on enquiries and just dropped in for a break. It was actually fairly mild but was clearly pornographic, and the performers were certainly in their late teens and 20s. With other media, such as photographs and magazines, even in the days before widespread electronic distribution, there was a lot going around.

The internet has perhaps expanded that - although I am not totally convinced - as it has made certain materials more widely available. I read news from many different countries each day and it is clear that, several times a month, the police have enough evidence to convict scores of people for possession of images of child pornography, bestiality and child cruelty. Apart from the initial contact regarding that movie, and the arrest of a pair of teenage brothers who had been stealing ladies' underwear from local washing lines, my exposure to anything like that has been limited, but I would know it if I saw it.


Apple disturbed the Force at the end of the week by announcing that it will introduce protections for children in iOS 15 that, among other things, flag suspect messages sent to children and add a system to help detect CSAM in iCloud Photos. This is initially for users in the USA, although (apart from China) it is not clear whether iCloud data from users in other countries is stored in the USA. Uploaded images will be examined automatically to find potential CSAM matches and, if there are positive results, these will be handed off to human reviewers. It sounds fine in theory, but the implications could be far-reaching. The reaction was swift and negative, although some of the reactions are wrong (see Gruber, below).

I have read scores of comments on Twitter and other online sources that were critical not just of the search for CSAM but of Apple's apparently cavalier attitude to user privacy, a cause the company has made its own in recent years and for which it has won much respect. It now seems as if this is to be jettisoned. As Apple has previously claimed about encryption, once you open a door for the good guys it is open for everyone. Edward Snowden wrote, "No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow" (Stephen Warwick, iMore).

These CSAM checks cannot be run if iCloud Photos is turned off. That also means these checks cannot be run on the devices (synchronization implies that a photo in iCloud is also on the device). Nor can iCloud backups be checked: presumably these are compressed or in a specific file format, as a Time Machine backup on a disk would be. Those who are determined to store such images on their devices will simply look for new methods to do this, or seek out other ways to back up their data.

The EFF was among a number of organizations that were critical of the approach. Dorset Eye mentions several critics in its overview of Apple's announcement, as well as a number of others who support the move. The consensus seems to be that CSAM is clearly bad, but that opening this door will lead to other demands being made and an erosion of trust, particularly in Apple and its devices. Somehow, Apple is going to have to walk this back carefully. MacDailyNews, normally one of the more partisan Apple sources, writes in reporting the EFF comments, "Apple must have been placed in an untenable situation to introduce this backdoor, destroying their vaunted claims to protecting privacy, or Tim Cook has completely lost the plot." The report enlarges on the idea of external pressure bringing this about, but the writer is not convinced, ending with, "We expected Apple to be better. Apple failed."


A regular and trusted contributor to information about Apple is John Gruber (Daring Fireball), whose long and well-researched articles are essential reading and put some of the criticisms leveled at Apple into a more balanced context. He has done his homework here too and accepts that the lack of understanding has justifiably led to criticism from privacy advocates. He splits the changes into three distinct sections: Siri and Search; Messaging (only with iCloud Family accounts); and CSAM detection for iCloud Photo Library. Using expert information, he explains how certain images (not all nudes) may be "fingerprinted".

The process that does this is complex and involves two layers of encryption. An image in the iCloud library goes through a process that assigns a unique numerical identifier to it, rather than analyzing the image itself. A single matching image is not enough to trigger action; only when a threshold of matches is crossed does it reach the stage where "someone at Apple will examine the contents of the safety vouchers for those flagged images before reporting the incident to law enforcement" [my italics].
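As a rough sketch of that threshold idea only - not Apple's actual system, which uses its NeuralHash fingerprinting and encrypted safety vouchers, neither of which is reproduced here - the logic might look something like the following. The database contents, the threshold value, the function names and the use of SHA-256 as a stand-in fingerprint are all illustrative assumptions:

import hashlib

KNOWN_FINGERPRINTS = {"ab12...", "cd34..."}   # placeholder identifiers of known images
MATCH_THRESHOLD = 30                          # placeholder; below this, nothing happens

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in only: a real perceptual fingerprint survives resizing and
    # re-encoding, which a plain cryptographic hash does not.
    return hashlib.sha256(image_bytes).hexdigest()

def needs_human_review(uploaded_images: list[bytes]) -> bool:
    # Count matches against the database of known identifiers; only a count
    # at or above the threshold would flag the account for review of the
    # flagged images.
    matches = sum(1 for image in uploaded_images
                  if fingerprint(image) in KNOWN_FINGERPRINTS)
    return matches >= MATCH_THRESHOLD

The point of the threshold, as Gruber explains, is that no single match - and no single false positive - can put an account in front of a human reviewer.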

Gruber notes that the whole package is grouped under a safety banner but wonders whether announcing everything together was an error. This is a valid point, as users can easily understand the need to prevent undesirable communication with children (e.g. dick pics). Including the image library in the same announcement has elevated it (perhaps) to a more political level: perception is all, and Apple should have been sensitive to this.

This lengthy article is worth examining carefully, particularly Gruber's comments on the potential for features to come, which others like Google or Facebook may well take up. Apple could also add end-to-end (E2E) encryption of the photo library itself, which is currently unavailable although other Apple services are E2E encrypted. This is speculation but makes much sense. He notes, however, that despite these inbuilt safeguards, we are "still left with completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future."


As Friday rolled into Saturday, more and more articles could be found online criticizing this announcement from Apple and what it may signify in the long term. There were reactions from within Apple (see links below) and there was probably extensive internal communication both before and after the plan was made public. Apple did not go into this lightly, and this initiative could head off those politicians who wave the child pornography banner when advocating greater control.

The idea put forward by Apple may be noble, but the way this has been presented - and the perceptions it has created - will not leave Cupertino unstained after its previous championing of privacy. Whether this is a good idea is still up in the air, but it all leaves a nasty taste in the mouth. Heads will roll.


Other Related Links

  • Expanded Protections for Children (Apple Statement) - includes useful links

  • Apple Will Scan Photos Stored on iPhone, iCloud for Child Abuse: Report (Jaron Schneider, PetaPixel)

  • Apple reassures employees over child protection measures in internal memo (Stephen Warwick, iMore)

  • Internal memo from software VP at Apple talks iCloud Photo scanning, maintains 'deep commitment to user privacy' (Evan Selleck, iDownloadBlog)

  • A Slippery Slope? Apple Will Soon Snoop on Your Photos (Jefferson Graham, PetaPixel)

  • Apple Addresses CSAM Detection Concerns, Will Consider Expanding System on Per-Country Basis (Joe Rossignol, MacRumors) [useful diagram included]


    Graham K. Rogers teaches at the Faculty of Engineering, Mahidol University in Thailand. He wrote on IT subjects in the Bangkok Post's Database supplement, and for the last seven years of Database he wrote a column on Apple and Macs. After three years writing a column in the Life supplement, he is no longer associated with the Bangkok Post. He can be followed on Twitter (@extensions_th)









    All content copyright © G. K. Rogers 2021