For centuries, cryptography was the exclusive preserve of the state. Then, in 1976, Whitfield Diffie and Martin Hellman came up with a practical method for establishing a shared secret key over an authenticated (but not confidential) communications channel without using a prior shared secret. The following year, three MIT scholars – Ron Rivest, Adi Shamir and Leonard Adleman – devised the RSA algorithm (named after their initials) for implementing it. It was the beginning of public-key cryptography – at least in the public domain.
From the very beginning, state authorities were not amused by this development. They were even less amused when in 1991 Phil Zimmermann created Pretty Good Privacy (PGP) software for signing, encrypting and decrypting texts, emails, files and other things. PGP raised the spectre of ordinary citizens – or at any rate the more geeky of them – being able to wrap their electronic communications in an envelope that not even the most powerful state could open. In fact, the US government was so enraged by Zimmermann’s work that it classified PGP as a munition, which meant that it was a crime to export it to Warsaw Pact countries. (The cold war was still relatively hot then.)
In the four decades since then, there’s been a conflict between the desire of citizens to have communications that are unreadable by state and other agencies and the desire of those agencies to be able to read them. The aftermath of 9/11, which gave states carte blanche to snoop on everything people did online, and the explosion in online communication via the internet and (since 2007) smartphones, have intensified the conflict. During the Clinton years, US authorities tried (and failed) to mandate that all electronic devices have a secret backdoor, while the Snowden revelations in 2013 put pressure on internet companies to offer end-to-end encryption for their users’ communications that would make them unreadable by either security services or the tech companies themselves. The result was a kind of standoff: between tech companies facilitating unreadable communications and law enforcement and security agencies unable to access evidence to which they had a legitimate entitlement.
In August, Apple opened a chink in the industry’s armour, announcing that it would be adding new features to its iOS operating system that were designed to combat child sexual exploitation and the distribution of abuse imagery. The most controversial measure scans photos on an iPhone, compares them with a database of known child sexual abuse material (CSAM) and notifies Apple if a match is found. The technology is known as client-side scanning or CSS.
Powerful forces in government and the tech industry are now lobbying hard for CSS to become mandatory on all smartphones. Their argument is that instead of weakening encryption or providing law enforcement with backdoor keys, CSS would enable on-device analysis of data in the clear (ie before it becomes encrypted by an app such as WhatsApp or iMessage). If targeted information were detected, its existence and, potentially, its source would be revealed to the agencies; otherwise, little or no information would leave the client device.
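In outline, the matching step the lobbyists describe is simple. The sketch below is purely illustrative and makes simplifying assumptions: real systems such as Apple’s use perceptual hashes (NeuralHash) and cryptographic protocols like private set intersection so that neither side learns the other’s data; a plain SHA-256 digest and an in-memory set stand in for those here, and the hash database and function names are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known targeted content
# (in a real deployment this would be perceptual hashes supplied by
# a child-safety organisation, not plain SHA-256 digests).
KNOWN_HASHES = {
    # sha256 of b"foo", used here as a stand-in "known bad" item
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(data: bytes) -> str:
    """Fingerprint content on-device, before any app encrypts it."""
    return hashlib.sha256(data).hexdigest()

def scan_outgoing(data: bytes) -> bool:
    """Return True if the content matches the targeted-content database.

    In a deployed system a match would trigger a report to the provider
    or the authorities; a non-match would leave the device silently.
    """
    return fingerprint(data) in KNOWN_HASHES

scan_outgoing(b"foo")          # matches the stand-in database
scan_outgoing(b"holiday.jpg")  # unknown content, nothing reported
```

The key point the sketch makes concrete is that the scan happens on the device, in the clear: whatever WhatsApp or iMessage later does with the message, the matching decision has already been taken before encryption ever applies.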
CSS evangelists claim that it’s a win-win proposition: providing a solution to the encryption v public safety debate by offering privacy (unimpeded end-to-end encryption) and the ability to successfully investigate serious crime. What’s not to like? Plenty, says an academic paper by some of the world’s leading computer security experts published last week.
The aim of the CSS lobbying is that the scanning software be installed on all smartphones, rather than covertly on the devices of suspects or, by court order, on those of ex-offenders. Such universal deployment would threaten the security of law-abiding citizens as well as lawbreakers. And although CSS leaves end-to-end encryption intact, that protection is moot if the message has already been scanned for targeted content before it was dispatched. Similarly, while Apple’s implementation of the technology scans only for images, it doesn’t take much to imagine political regimes scanning text for names, memes, political views and so on.
In reality, CSS is a technology for what in the security world is called “bulk interception”. Because it would give government agencies access to private content, it should really be treated like wiretapping and regulated accordingly. And in jurisdictions where bulk interception is already prohibited, bulk CSS should be prohibited as well.
In the longer view of the evolution of digital technology, though, CSS is just the latest step in the inexorable intrusion of surveillance devices into our lives. The trend started with reading our emails, moved on to logging our searches and our browsing clickstreams, mining our online activity to create profiles for targeting advertising at us and using facial recognition to allow us into our offices. It now continues by breaching the home with “smart” devices relaying everything back to motherships in the “cloud” and, if CSS were to be sanctioned, penetrating right into our pockets, purses and handbags. That leaves only one remaining barrier: the human skull. But, rest assured, Elon Musk undoubtedly has a plan for that too.
Wheels within wheels
I’m not an indoor cyclist but if I were, The Counterintuitive Mechanics of Peloton Addiction, a confessional blogpost by Anne Helen Petersen, might give me pause.
Get out of here
The Last Days of Intervention is a long and thoughtful essay in Foreign Affairs by Rory Stewart, one of the few British politicians who always talked sense about Afghanistan.
Blowing the Whistle on Facebook Is Just the First Step is a bracing piece by Maria Farrell in the Conversationalist about the Facebook whistleblower.