As the more subtle attempts at undermining security become impossible, spies will have to find alternative routes to access their targets. Earlier this year the UK government published the legal framework under which GCHQ and other British spies can hack, use bugging devices (or even steal and replace) computers, servers, routers, laptops, and mobile phones to either obtain information or conduct surveillance.
The guidelines create a legal framework for such behaviour under UK law, and even sanction intelligence-gathering activities that involve hacking people who are not themselves targets of the intelligence agencies.
This gives some credence to Snowden’s recent claim that intelligence agencies are targeting IT staff because they have access to systems and databases.
It’s also worth noting that, despite the anguished howls from law enforcement, spy agencies and others still have plenty of data left to sift.
Firstly, encryption is really, really hard to get right: as projects like Bullrun have proved, the intelligence agencies and law enforcement still have plenty of ways around it. There are legal tools, for example: the UK has legislation in place that makes it an offence to refuse to hand over encryption keys when requested by law enforcement, punishable by up to five years in prison.
And while many tech companies may well encrypt customers’ data when it is on the move — such as between datacentres — many will not secure it entirely using end-to-end encryption.
Why? Simply because they need to look at your email or web browsing themselves in order to sell advertising against its subject matter.
The advertising-driven business models of Silicon Valley rule out the pervasive use of strong end-to-end encryption, and that means intelligence agencies and police can continue to gain access to vast amounts of information.
Police and intelligence agencies still have plenty of other data sources — the metadata on communications, including who you have called, when, and for how long, CCTV, and more.
“Law enforcement agencies have access to more data now than they have had in the history of time. Pre-Facebook, how hard would it be for any law enforcement agency on the planet to find out all your known associates? They’d have to question dozens of people to find out who it is you know. They are able to get access to vast amounts of information just by asking,” said Privacy International’s Hosein.
“They complain that they’re not getting enough information but they’ve had more than they’ve ever had before,” he added.
Edinburgh Napier University’s Buchanan echoes the sentiment: “There are now so many ways that investigators can actually investigate someone who is suspected of committing a crime there isn’t really a problem. This isn’t going to shut the door.” Good old-fashioned policing and follow-the-money are still the most effective ways of catching the bad guys.
And widespread usage of strong encryption is not the worst scenario for the spies: harder to crack and harder to detect technologies are already either in existence or in development.
One such technology is steganography — hiding communications within digital images — and it’s incredibly hard to spot. Equally, quantum encryption could do away with the inherent weakness of the public key infrastructure systems used today and make messages impossible to intercept.
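Why steganography is so hard to spot becomes clearer in miniature. The sketch below is purely illustrative (real tools work on actual image formats and combine hiding with encryption): it tucks a message into the least-significant bit of each byte of raw pixel data, a change far too small to see.

```python
def hide(pixels: bytes, message: bytes) -> bytes:
    """Hide a message in the least-significant bit of each pixel byte."""
    # Unpack the message into individual bits, most significant first.
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the LSB
    return bytes(out)

def reveal(pixels: bytes, length: int) -> bytes:
    """Recover `length` hidden bytes from the pixel LSBs."""
    bits = [p & 1 for p in pixels[:length * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[j:j + 8]))
        for j in range(0, len(bits), 8)
    )

cover = bytes(range(256)) * 4  # stand-in for raw image data
stego = hide(cover, b"meet at dawn")
assert reveal(stego, len(b"meet at dawn")) == b"meet at dawn"
```

Every byte of the "image" changes by at most one shade out of 256, which is why an observer who doesn't know where to look sees nothing unusual at all.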
Still, even the experts don’t really know how the future of encryption will play out: there is apparently no way of reconciling the intelligence agencies’ desire to access the data they want with the safe and secure working of the web as we know it.
They are mutually exclusive, and mutually antagonistic. Like the best encryption, the problem of making national security and privacy work together seems uncrackable.
“Many of us agree with the sentiment — I am one of them — that from a security perspective you don’t want people who would do you harm being able to talk in secret. But at the same time if your answer to that is to ban encryption, that is a very bad way; the technology is not good or evil, it is the people using it,” said the University of Surrey’s Woodward.
Technology is unlikely to offer a way out of this impasse. As the power of supercomputers (or more likely giant cloud arrays) continues to grow, it’s easy enough to increase the size of the key — from 512, to 1024, to 2048 bits and onwards.
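The arithmetic behind that arms race is simple to sketch. Treating key length as raw brute-force search space (strictly accurate for symmetric keys; RSA keys of the sizes above resist factoring instead, so their effective strength grows more slowly), each extra bit doubles the attacker’s work:

```python
def keyspace(bits: int) -> int:
    """Number of possible keys for a given key length in bits."""
    return 2 ** bits

# One extra bit doubles the brute-force search space.
assert keyspace(513) == 2 * keyspace(512)

for bits in (512, 1024, 2048):
    # 2**n has roughly n * log10(2), i.e. about 0.3 * n, decimal digits.
    digits = len(str(keyspace(bits)))
    print(f"{bits}-bit key: about 10^{digits - 1} possible keys")
```

Defenders therefore only need linear effort (a longer key) to impose exponential cost on a brute-force attacker, which is why raw computing power alone never wins this race.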
Even if quantum computers — long touted as a way of cracking all encryption almost immediately — become widespread, the reality is that, although they would undermine encryption in one way, they would also boost it again (thanks to something called quantum key distribution). And as Woodward notes: “we’ve been talking about viable quantum computers since the 80s and they’re always 10 years away.”
But the stakes may continue to rise, at least from a certain point of view.
“The security of our common computing infrastructure is even more important now than it was back then. Back in the 1990s, the reason we won was because every economy wanted to be the best marketplace for ecommerce on the planet so they knew they could not put constraints on security technology if they wanted to enable all that ecommerce,” said Privacy International’s Hosein.
And soon those issues of privacy and security will become as concrete as the buildings we live in. With the advent of smart grids, the internet of things and smart cities, we will be using the web to monitor and control real-world systems. “If we can’t secure these things, then people will die,” he warns.
This also raises another issue: as our houses and even clothes are filled with sensors, what sort of privacy is appropriate? Is it right that we should be snooped on through our smart TV or networked baby monitor, or our webcams or smartwatches? Can we draw a line anywhere?
When President Obama was asked about the issue of encryption his response was nuanced. While he said he supported strong encryption he also noted: “The first time an attack takes place and it turns out that we had a lead and we couldn’t follow up on it, the public is going to demand answers, and so this is a public conversation that we should end up having.”
It’s entirely possible to argue that we don’t need another public debate about encryption: that we had one back in the 1990s, and that privacy trumped national security when it came to the use of strong encryption. It’s just that the intelligence services didn’t like the answer.
But there are plenty of good reasons why we do need to go over the arguments about encryption again.
Back in the 1990s and 2000s, encryption was a complicated, minority interest. Now it is becoming easy and mainstream, not just for authenticating transactions but for encrypting data and communications.
Back then, it was also mostly a US debate because that was where most strong encryption was developed. But that’s no longer the case: encryption software can be written anywhere and by anyone, which means no one country can dictate global policy anymore.
Consider this: the right to privacy has long been considered a qualified rather than an absolute right — one that can be infringed, for example, on the grounds of public safety, or to prevent a crime, or in the interests of national security. Few would agree that criminals or terrorists have the right to plot in secret.
What the widespread use of strong, well-implemented encryption does is promote privacy to an absolute right. If you have encrypted a hard drive or a smartphone correctly, it cannot be unscrambled (or at least not for a few hundred thousand years).
At a keystroke, it makes absolute privacy a reality, and thus rewrites one of the fundamental rules by which societies have been organised. No wonder the intelligence services have been scrambling to tackle our deliberately scrambled communications.
And our fear of crime — terrorism in particular — has created another issue. We have demanded that the intelligence services and law enforcement try to reduce the risk of attack, and have accepted that they will gradually chip away at privacy in order to do that.
However, what we haven’t managed as a society is to decide what is an acceptable level of risk that such terrible acts might occur. Without that understanding of what constitutes an acceptable level of risk, any reduction in our privacy or civil liberties — whether breaking encryption or mass surveillance — becomes palatable.
The point is often made that cars kill people and yet we still drive. We need to have a better discussion about what is an acceptable level of safety that we as a society require, and what is the impact on our privacy as a result.
As the University of Surrey’s Woodward notes: “Some of these things one might have to accept. Unfortunately there might not be any easy way around it, without the horrible unintended consequences. You make your enemies less safe but you also make your friends less safe by [attacking] encryption — and that is not a sensible thing to do.”
And while the US can no longer dictate policy on encryption, it could be the one to take a lead which others can follow.
White House cybersecurity coordinator Michael Daniel recently argued that, as governments and societies are still wrestling with the issue of encryption, the US should come up with the policies and processes and “the philosophical underpinnings of what we want to do as a society with this so we can make the argument for that around the planet… to say, this is how free societies should come at this.”
But he doesn’t underestimate the scale of the problem, either. Speaking at an event organised by the Information Technology and Innovation Foundation, he said: “Working at the White House, we don’t get easy problems, easy problems get solved someplace else, they don’t come to us. This is one of the hardest problems I know about, certainly that’s anywhere close to my job. And I think it’s clearly not one that’s going to be resolved easily, simply or quickly.”
Which brings us back to those civil war codenames, Bullrun and Edgehill, which may serve as an inadvertent, gloomy prophecy about the future effectiveness of the intelligence agencies, unless we have a better discussion about how security and privacy can work together online.
If not, it’s worth remembering that the Cavaliers and the Confederates both won the first battles of the English and American civil wars, yet both finally lost their bloody and divisive conflicts. Perhaps, after a few early victories in the new crypto war, the intelligence agencies may face a similar defeat, outpaced by encryption in the long term.
It may be that in a few decades, the spies look back at the tribulations of the first and second crypto wars with something approaching nostalgia.
Editor’s note:
Steve Ranger, “The undercover war on your internet secrets: How online surveillance cracked our trust in the web,” TechRepublic, 10 June 2016.