
Transcript: Edward Snowden speaks at first-annual K(NO)W Identity Conference

May 15, 2017

Edward Snowden spoke at One World Identity’s K(NO)W Identity Conference on May 15, 2017. The following is a full transcript of his remarks, in conversation with Manoush Zomorodi of WNYC.

Manoush Zomorodi: [00:00:01] Hey everybody, it’s great to see you here. I’m going to lead this conversation; as you can see, the fireside is metaphorical. But I will point out that you are being beamed into a federal building two blocks from the White House. I think I’m here as a journalist but also representing sort of the everywoman, the everyman, right, the person who is going to buy your products, the person who is posting on your platforms and living by the laws that we are trying to figure out as we spend more and more time living virtually. So I want to start out, Ed, with the situation that I think is helping regular people see just how important their personal information is. And that is the global cyber attacks that have now hit over 200,000 computers in 150 countries…you recently said that we are dealing with the greatest security crisis in history, and that was before the attacks. What went through your mind when you heard about…when you first heard about what had happened?

Edward Snowden: [00:01:06] Yeah, so…it’s hard being right, in the worst possible ways, right, it’s an uncomfortable feeling. There’s that sort of natural inclination to be like ‘ugh why didn’t they listen,’ but at the same time there’s a natural understanding that there’s momentum, there’s inertia that exists in all of our institutions, in all of our policies, and particularly when it comes to how we allocate our budgets, where we put our spending towards. Now for those of you who haven’t heard the news yet—maybe you’re not working in cyber security, lucky for you this weekend—what we’re talking about is an extraordinary story, it’s never happened before in quite this way. All the media outlets are following up on it, and it’s basically a perfect storm of all of the problems that everyone’s been warning about for years now.

We have a US National Security Agency, which is sort of our state surveillance bureau in the United States, that is supposed to be aimed externally, towards foreign adversaries, military spies, terrorists…but because of changes in the politics that happened at the point of the Bush White House and September 11th, they started looking inside the country too. This is called mass surveillance. I’m not sure who wrote the introduction, but for the record I’ve never admitted to stealing valuable government material. I do happily admit having copied evidence of serious crimes and providing that to journalists, but I would argue that’s not quite the same thing. But when we look at what sort of happened here, why this is so impactful: look, the NSA, they’ve done a lot of harm to America’s rights, to Internet security broadly, but no one pretends that this is their intention. Right, nobody’s trying to burn down the NSA, I don’t advocate this, we’re not saying these people are criminals. What we’re saying is that good people often do bad things for what they believe are good reasons. It’s very easy to make mistakes here.

So how did we get to this point, where malicious hackers are shutting down hospitals? And since the hospital story broke it’s spread: we now have railway stations that have had their terminals infected, automobile manufacturing plants in France have been shut down, FedEx in the United States has been impacted by this. And it’s actually getting worse rather than better, as more variants that are more difficult to combat are being spread around the world. So this is really the central issue. Microsoft has come out and—for the first time, this is fairly unprecedented, actually it’s totally unprecedented—fingered the NSA.

Journalists, newspapers that are sort of talking about this issue typically use language that provides the NSA a little bit of a fig leaf here and goes ‘cyber attack tools or digital weapons allegedly stolen from the NSA.’ And there’s one organization in the world that is not a government agency that knows whether or not these tools in fact originated with the NSA. And that is Microsoft. Now Microsoft’s president and chief lawyer just made an extraordinary statement last night. I’ll read from it very briefly where he goes look these were in fact taken from the NSA. They are now being used against customers around the world. And this demonstrates something that is…we all need to pay attention to. Because no matter who’s at fault, no matter who is responsible here, we have to fix the problem, because it’s getting worse. They said this attack provides yet another example of why stockpiling computer vulnerabilities by governments is such a problem. This is happening around the world, this is not just in the United States.

This is an emerging pattern in 2017. We’ve seen vulnerabilities stored by the Central Intelligence Agency—right, these are top secret, classified, they’re on air-gapped networks, they’re not connected to the internet—show up on WikiLeaks. And now this vulnerability that’s ransacking the world, stolen from the NSA, is affecting customers no matter their nationality. Repeatedly, exploits in the hands of governments have been leaked into the public domain and have caused widespread damage. An equivalent scenario to what we’re seeing happening today would be conventional weapons, produced and held by the US military, being stolen, such as Tomahawk missiles. And this most recent attack represents a completely unintended but disconcerting link between the two most serious forms of cyber security threats in the world today: those by nation-state actors, and those by organized criminal groups. It’s hard to think of a circumstance in which the problems are more clear, which raises the question: ‘How did we arrive here?’

And we’re fortunate, because just a few months ago, for the first time that I’ve ever seen publicly (and I can say this with some authority, having worked at the NSA and the CIA, and of course working now for the public in opposition), the National Security Agency’s Deputy Director, Richard Ledgett, who just retired, confirmed that NSA’s cyber security spending is 90% dedicated to offensive operations. This is the central problem that we’re seeing around the Internet today, again and again, but it should not be this way in the United States, in an agency whose name—the National Security Agency—implies a focus on security, on defense, rather than offense.

Manoush Zomorodi: [00:07:04] OK, I want to ask you, though, how do we reframe this conversation? Because we, you know, the people, are hearing: well, I’m sorry, if you want security, your privacy has to go just a little bit; it is the price we pay to keep this country safe. Is this a false tradeoff? I’ve heard you say it is, but is there a better way to frame the relationship, then, between privacy and security?

Edward Snowden: [00:07:29] It is. So this is one of the most popular talking points that we see coming from those who are kind of apologizing for the harms of these programs and saying live with them, and saying “look, if you want to have any of your rights, the first and most important thing that you ever need is security.” But they’re not really talking about security; they’re talking about surveillance. And when we actually evaluate the usefulness of these programs, for example those that were revealed in 2013…in June of 2013, when I came forward and these things were first revealed, the President of the United States said exactly this. He said, look, we’ve drawn the right balance between your rights and our need to surveil, between security and privacy. But later in the year, facing continued criticism and more stories that came out showing that these programs weren’t really effective, he appointed—this is to his credit—two independent commissions to review what’s actually going on with these programs. They had total access to classified information, included people who are no friend to civil liberties, such as former Deputy Director of the Central Intelligence Agency Michael Morell, and they asked: do these programs work, are they legal, should they continue? And the reports of those committees said, in fact, that they had never identified a single instance involving a threat to the United States in which these kinds of things—the telephony mass surveillance program—had made a concrete difference in the outcome of a terrorism investigation.

[00:09:04] Moreover they had never found a place where it even contributed to an investigation in a concrete and unique sense. There were, of course, edge cases where maybe it was interesting or helpful in some way. But it was small, and they said it could have been achieved through traditional means that we already had that were less intrusive. Now when we talk about this reframing and why it matters, the central point that people need to understand is this has never been a conversation about privacy vs. security. Because privacy and security improve together. Right, they are actually tied to each other. When one is reduced, the other is reduced. Surveillance and privacy are the contradictory factors. When surveillance increases, privacy decreases. And unfortunately…when surveillance increases security typically decreases. Now that might not seem obvious at first glance, but when you think about how surveillance actually functions it becomes quite clear, particularly in the computer security context. Surveillance operates by observing, witnessing and exploiting vulnerabilities, right. Whether that’s you walking out on the street where you can be observed, rather than within the four walls of your home, that’s exploiting a property, right, where you are insecure, and using that for the interests of whoever sort of runs the surveillance thing.

[00:10:34] Now when we think about the Internet and Internet surveillance, this is particularly problematic, because internet surveillance works the same way: communications that are being transmitted unencrypted, or electronically naked, unprotected as they cross the internet, can be observed, they can be captured. Whether it’s by the criminal sitting next to you in Starbucks who is on your local wireless network sniffing communications that are going over the air, whether it’s telecommunications providers, who are seeing it as it crosses the switching points and then heads on to Facebook, whether it’s Facebook itself, mining these and then selling your data to advertisers, making it available however they want, or whether it’s these governments themselves. Maybe you trust the National Security Agency, maybe you think they are the champion of truth and justice in the enlightened world, that’s okay, [but] recognize that the Russian NSA is doing the same thing, the Chinese NSA is doing the same thing, the French, the German, you know, the Brazilian…this is happening around the world. And in a borderless network, right, we need to be focused on security, on defensive measures, more than we are focused on these offensive benefits of surveillance. Because when you cut those corners, when you focus exclusively on being able to watch people, on being able to attack adversaries, on being able to spy on people of interest, what you’re doing is keeping those doors open that allow your adversaries to attack you in the same way. And this is precisely what Microsoft alleges the NSA did that led to the ransomware attacks of this weekend. They knew about this flaw—the National Security Agency—in US software, US infrastructure, hospitals around the world, these auto plants and so on and so forth, but they did not report it to Microsoft until after the NSA learned that that flaw had been stolen by some outside group, right?
We still don’t know the identity of the people who actually did this. But the problem is, had the NSA not waited until outside groups already had this exploit to tell Microsoft, so Microsoft could begin the patch cycle, but instead told Microsoft when the NSA first learned of this critical vulnerability, we would have had years to prepare hospital networks for this attack, rather than a month or two, which is what we actually ended up with.

Manoush Zomorodi: [00:13:00] OK, so I want to drill down. We’ve mentioned security, we’ve mentioned privacy, but let’s talk about this word identity. You know, this conference is called K(NO)W Identity. What in your mind constitutes identity? I mean, Facebook wants us to merge all of our identities into one single account. What does it kind of mean to you?

Edward Snowden: [00:13:22] Yes, so there are a lot of ways to look at this. The way that I think matters the most for computer-based networks is that we’re talking about an identifier, right, but we don’t want to use the word “identity” to define what it is. So we want to start talking about a name, right, some kind of authentication token, some kind of credential that can be used to assert a need or a desire to participate in something. Whether it’s wanting to connect to a network so you can share your voice, if this is a social network, or wanting to be able to engage in trade, whether it’s buying, selling, or transferring value, whether this is in a financial institution or something else online. For those who are interested in, like, the blockchain style of things, these are a cryptographic identifier: some kind of key that says I can spend this money, where I possess this sort of, you know, cryptocurrency or whatever, and I want to send it to this person or the other. It’s that critical point of going: this is mine, or this is me. Identity is about the self, about distinguishing this actor from that actor from different perspectives.
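The idea of a cryptographic identifier, a key whose possession lets its holder assert “this is mine” or “this is me” in a way anyone can check, can be sketched minimally. This example is not from the talk: it uses a Lamport one-time signature, chosen because it needs nothing but a hash function; real cryptocurrencies use different signature schemes, and the message here is purely illustrative.

```python
import hashlib
import secrets

def keygen():
    # Private key: 256 pairs of random secrets (one pair per digest bit).
    # Public key: the hashes of those secrets.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def bits_of(msg):
    digest = hashlib.sha256(msg).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg, sk):
    # Reveal one secret per bit of the message digest; only the key's
    # holder can do this, which is what asserts "this is me".
    return [sk[i][b] for i, b in enumerate(bits_of(msg))]

def verify(msg, sig, pk):
    # Anyone holding only the public identifier can check the assertion.
    return all(hashlib.sha256(sig[i]).digest() == pk[i][b]
               for i, b in enumerate(bits_of(msg)))

sk, pk = keygen()
sig = sign(b"send 1 coin to this address", sk)   # hypothetical message
print(verify(b"send 1 coin to this address", sig, pk))   # True
print(verify(b"send 9 coins elsewhere", sig, pk))        # False
```

Note that a Lamport key is strictly one-time: signing a second message with the same private key reveals enough secrets to enable forgery, which is one reason practical systems use other schemes.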

Manoush Zomorodi: [00:14:48] I would say, though, that that’s a very sort of American take on the self, this idea that the individual is paramount. In some of the other countries that you mentioned, I’m thinking of China, the self is not paramount. It is about community. It is about cohesion. How do we in this room start to think more globally about what identity is when we are so different, when we think so differently about what identity means to our society?

Edward Snowden: [00:15:16] Well, there are two ways of looking at this. One is you want to be open-minded and you want to look at other perspectives in the world, but there’s another side which goes: do we want to emulate China? Perhaps they have more people, right, and perhaps when we do a poll about what’s popular in the world, the Chinese model would be reflected as more popular than the American simply because they’ve got more people, right. But even if that’s not the case, even if France, Germany, you know, whoever you look up to, says this is the right model, then we’re not focusing on what people want for themselves; we’re focusing on what some institution wants for them, what some authority wants for them. Does that make it right? And this is where I would argue that something being popular is not the same as something being moral, just as something being lawful is not the same as something being moral. These are distinct concepts. But yes, there’s a lot to think about here. Most of the problems that we face today in the identity space are derived from what I would argue are structural flaws. We’re looking at top-down models, where very few institutions are sort of saying: your identity is this token that we’ve given you, and every other sort of group must rely on our decisions, right – whether this is your driver’s license, whether this is your passport – or they say all of your other identifiers, like your bank account and whatnot, must be seeded from this initial token that we’ve granted. But is that really the case?

And does it have to be that way?

Manoush Zomorodi: [00:16:55] So what technologies are you looking at? Like, speeding forward into the future, where do you think the most promise lies? Technology is part of the problem right now, but where can it be part of the solution as well?

Edward Snowden: [00:17:08] So the idea is that right now businesses are having to invest tremendous amounts of resources into things that don’t really align with their core mission, right, whether we’re talking about “know your customer” laws or other kinds of financial regulations that say: look, you guys have to do this. You have no choice. You’re going to engage in our process. We’re going to deputize you to do our work. And it’s simply accepted that that’s the way it must be. But this is one jurisdiction, one place, that has these things, and then we start to see it spread to other jurisdictions. The EU has their own set of rules. And then we’ve got similar ones again in other, less familiar parts of the world, places like China, where they’re actually starting to score citizens and assign them things the same way that we do credit ratings. And this is where we need to start doing heavy thinking about the fact that, well, when we look at the history of governments throughout all of civilization, they’ve tended to fail unpredictably and with very harsh results, again and again and again, even the most well-intentioned of them. The least well-intentioned of them tend to do even more poorly. But what does this mean? It means we need to focus on enabling people to do what they actually need to be able to do, while also providing some capability for those who are in a position of power, of influence, of privilege in these platforms, right, whether we’re talking about trade or whether we’re talking about communication, to move things in the right direction.

[00:18:58] If it’s a private platform, this can simply be in line with their corporate values, right, whatever they think their mission is, whatever they think is the best thing for the world as they see it, because they are a private institution. Right. When we’re talking about public institutions, then we’re starting to talk about laws and regulations, statutes, and what the public will bear. But we also need to think, I think centrally, about getting back to that mission of enabling people to interact across distance to achieve goals that are net positive for the world.

[00:19:32] The sharing of beliefs is a great example of this. And we’ve seen ways that this can go wrong, with things like ISIS videos and whatnot being splashed over Facebook. But we’ve also seen the wrong responses to them. Facebook and many other organizations around the world are now saying: well, we are going to become the censor and start making decisions about what can and cannot be said. Now look, there is no debate about the fact that we don’t want beheadings and jihadist propaganda spread around social networks, but are social networks the ones that should be policing this? Are they the ones that are best placed to decide this? Probably not. This is why we have courts. This is why we have law enforcement. This is why we have intelligence agencies. They are the ones who are actually invested with the capabilities, by the people, by the public, to make decisions that are fundamentally violations of rights. Right. If you are censoring someone, even if it’s jihadist propaganda, that is impinging on their freedom of speech. But is it justified, right, based on what’s going on?

[00:20:48] Courts and police are the ones who traditionally make these most difficult decisions, because they have processes, they have accountability; they can be held to a very high standard. When a private company does this, even if Facebook starts in the best way, in the most careful way, when we look at it 10 years down the road, 20 years down the road, it’s going to look very different.

Manoush Zomorodi: [00:21:12] So in terms of the regulatory side of this, are the Europeans the best model we have right now? You’re talking about privacy laws; they are the strongest in Europe. They’re talking about fining companies like Facebook for hate speech and other things, and they also have the strongest data protection laws. Is the EU where we need to be looking?

Edward Snowden: [00:21:33] So this is again where we’re going into a complicated space, because the answer is yes and no. Right, they do better in some places, we do better in others. There are… nobody has it right when we’re talking about government regulation here. If we want to look at things in the most freedom-preserving ways, the technological bases that we’ve seen in the past are probably the best way to look at this. What we’re really looking for here is a reputation space. What vendors typically need is not a true identity. We might feel that we have to have one now, because these “know your customer” regulations might be kind of imposed upon us, but to make your true business decisions, you don’t actually need to know someone’s name. You need to be able to get something to them. You don’t need to know their precise age. You need to know whether they’re old enough to engage in a certain kind of transaction. You need some level of knowledge, but you don’t need precise knowledge. You don’t need an overabundance of knowledge. And these kinds of things have been proposed to be provided through attestations. In the 1990s there was a model for an encryption protocol that was called “the web of trust.” And the idea here was that everybody who had an established identifier in the system could then say things about themselves, they could communicate, but they could also sign things in a mathematically verifiable way, saying, “I am this person.” But you might not be inclined as a business to trust that this person is who they say they are.

[00:23:18] But what if another person, who is known to you, a former customer who you know is legitimate, has also said this person is who they say they are. And then another person does, and another. There’s no central authority that’s involved at any point in this process. But eventually you get a level of confidence that is enough to do your business. And this can happen automatically, programmatically, in a frictionless way, if we build systems that go in this direction. We don’t have them today. The web of trust model was a very crude first draft that didn’t function at scale. But the idea here is we should not be looking to specific institutions to be policing these things. We should be looking at a wider, broader, more reliable social fabric, right, a decentralized system where there aren’t these single points of failure. So you might be thinking about this in the context of the United States government right now, and that would be sort of a mistake. I think you might be going: well look, the U.S. government issues passports. I can rely on those. We’ve got a stable system; you know, like it or not politically, it’s pretty predictable. If the government gives a passport to somebody, that’s good enough. But then if you think about recent history and what we’ve actually seen in the world, there are a lot of problems with that. One of which is: how does someone present a passport to your business? Today, even for “know your customer” purposes, right, banks accept digital scans of these identity documents that are transmitted over email. An hour in Photoshop will get you a new passport that doesn’t conflict with anyone else’s, that’s got the individual’s face on it.
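The accumulation of confidence from independent attesters described here can be sketched in a few lines. This is an illustrative toy, not how PGP’s web of trust actually scores keys; all names, the per-attester weight, and the acceptance threshold are hypothetical choices.

```python
def confident(subject, attestations, trusted, weight=0.5, threshold=0.8):
    """Accept `subject` once enough attesters we already trust have vouched.
    No central authority is consulted at any point in this process."""
    score = 0.0
    for attester in attestations.get(subject, []):
        if attester in trusted:
            # Each additional trusted voucher adds diminishing confidence,
            # so score approaches but never exceeds 1.0.
            score += (1.0 - score) * weight
    return score >= threshold

# "alice" is vouched for by three known-legitimate former customers
# (and by "mallory", whom we do not trust, so her vouching is ignored)
attestations = {"alice": ["bob", "carol", "dave", "mallory"]}
trusted = {"bob", "carol", "dave"}

print(confident("alice", attestations, trusted))   # True  (0.875 >= 0.8)
print(confident("alice", attestations, {"bob"}))   # False (0.5 < 0.8)
```

The design point matches the transcript: confidence emerges programmatically from many independent vouchers in the social fabric, with no single point of failure deciding who is legitimate.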

[00:25:03] It is impossible to tell it’s not legitimate if it’s professionally done in a quick and reliable manner. But what about, you go, well, I’ve got anti-fraud measures, and I’m comfortable my guys could detect a Photoshop job or whatever. What about legitimate passports that are being presented by groups that are not legitimate? This is something people haven’t seen yet, but it’s likely to increase in the next decade. It’s people who are from poorer countries, right, let’s say they’re in Eastern Europe, let’s say they’re in Russia or China, let’s say they’re in Africa. They will go into darknet markets. Right. These are sort of places where people engage in transactions that are not connected to their legal identity. Right. So there’s very little recourse here. And they will sell scans of themselves, with their face, holding up their passport, specifically for people in other countries who are not them to assume their identity and use it in trade. And beyond this, there are truly legitimate, yet at the same time fraudulent, identity documents that get out into circulation with more regularity than you might expect, and not for ordinary customers. Right. Not the guy who’s trying to do a chargeback scam on Amazon, but a terrorist who’s actually trying to move money. Right. That’s something that would actually have a high impact on society, that we want to avoid. In Syria, for example, due to all the warfare, passport control offices were captured by enemy forces.

[00:26:43] The very first thing that happens when they do this is they provide real passports, right, made with the same equipment that the government uses, to their insiders, right, to their spies and whatnot, under other names, other identities. But then once that’s been done, they start selling the leftovers to generate revenue for their movement, whether it’s rebellion or jihadist activity, whatever. The same thing happened in eastern Ukraine. And these are the kinds of things we need to start thinking about: relying on the fact that this institution gave this person this token is not as reliable as looking at a cloud of verifiers provided by the social fabric, which doesn’t actually require any particular institution at all. And in the future I think that will be more reliable than what we have today.

Manoush Zomorodi: [00:27:37] Is that blockchain that you’re thinking of? What could that even look like?

Edward Snowden: [00:27:42] Well, it can be a lot of different things. I’m not thinking about the blockchain specifically here, although the blockchain is good for a lot of reasons. It allows you to establish that something happened at a specific time and freeze it in a public ledger, right, where anybody can go: “all right, this person signed up at this time,” or “this person was at this bank at the time the bank said they were; they asserted these identity documents.” And it sort of crystallizes there. Frozen there forever. It’s not going to be lost unless the entire network across the world is lost, because it’s spread across jurisdictions, because it’s decentralized. That’s very unlikely to happen. But when we think about it more generally, what we’re talking about is looking at technical solutions where you can buy a bucket of Internet like you can buy a bottle of water.
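The “crystallized, frozen forever” property comes from each ledger entry committing to the hash of the entry before it, so rewriting old history breaks every later link. A minimal sketch (the records and function names are illustrative; a real blockchain adds consensus and replication across many machines, which is what makes tampering impractical rather than merely detectable):

```python
import hashlib
import json

def block_hash(record, prev):
    # Hash the record together with the previous block's hash.
    body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def add_block(chain, record):
    # Each block commits to the previous block's hash, chaining history.
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev,
                  "hash": block_hash(record, prev)})

def chain_valid(chain):
    # Re-derive every hash; any altered record or broken link fails.
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or b["hash"] != block_hash(b["record"], prev):
            return False
        prev = b["hash"]
    return True

chain = []
add_block(chain, "this person signed up at this time")
add_block(chain, "the bank asserted these identity documents")
print(chain_valid(chain))                 # True
chain[0]["record"] = "rewritten history"  # tamper with an old entry
print(chain_valid(chain))                 # False: every later link breaks
```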

[00:28:34] You don’t need to present a particular, you know, giant portfolio of documentation or things like that for transactions that are not high risk. You know, if you’re worried that somebody is buying a dual-use product that could be used in, you know, a nuclear proliferation process, yes, by all means, let’s do some identity verification here. Let’s look at a cloud of different indicators. Let’s cross-compare things like that. But if somebody is just trying to browse for stuff on Amazon, as long as they can assert that they have the funds to pay for this, and you know they’re legitimate because, for example, they go through an escrow service, it doesn’t really matter to you what their identity is. And it should not matter. We should have a digital economy that is much closer to a cash economy than it is today. These points of friction are one of the reasons that we’re having difficulty driving Internet growth in the economic space into the last 20 percent of the human population, who truly don’t even have identity documents. They shouldn’t need them.

Manoush Zomorodi: [00:29:46] Yeah, for the Privacy Paradox project we spoke to Sir Tim Berners-Lee, who’s working at MIT to sort of flip the business model, if you will, where we would each own our own personal information. And I want to go to some listener… or sorry, I’m not on a podcast, I’m in real life. I want to go to some audience questions that I’ve been given. So the first one: the Internet of Things of course promises to usher in an era of unprecedented connectivity and data generation. What incentives (market, legal, self-regulating) should be put in place for the Internet of Things? Today many devices, as we’ve discussed, are shipped with known vulnerabilities or insecure default settings.

Edward Snowden: [00:30:29] So this is one that is very controversial among technologists, because, you know, they don’t want regulation to be imposed in any way, shape, or form on what kind of products they can put out there. But the central problem that we’re facing with security around the Internet of Things, right, is that somebody will put out, you know, a baby monitor, a refrigerator, or a washing machine that is Internet-connected but doesn’t have a support chain. When weaknesses, when vulnerabilities like this sort of ransomware thing we see sweeping the world today, occur, they’re discovered. They become publicly known. There’s no way for the user to update their devices or even know that these devices are vulnerable, because the company has stopped producing them. They’ve stopped supporting them and just moved on to the next target, the next little widget that they want to cram the Internet into and then put in your home. The problem is they’ve broken the liability chain that exists in every other industry, in every other space, when we start looking at these things. If you sell a car to somebody and then after 10 years these cars, you know, right as soon as they hit their 10th birthday, spontaneously ignite and burn whoever’s sitting in them alive, that company is going to have some liability there. They’re going to have to either put out something to all of their service centers that says: look, recall these things, figure out who bought them, notify them, tell them to come in. Or they may have some legal liability if they’ve been negligent in the design of their products.

[00:32:12] The problem is, for the Internet of Things, these aren’t happening ten years later, these are happening 10 months later, and there’s no way to hold the manufacturers to account because there is no regulatory, statutory framework here at all. Now, is a statutory, regulatory framework correct? Is that the right way to go forward? We don’t know yet, but we do know that this lack of liability is the core distinction between this space and things like producing pharmaceuticals or hospital equipment that people’s lives rely on. If it’s a computer that’s sitting in a hospital that people’s lives are relying on, these same standards aren’t applied. And that’s really the central point.

Manoush Zomorodi: [00:32:54] So you touched on this, but do you think existing “know your customer” and anti-money-laundering regulations are effective tools in countering illegal actors, including drug and human trafficking, funding of terrorism, violence, et cetera?

Edward Snowden: [00:33:10] No, they’re not. They are perhaps helpful, perhaps useful, in some edge cases. Right. We do get dumb terrorists. We do get people who are wannabes, you know, who go out on Twitter, on Facebook, under their real name. And, you know, they say “death to the infidels” and “I want to shoot up a courthouse.” And then they go online on Amazon and they try to, you know, purchase the materials that are obviously for a bomb. And when you put all these things together, police can go, hmm. Yeah, but that’s incredibly rare, vanishingly rare, so rare in fact that we don’t have any good cases in public evidence of this happening that weren’t driven by the FBI, at least in the United States, right, where they had a covert informant who was in this person’s community, sort of held them by the hand and watched them along the path of radicalization until they got to the point where they could arrest them. However, we do have many cases in public evidence where these “know your customer” laws did precisely bupkis in preventing the flow of resources, whether we’re talking physical, whether we’re talking monetary, whether we’re talking informational, across the world. Because people can bypass these things if they have enough sophistication to operate at this level. And I hate to tell you, ladies and gentlemen, again, I worked at the NSA. I’ve read terrorists’ communications firsthand. And I’ve read Chinese military hackers’ communications firsthand. The people that we’re actually worried about are not the ones who are going to be held back by these kinds of regulations. Ordinary people? Yes. Petty criminals? Yes.

[00:34:58] Businesses and their agility? Yes. But when you start talking about terrorists, you have to understand the top tier of terrorist activity is fundamentally Darwinist in its approach. Osama bin Laden didn't stop using a cell phone because he read about it in the newspaper. He didn't stop using a cell phone because, you know, the Washington Post or CNN had a breaking news report. He stopped using his cell phone in 1998 because he noticed the hill from which he made a satellite call the day before was struck with a Tomahawk missile the next day. Right? And it doesn't take, you know, a startling act of genius to put two and two together. The bottom line is, dumb terrorists get weeded out of the system very quickly. Dumb criminals get weeded out of the system very quickly. And "know your customer" laws aren't going to be a reliable safeguard against the sort of societal threats that we're actually worried about. So we should be careful about engaging in a suspension of disbelief, in terms like, as long as the government says this has to happen, this is all we need, or that this is even effective. Now, this is a big business, right? And I don't mean this to say, you know, anybody's products or services here are completely valueless. Certainly these things all help in some way. But the bottom line here is not, can we stop this person from doing this in this case. You can't stop all criminals everywhere. If you could, we would have done it by now.

[00:36:47] The only world in which you can foreclose on entire avenues of human activity, even violent and even criminal activities, is a world in which everybody is sitting in a jail cell. Yes, it's very predictable. But it's not very free.

Manoush Zomorodi: [00:37:05] So this one is a little bit more personal, looking at the companies from the inside out. Today, private companies employ both physical surveillance, like security cameras, and digital surveillance, like logs and audit trails, of both employees and customers. This can help detect and thwart behavior such as assaults, or improper use of company data, like looking up your ex's records, for example. I feel like this person has a very specific question in mind.

[00:37:28] How do you think about privacy in this context?

Edward Snowden: [00:37:33] I think that was actually a reference there to what was called the LOVEINT scandal. The work of the NSA is called SIGINT, signals intelligence. And all of these mass surveillance tools we talked about, where all the phone calls that you make right now, all the e-mails you send, all of your Internet traffic, it goes into government databases. Now, the government says, and they're actually right about this in most cases, they're not listening to your phone calls. Right? They're not reading your e-mails. But they are collecting them. They are intercepting them and they are going in the bucket. I know this because I've seen it firsthand. Courts have affirmed that this is actually happening. Litigation is ongoing about this. That's not in controversy. The problem is, some people at places like the NSA, and we have more than 12 cases that have come out by now, that Congress has confirmed, is where they went, hmm, I could be looking at terrorists' communications today, or I could figure out if my romantic interest is talking to some other guy. And if you have records of all their phone calls, you can do that. When you start collecting everything that's happening on your network, across the Internet, that's happening with your customer base, right, even if these are wholly legitimate communications that are, and we're being generous here, collected for a wholly legitimate purpose, right, you're trying to counter threats, what you're actually producing here is what academics refer to as a database of ruin. When you create a database of people's activities, their private lives, private activities, that is rich enough, that reaches a certain level of granularity, you can find incriminating things in there about anyone. Right? And people eventually will succumb to temptation, no matter what the safeguards are, because somebody has to be authorized to access these records for them to be of use. And they will abuse them. So the question becomes, what do you truly need to collect? What is absolutely vital and necessary to collect, and, not if but when those records are abused, what will you do about it?
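[That last point, that someone must be authorized to access retained records and that abuse is then a matter of when, not if, can be made concrete. The following is an editorial sketch in Python, not anything Snowden describes; all names are hypothetical. The idea is simply that any store of sensitive records should make every read leave an append-only trace, so that LOVEINT-style misuse is at least detectable after the fact.]

```python
import datetime

class AuditedStore:
    """Record store that logs every read, so later misuse can be detected."""

    def __init__(self):
        self._records = {}   # record_id -> data
        self.audit_log = []  # one entry per access, kept append-only

    def put(self, record_id, data):
        self._records[record_id] = data

    def get(self, record_id, analyst, justification):
        # Require a stated justification for every access and log it;
        # reviewers can later compare justifications against actual use.
        self.audit_log.append({
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "analyst": analyst,
            "record": record_id,
            "justification": justification,
        })
        return self._records.get(record_id)

store = AuditedStore()
store.put("call-123", {"from": "+1555...", "to": "+1555..."})
store.get("call-123", analyst="alice", justification="case #42")
print(len(store.audit_log))  # prints 1: every read leaves a trace
```

[An audit log does not prevent abuse, which is Snowden's point; it only answers his second question, "what will you do about it," by making abuse discoverable.]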

Manoush Zomorodi: [00:39:40] OK, so this one's a little bit... I love this question. With human-like AI and robots just around the corner, how do you think identities for this type of AI should be handled? Should their identities, and connected privacy and "civil rights," be closer to the simple identities usually associated with Internet of Things devices, or closer to the complex digital identities associated with humans? Are you thinking about this stuff, Ed?

Edward Snowden: [00:40:08] Yeah, I mean, I'll be honest, that's not the kind of thing that I spend all day thinking about, because unfortunately I have to live a little bit closer to the now in my daily work for reform. But the idea centrally here is, look, we've got human identities, we've got machine identities. When I worked at the NSA, right, there were two broad classes of analysis work that happened in mass surveillance. There were what were called persona analysts. This is the easy job, right? Anybody can be taught to do it. And this is where you basically get assigned to read a terrorist's Facebook account. You look through all their Gmail all day, see what they're doing online all day. You're tracking people. It's not that hard to read somebody's activities if you have the language skills, right? On the other hand, there's a much more complicated side, which we call infrastructure analysis, where instead of looking at people, you look at devices. You go, we see that this defense contractor or hospital or whatever got hacked. We see all of these other communications around it, all these other computers that were in contact with it at the time. Which one was the bad one? Which one was the one that actually launched the attack? And this is, you know, a very technically complex space. But the idea here is that you have to be able to figure out, you know, which of these machines doing what was the bad actor. And you can't always do that in an easy way, because computers don't have license plates. Right?

[00:41:40] People like to analogize IP addresses, you know, Internet Protocol addresses. Sort of, each house has one of these, on the cable modem, that all the, you know, iPhones or whatever in the house share when they go out over the Internet. But it's not a direct proxy, because every device in the house, or on the college campus for example, is sharing that same IP address when it goes out across the network. So now you know, all right, well, the attack came from this college, but which student was it? And that's where it starts getting really difficult, because there are no identifiers. So instead you have to work in a space where identity doesn't really exist. And you have to look instead for behaviors. You have to look for indicators of criminal activity. Basically, you've got to see the missile crossing the Internet and go, well, everybody else was throwing flowers, and some person was throwing a hand grenade. Let's go after the person who threw the hand grenade. And this is actually a useful proxy for people thinking about the identity space. If identity did not exist, how would you do your business? And this is actually the kind of world that we want to move closer to. Whether we're talking about artificial intelligence, whether we're talking about washing machines that don't have identities, we can survive that world. We can survive a world, and we have throughout history, where identity was never a component in the vast majority of commercial transactions. If we rewind a hundred years, if we rewind a thousand years, you know, when somebody went to the local merchant, when somebody went to the bank, they were either known because of social ties, because of their presence in the community, or they were not known at all, and they were still able to engage in trade.
We have to remember that, no matter what, there are real threats in the world, and we should all pay attention to them, and we should all do our best to do what we can to mitigate them, to counter them. But most transactions are legitimate. Most transactions are by ordinary people trying to do ordinary things, and they should be enabled. If we hamstring everybody, if instead of greasing the skids of commerce and participation and communication, we lay down a road with tacks and lava and crocodiles and say only people who are able to dance magnificently across the backs of these crocodiles without getting bitten, the sort of privileged and capable of the world, who have their identity documents in perfect order and whatnot, can pass, we will be creating a world that is not just less free, but a world that's fundamentally less fair, and ultimately a world that is less capable, because there are fewer people who are able to participate in it.
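[The shared-address point at the top of that answer, that an IP address identifies a gateway rather than a person, can be illustrated with an editorial sketch in Python; this toy NAT model and all its names are hypothetical, not something from the talk. Many devices behind one router share a single public address, so the remote side sees the same address no matter which device acted.]

```python
import random

class NatGateway:
    """Toy NAT: maps many internal devices onto one public IP address."""

    def __init__(self, public_ip):
        self.public_ip = public_ip
        self._port_map = {}  # public port -> internal device

    def send(self, device, payload):
        # Pick an ephemeral public port for this flow and remember
        # which internal device it belongs to (only the gateway knows).
        port = random.randint(49152, 65535)
        self._port_map[port] = device
        # The remote server only ever sees (public_ip, port, payload).
        return (self.public_ip, port, payload)

gw = NatGateway("203.0.113.7")
seen = {gw.send(dev, "hello")[0] for dev in ["laptop", "phone", "tv"]}
print(seen)  # every device appears as the same address: {'203.0.113.7'}
```

[This is why, as Snowden says, the outside observer can only ask "which student was it?" after the fact: the mapping from port back to device lives solely inside the gateway.]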

Manoush Zomorodi: [00:44:36] OK, I've got time for two more questions. One is very specific, and then the latter is a little bit more philosophical. This is an audience question: when it comes to balancing privacy and security, what is your number one advice to startup companies? And I would add, what is your number one advice to their customers?

Edward Snowden: [00:44:55] I think the key here would be to focus on, do you need identity? What is the bare minimum you must collect? Because increasingly, holding more information is becoming more of a liability than it is an asset. Start thinking about other models where you can tokenize identity, where it's sort of a little bit more like bearer bonds used to be in the last century. Somebody has a token of value; if that is a spendable token of value, they can spend it. If they're dealing with something that's a particularly sensitive transaction, maybe you need to look at this in a more careful way. But for the vast majority of interactions, do you need to be assessing identity at all? Do you need to be looking at how to exclude people? Or should you instead be taking a more positive approach to this? How can you enable more people to become your customers? The fewer requirements that you can impose on them, that you must impose on them, and the smaller the amount of surveillance and retention requirements that you have to impose on yourself as a business, requirements that are, you know, at best parallel to your business mission, the more efficient you can be. If you have this enormous, you know, data center that you've got to run, or you have to build these commercial relationships with third-party identity providers that are not related to your business, you're incurring costs without benefit.
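[The bearer-token idea in that answer might look like the following editorial sketch in Python; the design and names are hypothetical, not a system Snowden specifies. The issuer stores only an opaque random token and its remaining value, with no user table at all: possession of the token is the only credential checked.]

```python
import secrets

class TokenIssuer:
    """Issues bearer tokens: value is tied to the token, not to a person."""

    def __init__(self):
        self._balances = {}  # token -> remaining value; note: no user table

    def issue(self, value):
        token = secrets.token_hex(16)  # opaque, unguessable identifier
        self._balances[token] = value
        return token

    def spend(self, token, amount):
        # Possession of the token is the only credential checked.
        if self._balances.get(token, 0) < amount:
            return False
        self._balances[token] -= amount
        return True

issuer = TokenIssuer()
t = issuer.issue(10)
print(issuer.spend(t, 4))   # True: whoever holds the token can spend it
print(issuer.spend(t, 20))  # False: insufficient value, still no identity check
```

[Like a physical bearer bond, this minimizes what the business retains, at the cost that a stolen token is spendable by the thief, which is why Snowden carves out particularly sensitive transactions for more careful treatment.]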

Manoush Zomorodi: [00:46:38] And for the customers, then? Because my listeners are like, well, but the more personal information I give them, the better personalized product I get back. Like, Google works better because it knows me so well, and it's free.

Edward Snowden: [00:46:52] Well, the simple way to do that is just through opting in. Right? You don't start by collecting all of this from everybody. You demonstrate that that is a bargain that has value. And this is something that's prevalent, like, there's a lot of this stuff. I mean, let's be honest here. Everybody in the room is a professional, here at this conference for a reason. They've got these end user license agreements, the terms of service, and things like that. There's an "Oh yes, we collect data on you to make sure that we can improve our products and services for you." There are cases where that's true. But everybody in the room knows that in the vast majority of cases, that's a fig leaf. It's a legal fiction to say that, when really you want to collect data in order to monetize data, to try to make this, maybe, you know, an extra sort of source of revenue. Right or wrong, it's understandable, right, what they're trying to achieve there. But what we very rarely see is, in the onboarding process of an application or a web page or something like that, you start with the functionality that people actually want, that people come to the website to use. And then they've got the equivalent, on the site, of discoverable features. Right? Whether these are little in-page sort of things that are popped up there so people can see them, or it's just from people poking around or seeing friends do them, or communications, right, advertisements, either from the company or external things, that say, hey, did you know that if you do this, you know, tell us your location, you'll be able to get weather updates? Nobody does that nowadays. Instead they go, all right, let's snatch this person's IP address, see if we can geolocate it to a rough approximation, and then just show them the weather. It doesn't matter whether they need the weather or not. Now we have their location data, and maybe we can use that for something later on.
If we're being realistic here, we can understand the commercial incentives that drive that kind of behavior. But we should also be realistic that we're not fooling anybody. Guys, this is really not a moral or ethical activity.
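[The opt-in pattern Snowden contrasts with silently snatching an IP address might look like this editorial sketch in Python; the feature and names are hypothetical. The location-based feature activates only after an explicit grant, and nothing is stored otherwise.]

```python
class WeatherFeature:
    """Shows local weather only if the user has explicitly opted in."""

    def __init__(self):
        self._locations = {}  # user -> location, populated only on opt-in

    def opt_in(self, user, location):
        # The user volunteers a location in exchange for the feature:
        # the bargain is demonstrated, not imposed.
        self._locations[user] = location

    def forecast(self, user):
        loc = self._locations.get(user)
        if loc is None:
            # No silent fallback to IP geolocation: ask instead of snatch.
            return "Enable location to see local weather."
        return f"Weather for {loc}: sunny"

app = WeatherFeature()
print(app.forecast("dana"))      # prompt, not a guess about their location
app.opt_in("dana", "Reykjavik")
print(app.forecast("dana"))      # feature unlocked by consent
```

[The design choice is exactly the trade Snowden describes: the company holds location data only for users who judged the feature worth it, which also shrinks the liability of the data it retains.]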

Manoush Zomorodi: [00:48:54] OK, so, last question, on that very human note, as we sort of have last words. Everybody here in this room is going to go out, they're going to talk, they're going to tell each other what they're working on, and I guess I'm just sort of wondering, and this is sort of philosophical, what do you think the implications are for each of us psychologically if we don't figure out how to present our identity, protect our privacy, and deal with our personal information going forward?

Edward Snowden: [00:49:25] And this is a great question, because it gets into that central talking point that we've heard again and again, that actually underlies a lot of the "know your customer" stuff, which is, look, nobody should worry about this stuff; if you have nothing to hide, you have nothing to fear. But privacy isn't about something to hide. Privacy is about something to protect. It's about who you are, who you can be. It's about the ability to make a mistake without having it follow you for the rest of your life. If we create a world of structured, sort of authoritarian identity, where you're given this sort of magic piece of paper, or a chip in your driver's license or your credit card or whatever, and that solidifies, crystallizes the one identity that's everywhere and everything for all things, you'll have that kind of thing that people in high school used to threaten, where, you know, "this will go down on your permanent record." You truly will have a permanent record that will follow you, no matter what you do. A child that is born will not have the same benefits that you did, of making a mistake, of doing something embarrassing, of saying something stupid, that they were then able to move on from. We will create a permanent class of people who are stigmatized, either because they made a mistake, or because they made a bad choice, whether that's a financial choice or whatever. Maybe they even committed a crime. Right? This person went to jail. They were convicted of theft or whatever. But society typically likes to believe in things like, OK, you've paid off your debt to society. Now you can begin again.

[00:51:03] But when we have people that can be tracked on the basis of these records, and there's no way to live outside of this context, this chain of records, what we become is a quantified spider web of all of our worst decisions. And you know, this may be the most wonderful thing in the world for people who work in the insurance industry or something, but it's a very negative thing for a free and open society, because now everybody in the world will think twice before they even open their mouth, because they're wondering what that's going to look like in the database. That's a very, very dark future, ladies and gentlemen, and it's certainly not inevitable. One doesn't even need to say we're on the road to it right now, but we do hear a lot of conversations, we do hear a lot of rhetoric, that's starting to move people toward the edges of that. And if you hear that conversation starting, I think we need to just take a minute, reflect on that, and go, OK, that's something that we can do. But is it something that we should do?

Manoush Zomorodi: [00:52:18] Thanks Ed. Ed Snowden.

Edward Snowden: [00:52:19] Thank you very much.