Privacy by Redesign: A New Concept
Canadian Privacy Commissioner Promotes New Strategy

A new concept called Privacy by Redesign, developed by Dr. Ann Cavoukian, Privacy Commissioner of Ontario, Canada, looks to bring privacy into systems that are already developed. To do so, organizations need to examine how data is used, determine what is permissible and what isn't, and create a consent management system.
"How can we expand the notion of embedding these protections proactively into the system so that it automatically knows when to seek out additional consent," Cavoukian says in an interview with BankInfoSecurity.com's Tom Field [transcript below].
Looking at existing systems, Privacy by Redesign would focus on the areas where personally identifiable information is involved. Once sensitive information is located, automatic triggers need to be implemented to ensure that appropriate identity management protocols and use limitations are in place, restricting the data to the purposes for which it was originally collected.
The problem now, with implementing privacy into existing systems, or emerging systems for that matter, is attempting to get the entire organization on the same page, Cavoukian says. "You have to cut through this siloed thinking of we've got this department versus that department," she says.
Organizations currently operate in a divided environment, with departments working separately instead of together. Engineers, for example, don't have a grasp of privacy when developing systems, and aren't expected to. But if the dialogue were open from the beginning, allowing privacy professionals to offer their input, systems could be that much safer from the start, or in the "redesign" phase.
In this exclusive interview about global privacy trends and strategies, she discusses:
- The core tenets of Privacy by Design;
- The goals of Privacy by Redesign;
- What organizations can do today to improve their privacy posture.
Dr. Cavoukian is recognized as one of the leading privacy experts in the world. Noted for her seminal work on Privacy Enhancing Technologies (PETs) in 1995, her concept of Privacy by Design seeks to proactively embed privacy into the design specifications of information technology and accountable business practices, thereby achieving the strongest protection possible. In October, 2010, regulators from around the world gathered at the annual assembly of International Data Protection and Privacy Commissioners in Jerusalem, Israel, and unanimously passed a landmark Resolution recognizing Privacy by Design as an essential component of fundamental privacy protection. This was followed by the U.S. Federal Trade Commission's inclusion of Privacy by Design as one of its three recommended practices for protecting online privacy - a major validation of its significance.
TOM FIELD: As an introduction to our audience, would you tell us a little bit about yourself and your latest work, please?
ANN CAVOUKIAN: Certainly. We're going to be talking about Privacy by Design today, and this is something that I developed, a concept and methodology, back in the '90s. But it's really taken off in the last few years. First let me tell you what I do in my day job. I'm a privacy regulator, the privacy commissioner of Ontario, Canada, and I oversee compliance with both freedom of information and protection of privacy legislation. That applies to the public sector at the provincial (state) level and the municipal level. Also, all health information and medical data is captured under our acts, under PHIPA. You have HIPAA in the United States. We have PHIPA in Ontario.
What is Privacy by Design about as distinct from this legislation? With respect to the laws that I mentioned to you that I oversee compliance with, I do act as a regulator. I oversee compliance. I want to make sure that the government and healthcare people, hospitals, etc., follow these laws. And if they don't, you can file a complaint with me. We investigate. I have order-making power. It's very strong. I can make organizations do things. But that ultimately isn't my wish.
What I'm finding is increasingly, with the growth of Wi-Fi, wireless communications, social media and everything going online and morphing into the cloud, less and less is coming to my attention. If I see the tip of the iceberg in terms of actual data breaches that happen, I'll be lucky. I want to broaden my reach to address more and more privacy-related issues. And I thought the way you do that is not through a regulatory compliance protocol, where the harm arises and then someone complains, we investigate and we offer some system of redress through legislation. It's too little too late. I want to prevent the harm. How do you do that? You embed Privacy by Design into existing systems, business practices and network infrastructures. That's what it's all about.
Privacy by Design
FIELD: You're really the pioneer of this concept you introduced here, Privacy by Design. What would you say are the core tenets of this concept?

CAVOUKIAN: There are three core tenets, but I should tell you there are what we call the seven foundational principles of Privacy by Design, and I'll run through them in a minute. But there are three absolute essentials. One, you have to be proactive before the fact, not afterwards. It has to be embedded into the design of a new system. It can be a computer system, it can be a business practice or it can be a new protocol. Whatever it is, it's got to be part of the emerging system being developed as core functionality, not bolted on as an afterthought after the system or procedure has been developed. What I find is that rarely works effectively, and it costs much more to bolt on protections after the fact. Two, it has to be proactively embedded in design, ideally as the default setting. We know from the academic literature that whatever the default condition is, that condition rules 80 percent of the time. I want that default to be privacy. By default, I mean it's automatically available to the user without them having to ask for it. It's embedded; it's built into the system.
The third and possibly most important tenet of Privacy by Design is all about a positive sum, not zero sum functionality. By that I mean, traditionally, privacy has been viewed in a zero sum manner. It's privacy or security, privacy or business interests. It's always privacy versus this other interest. I've been doing this a long time, over 20 years, and I can tell you when you pit privacy against security, for example, it's never privacy that wins. It always falls off. Security leads, which is understandable. And let me be clear, security is essential to privacy. You cannot have privacy without strong security, but you can have the reverse. What I tell people is get rid of this dated zero sum model of one or the other. Substitute a positive sum paradigm, by which I mean you can have two positive interests working together, both increasing incrementally in terms of a positive presence in the system. I know from security technologists who I've spoken to and engineers, they've told me that the few times they've actually been tasked with building systems that have both privacy and security as core functionalities in the design of their system, it has invariably elevated the entire level of protection offered by that system, not one against the other. The one versus the other, it's a false dichotomy. It's an unnecessary tradeoff. Get rid of it. Substitute "and" instead of "versus." Give me privacy and security, privacy and marketing. You can always do both. So those are the three essentials.
Privacy by Redesign
FIELD: Now, I understand you have a new concept, Privacy by Redesign. What does that entail?

CAVOUKIAN: We just hatched that last year. Privacy by Design works best when you have a new, emerging system. You're starting from scratch, you're designing something and you're just starting off. It's easier to embed privacy into the system, because you're not working from an existing system. When you think of the smart grid and smart meters that are just beginning to be rolled out, that's an ideal candidate. In fact, we've done a ton of work in that area, in Canada, Europe and the United States. Emerging, nascent systems are ideal for Privacy by Design because you can actually embed privacy as a design feature.
What if the design features are already there? What if you have an existing system or a legacy system? Last year, I spoke in Phoenix. I spoke to American Express and they were wonderful and very interested in Privacy by Design. But they said, "You know commissioner, we're really interested in this, but as you know, most of our systems are already in existence. They're out there. How do we adapt Privacy by Design to existing or legacy systems?" I thought, that's an excellent question.
And that day I had meetings at Arizona State University, which has the first Privacy by Design research lab. I was meeting with the lead professor there, and I said, "Let's figure out how to do this for existing systems. How can we redesign existing systems to ensure that these essential three tenets of Privacy by Design can be incorporated into existing systems that are already on the ground operating?" We sat down and we hatched Privacy by Redesign, which is an extension of Privacy by Design, but its application is better suited for an existing system. And I should tell your audience, if any of you would like to come to a workshop that we're offering on Privacy by Redesign, the very first one we're offering is in Mexico City on November 1 in the morning. If you go to our website, you'll find some information on that. You can sign up. It's free of charge. But you have to get there.
FIELD: Give us a sense, when we're talking about existing systems in this marriage of security and privacy, what needs to be redesigned today to be most effective?
CAVOUKIAN: Think of all the existing systems that our offices use: they collect data, hopefully limit the uses of that data to what is permissible, and then need some kind of consent management system applied after the fact if additional secondary uses of the data are sought. These are some of the areas that we have to turn our minds to. How can we expand the notion of privacy as the default setting? How can we expand the notion of embedding these protections proactively into the system so that it automatically knows when to seek out additional consent from the user if additional secondary uses are being sought, especially by external third parties?
What we need are a few triggers in existing systems that aren't in existence today. And when these triggers are invoked, it in effect translates the system into a Privacy by Design system by automatically requesting additional consent management or additional identity management protocols, or whatever the case may be. We need these automatic triggers to be raised in existing systems that aren't there right now. That's some of the Privacy by Redesign.
And I should be clear. This only applies when you have what we call personally identifiable information, or PII. By that I mean any information that is linked to your name, address, social security number, driver's license, any unique identifier. Even if it's not directly linked, if through a process of data linkage it can be linked to your information, then that's the time that you really have to protect the data. That's when especially the Privacy by Redesign features would be looked at. When we look at existing systems, where are the areas where PII is involved? Then look at what the automatic triggers are to ensure that appropriate consent management takes place and appropriate use limitations are placed on the data to limit the use of the data to the initial purposes it was intended for.
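The "automatic trigger" Cavoukian describes can be sketched in code. The following is a minimal, hypothetical illustration - the field names, the `ConsentLedger` class and the `use_record` function are all invented for this sketch, not part of any system she references:

```python
# Sketch of a "Privacy by Redesign" trigger: any attempt to use a record
# containing PII for a purpose other than the one it was collected for
# raises a consent check instead of proceeding silently.
# All names here (PII_FIELDS, ConsentLedger, etc.) are illustrative only.

PII_FIELDS = {"name", "address", "ssn", "drivers_license"}

class ConsentRequired(Exception):
    """Raised when a secondary use needs fresh consent from the user."""

class ConsentLedger:
    def __init__(self):
        # Records (user_id, purpose) pairs the user has consented to.
        self._grants = set()

    def grant(self, user_id, purpose):
        self._grants.add((user_id, purpose))

    def has_consent(self, user_id, purpose):
        return (user_id, purpose) in self._grants

def use_record(record, purpose, ledger):
    """Allow use of a record only for its original purpose, or for a
    secondary purpose the user has explicitly consented to."""
    pii_present = PII_FIELDS & set(record)
    if pii_present and purpose != record["collected_for"]:
        if not ledger.has_consent(record["user_id"], purpose):
            raise ConsentRequired(
                f"Secondary use '{purpose}' needs consent for fields: "
                f"{sorted(pii_present)}")
    return f"record {record['user_id']} used for '{purpose}'"
```

The trigger here is simply the exception: the system cannot quietly drift into a secondary use of PII, because the consent check fires before the data is used.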
Privacy Obstacles
FIELD: You've done a great job outlining the objectives. What are some of the obstacles that organizations are apt to hit as they try to get there?

CAVOUKIAN: What I have to tell you is organizations, generally speaking, mean to do the right thing. There's no question. But what I've found is existing organizations tend to work in a siloed environment. By that I mean each department has its own area, and you work in this silo or that silo. Security works in security; engineering folks speak engineering language; computer folks speak computer; and then you've got the law department over here, and they speak legalese. Every group is a silo.
And the problem is with privacy you have to weave it throughout the entire organization in order for it to work effectively. It can't just be you develop the standards, you get the code written by the software engineers and then it goes to the standards group and then the business group and they sign off, and everybody's done it, and then you have privacy as an afterthought. It doesn't work. It doesn't work well, that's for sure. What those kinds of existing systems do is they lend themselves to unintended consequences because the engineers, through no fault of their own, have been tasked to develop a particular system, and they're going to do their best to give you that system. They haven't thought about privacy because no one asked them to build and embed privacy into the system. So they built it with a particular goal in mind that has nothing to do with privacy. And then the thing goes public and all of a sudden, boom, you've got this unintended consequence.
I'll give the example of Apple and the recent iPhone scare, where they learned that people's geolocation data could be discerned from the unique identifier linked to their mobile phone, which, when linked to other identifiers through your laptop or other device, makes it personally identifiable. All of a sudden you've got this unique identifier, which people say isn't personal information; it's just a unique identifier linked to this person's phone. Well excuse me, let's think this through. Can't the phone or the fact that it's yours be linked to that information? Of course it can. And we chronicled this in a paper that we just released, "Wi-Fi Positioning Systems: Beware of Unintended Consequences," precisely because the engineers and software designers who designed the system weren't asked to consult with any of the privacy people who could have told you in two seconds that this was going to be a real concern. So they designed a system and then out they go with the system. They roll it out and iPhones are great; everybody loves them. But they didn't think about the possible implications of accessing people's geolocation data that could be rendered identifiable.
These are some of the areas where people have to work much more globally across the entire organization. You have to cut through this siloed thinking of we've got this department versus that department. And they don't talk to each other until the product goes to market, and then you've got a data breach and the public goes crazy. This will impact your brand. It will impact your business practices. It will lead to lawsuits and class action lawsuits and it will cost a fortune. Avoid all of that. Avoid the harm by embedding Privacy by Design from the get-go, from the beginning.
FIELD: You just accurately described the state of the world in recent weeks, because we've seen a lot of privacy breaches and we've seen a lot of people go crazy.
CAVOUKIAN: Yes. And it's happened so many times. I can tell you this. As I said, I've been doing this for a while, and you see it again and again. It's not that the harm that arose or the data breach was intentional in any way - of course not - but it arises through a disregard for embedding privacy interests into the design. I call this the year of the engineer because I'm speaking almost exclusively to engineering audiences and software designers to give them this message. They have been extremely receptive. It's never that they say, "No, we're not interested. Go away." They say, "No one has raised this with us before. It's never been on our radar." Why should it be on an engineer's radar? Their job is different. They're coding. They're code writers; they're designers; they're designing functionality into new products and services. You need some of the privacy people, who hopefully also exist in the company, to speak to them and to be part of that management group that is speaking to the engineers and giving them their design specs.
Now, one of the things we're doing in my office is operationalizing Privacy by Design. As I said, there are these seven foundational principles, and we're taking them and translating them into code. How would this appear in code, in engineering speak and computer speak? Because what is a policy, basically? What are principles? They're rules. You do this and you don't do that, and if this, then that. They can essentially be translated into the language that engineers understand, and then they can easily be embedded into design and functionality.
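Cavoukian's point that principles are "if this, then that" rules can be made concrete with a small sketch. The rule table below is entirely illustrative - the principle names echo Privacy by Design's foundational principles, but this encoding is an assumption for the example, not her office's actual operationalization:

```python
# Sketch: privacy principles expressed as "if condition, then action" rules
# that can be evaluated mechanically against a proposed data flow.
# The rule table and the flow dictionary's keys are illustrative only.

RULES = [
    # (principle, condition on a proposed data flow, required action)
    ("Privacy as the Default",
     lambda flow: flow.get("opt_in") is None,
     "default to the most privacy-protective setting"),
    ("Purpose Limitation",
     lambda flow: flow["purpose"] != flow["collected_for"],
     "seek fresh consent before this secondary use"),
    ("Data Minimization",
     lambda flow: flow["fields_requested"] > flow["fields_needed"],
     "drop fields not needed for the stated purpose"),
]

def evaluate(flow):
    """Return (principle, required action) pairs the flow triggers."""
    return [(name, action) for name, cond, action in RULES if cond(flow)]
```

An engineer never needs to interpret the policy text: a flow either trips a rule and gets a concrete required action, or it passes cleanly.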
And this applies across industries. It can apply in healthcare, in various law-making functions, business and regulatory agencies, banking and government. This applies across the board. We're working with people in various sectors to ensure that we make this accessible in their language. And if we haven't done that yet, we'll do it next because we want to help as many different areas as possible so that they can also operationalize this directly into the areas of their business.
Global Privacy Trends
FIELD: Given the incidents that we've seen in the threat landscape, what are the global privacy trends that really concern you the most today?

CAVOUKIAN: You've seen the enormous increase in hacking and access to data. Let me just flag two concerns. The first is a response to all the external hacking that's taking place: organizations and companies are tending to secure their perimeter. They're putting up the external firewalls. They're protecting the data they hold inside from external, third-party attacks from the outside. Of course you should do that, so I'm not suggesting otherwise. But sometimes they're doing this to the exclusion of strengthening the security provisions on the inside. What they forget is that, in cases of identity theft, the biggest breaches and the biggest problems arise from the inside - inside jobs done by rogue employees. That's who you have to be concerned about, because they have access to personally identifiable data on the inside. The data isn't encrypted, so it sits in plain text for anyone to access. The inside rogue employee can access this data and sell it or do whatever they want, and there's your identity theft case - and it can grow very large. You have to protect both the inside and the outside, which you would think goes without saying.
Increasingly, we've also noticed that the obvious place to start with much of this is to encrypt your data so that you don't have it sitting there in plain text. When it's encrypted and it's in cipher text, at least for the casual rogue employee, it's going to be much harder for them to try to take this data, steal it and do whatever they want with it. So it does present a significant deterrent. A lot of people aren't strong there. If you look at a lot of the cases in the recent past, had they encrypted their data, a lot of the breaches that were reported would have been nonexistent. It wouldn't have mattered. The data wouldn't have been accessible without the keys.
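As a rough illustration of the deterrent Cavoukian describes, the sketch below stores a record as ciphertext so that anyone casually browsing the datastore sees only unreadable bytes. The cipher here is a toy XOR keystream built from SHA-256, used only so the example is self-contained with the standard library; a real system must use a vetted implementation (for instance AES-GCM from a maintained cryptography package), never hand-rolled crypto:

```python
import hashlib
import secrets

# Toy illustration of "encrypted at rest": without the key, the stored
# bytes are unreadable. This XOR keystream derived from SHA-256 is for
# demonstration ONLY, not a secure cipher.

def _keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic byte stream of the given length from the key."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream.
    return bytes(a ^ b for a, b in
                 zip(plaintext, _keystream(key, len(plaintext))))

decrypt = encrypt  # XOR with the same keystream is its own inverse

key = secrets.token_bytes(32)
stored = encrypt(key, b"sensitive customer record")
# On disk, `stored` is ciphertext; a rogue employee browsing the
# datastore sees opaque bytes, not the record itself.
```

Had breached datastores held only ciphertext like `stored`, the exposed records would have been unreadable without the keys - which is exactly the point Cavoukian makes about many reported breaches.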
Now, the other thing I'd like to point out is that increasingly you've got a lot of biometric data: your fingerprints, facial recognition technology, things of that nature. Biometrics is becoming very, very popular. You have to be very careful how you use your biometric data and who it's accessible to. Because if you have biometric data in plain text - if it's not encrypted, or biometrically encrypted, which is even better - then when a problem arises and there's a crime or something, law enforcement may come knocking at your door. They want access to your biometric data because they have biometrics in their possession - people's fingerprints, etc. - and they want a match. But that's a purpose that was never intended; the data was collected for administrative purposes to begin with. Be very careful of the growth of biometric data as well. And the last area, of course, is information going into the cloud.
Your audience may be interested in knowing that Privacy by Design was made an international standard last year, in October, in Jerusalem. Once a year there's an annual conference of international privacy commissioners and data protection regulators, usually held in Europe. Last year Israel hosted it, and we unanimously passed an international resolution making Privacy by Design an international standard. This is now being adopted worldwide, not only in Canada and the EU. The FTC has made it one of its three recommended practices. And one of the bills recently introduced in the U.S., a commercial privacy bill of rights by Senators Kerry and McCain, has the language of Privacy by Design directly in it for the first time.
FIELD: And a final question for you. For organizations that are looking to improve their privacy policies and their enforcement, where is a good place to begin?
CAVOUKIAN: Of course, I'm going to send you to our website. We have two websites. If you go to PrivacyByDesign.ca, you'll find everything associated with Privacy by Design. But if, as an organization, you want to improve your current privacy practices, there's a wonderful website in the States, the Privacy Rights Clearinghouse. They're fabulous. They have wonderful information on how to avoid cases of identity theft, and they do an outstanding job.