
A Regulator's Perspective on Privacy by Design

Ann Cavoukian, Ph.D.
Information & Privacy Commissioner of Ontario, Canada


Introduction


Ira Rubinstein and Nathaniel Good[1] recently argued that the approach to Privacy by Design (PbD) adopted by regulators has not been translated into engineering and usability principles and practices.

With due respect, I disagree. They argued that Privacy by Design, when properly conceived, requires translation of the universal privacy principles for the handling of personal data established by Fair Information Practices (FIPs)[2] into engineering and usability language. On that we agree. This is precisely what my office has been engaged in for some time.[3]

Their critique of current regulatory practice concerning FIPs and PbD is twofold. The first criticism is conceptual. The authors argue that FIPs and the current approach to PbD are based on the "notice and choice" model of privacy, but should be extended to address "social dynamics," namely the types of violations that users experience while using a social networking platform such as Facebook. The second criticism is practical. The authors are concerned with what it actually means to "design privacy." They consider design in two ways: back-end software implementations (i.e., hidden from the user) and front-end user interfaces (i.e., privacy user settings, notification, user consent, etc.).




[1] Rubinstein, I. and Good, N. (2012). "Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents," Berkeley Technology Law Journal, Forthcoming; NYU School of Law, Public Law Research Paper No. 12-43.
[2] FIPs were first codified in OECD (1980), OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, available online at http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html. There are a number of articulations of FIPs, including the Canadian Standards Association Privacy Code, the Asia-Pacific Economic Co-operation (APEC) Privacy Framework, the U.S. Safe Harbor Principles and the Global Privacy Standard.
[3] Cavoukian, A. (2012). Operationalizing Privacy by Design: From Rhetoric to Reality, Office of the Information and Privacy Commissioner, Ontario, Canada, available online at http://www.ipc.on.ca/english/Resources/Discussion-Papers/Discussion-Papers-Summary/?id=1254



Privacy by Design should, in the authors' view, be analyzed using two complementary perspectives. The first of these is privacy engineering, which refers to design and implementation, while the second is usable privacy design, which focuses on human-computer interaction (HCI) research. Rubinstein and Good draw upon the writings of Irwin Altman, a social psychologist who viewed privacy as an "interpersonal boundary process," and Helen Nissenbaum, who regards privacy in terms of contextual integrity (the observation that the norms governing the flow of information vary by specific social contexts). The authors then illustrate the application of their approach with reference to ten recent privacy incidents involving Google and Facebook. They ask what these companies could have done differently to have protected user privacy at the time of the incident had they adopted Privacy by Design.

As a social psychologist myself, I find the authors' paper intriguing, but as a privacy regulator, I have concerns with their approach. When I created Privacy by Design, now internationally recognized as the gold standard in privacy protection,[4] I thought its extension beyond FIPs was clear. And so I was perplexed by the authors' misunderstanding of PbD as a general reflection of FIPs; this is totally off base. While PbD certainly incorporates FIPs, it goes far beyond them, considerably raising the privacy bar. For example, the first four principles of PbD are not reflected in FIPs and provide much greater protection.

It should also be noted that PbD was never intended to apply strictly to software development. The "Design" in Privacy by Design proposes a broad approach to expressions of privacy in a variety of settings: information technology, accountable business practices, operational processes, physical design and networked infrastructure. Coming at a time when many were questioning the adequacy of FIPs, and when unparalleled rates of technological advance were facilitating ubiquitous computing and mass data storage, PbD and its Principles heralded a modern view of privacy protection. Our confidence in the PbD approach is not simply a "common sense belief that privacy [will] improve if firms 'design in' privacy … rather than 'bolting it on' at the end."[5] It is rooted in my office's experience and in the results reported by the organizations that we have worked with since the 1990s. Confidence in PbD is also reflected globally, in its having been unanimously approved as an international standard for privacy protection by the International Assembly of Privacy Commissioners and Data Protection Authorities in 2010.

[4] On October 29, 2010, Dr. Ann Cavoukian's concept of "Privacy by Design" was unanimously adopted at the 32nd annual International Conference of Data Protection and Privacy Commissioners, a worldwide assembly of regulators, in what has been described as a "landmark" resolution regarding Privacy by Design.
[5] Supra n. 1 at p. 4.

Equally important, organizations that have adopted a Privacy by Design framework have also reported that it has reduced their development costs. The requirement to protect personally identifying information (PII), particularly for those operating internationally, makes it especially important, when building new systems, to do it right the first time. Few organizations can afford the time and expense associated with having to later rework their systems to be compliant with privacy requirements.

The Essence of Privacy: the relevance of "control"


Maintaining control over one's personal information, expressed so well in the German constitution as informational self-determination, is fundamental to properly understanding the essence of privacy. The authors argue that FIPs rest on the premise of individual control and thus "seem highly unsuited to address a new class of privacy risks associated with social media and Web 2.0 services."[6] The authors question whether the 7 Foundational Principles of PbD "are of greater assistance than the FIPs."[7] Since the origins of FIPs pre-date contemporary Web applications, it is not surprising that they may be considered lacking. It does not follow, however, that this invalidates either the control paradigm or the efficacy of the PbD Principles.

The authors assert that when usability experts analyze the privacy implications of user interfaces, they do not turn to FIPs as a source of understanding, but instead turn to the work of Altman and Nissenbaum. Altman is a social psychologist who studied personal space and territoriality, and who conceptualized privacy as a dynamic process of negotiating personal boundaries in intersubjective relationships.[8] Nissenbaum is a philosopher of technology who truly understands privacy in terms of norms governing distinct social contexts, a framework she refers to as contextual integrity.[9] Altman's work, undertaken in the mid-1970s, pre-dates the Internet and Social Networking Sites (SNSs). It concerns itself with social/interpersonal privacy in a non-web world. A great deal has changed since then. But most important, social/interpersonal privacy (e.g. expectations of privacy with respect to one's personal relationships) differs from informational privacy, or data protection.

[6] Supra n. 1 at p. 14.
[7] Supra n. 1 at p. 6.
[8] Altman, I. (1975). The Environment and Social Behavior: Privacy, Personal Space, Territory and Crowding, Brooks/Cole Pub. Co.
[9] Nissenbaum, H. (2010). Privacy in Context, Stanford Law Books.

I agree that principles of social interaction, as described by Nissenbaum in particular, yield important insights when considering the privacy challenges posed by social media. To me, however, they inform our perspective on user control of PII, and in turn present compelling arguments for:

- Education: helping users to understand the privacy implications of their social media activities;
- Tools: enabling users to appreciate the scope of their social network and the impact of changes to their privacy settings;
- Empowerment: the extent to which organizations provide intuitive, yet powerful, controls for users to manage their own personal information.

The work of Altman and especially Nissenbaum can (and should) inform regulatory analysis. In my view, however, while context is critical to privacy, and existing views of privacy will need to evolve to address the user-generated issues raised by SNSs and Web 2.0 services, control will remain the cornerstone of informational privacy, or data protection. The contextual approach to privacy complements the empowerment of individuals to make their own distinct, personal choices regarding the dissemination of their PII, rather than precluding the decision-making capacity of individuals on the basis of a "what if" or counterfactual analysis.

PbD Principles: dismiss them at one's peril


One reason why regulators around the world are adopting PbD is that its Principles not only embrace FIPs, but extend them: when it comes to the protection of PII, they significantly raise the bar. It is therefore truly surprising that the authors casually dismiss the PbD Principles as merely "aspirational" and of no "greater assistance than the FIPs."[10] Did they actually review them? While the Principles themselves do not constitute a regulatory framework, they are, nonetheless, powerful when invoked in a thoughtful and serious manner. Rather than "stopping far short of offering any design guidance,"[11] in my experience, having worked with dozens of organizations committed to their implementation, the Principles have enabled regulators, software engineers and privacy professionals to identify and recognize the qualities that privacy-protective systems must embody. In the words of one engineer, "… I have heard of Dr. Cavoukian and the PbD movement, but I had never been exposed to any details. The details were amazing, and I like the 7 Foundational Principles…. These are sound principles that make a lot of sense."[12] Ultimately, they offer considerable guidance. One need look no further than the first four Foundational Principles to understand how PbD builds on, but significantly exceeds, FIPs:

1. Proactive not Reactive; Preventative not Remedial: this principle is crucial to the essence of PbD. Within organizations that have embraced it, one observes a commitment to set and enforce high standards of privacy. They develop processes and methods to recognize poor privacy design, to anticipate poor privacy practices and to correct negative impacts as quickly as possible. Most important, they lead with strong privacy deliverables anticipated right from the outset. The goal is to prevent the privacy harm from arising.

2. Privacy as the Default Setting: speaking both as a social psychologist and as a regulator, I have observed the power of the default: "the default rules." The authors themselves seem to concur on this point in their discussion of Feigenbaum's work regarding "customizable privacy" within the context of an ideal approach to Digital Rights Management.[13] Whatever setting is automatically offered within a given system, that is the setting that will prevail. Accordingly, we would like privacy to be featured as the default (a brief illustrative sketch follows this list).

3. Privacy Embedded into Design: this is the ultimate goal, offered through the development and implementation of a systematic program to ensure the thorough integration of privacy within all operations. For example, if privacy is embedded in the architecture of an IT system, into its very code, other privacy assurances will be far more likely. Such a program should be standards-based and amenable to review and validation.

4. Full Functionality (Positive-Sum, not Zero-Sum): this principle perhaps represents PbD's greatest strength, and perhaps its greatest challenge, but it is far from being "unrealistic."[14] In fact, I take particular objection to the authors' suggestion that businesses do not care about privacy requirements in the face of business needs. This is not the case across the board. Smart business leaders realize that privacy is inherently good for business. Successful sales professionals understand that, all things being equal, people buy from organizations that they like and, most important, trust. Marketing professionals like Seth Godin, the creator of "permission-based marketing," recognize the long-term value of customers who have volunteered to participate in marketing campaigns.[15] In addition to helping avoid breaches, positive-sum approaches encourage innovative solutions to business challenges (which translates into a competitive advantage), helping both to retain existing customers and to attract new ones. Far from encountering resistance, virtually every organization that I have met with has readily embraced this Principle, regardless of the additional effort required.
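To make the second principle concrete, the following is a minimal, purely illustrative sketch, my own hypothetical example rather than anything drawn from the Rubinstein and Good paper or from any particular product, of what "privacy as the default setting" can look like in code: every privacy-relevant option starts in its most protective state, and anything more permissive requires an explicit, user-initiated choice.

    # Hypothetical sketch of "Privacy as the Default Setting": the most
    # protective values are what a brand-new user gets with no configuration.
    from dataclasses import dataclass

    @dataclass
    class PrivacySettings:
        profile_visibility: str = "private"   # not "public" by default
        location_sharing: bool = False        # off unless explicitly enabled
        ad_personalization: bool = False      # no profiling by default
        data_retention_days: int = 30         # minimal retention by default

        def opt_in(self, setting: str, value) -> None:
            """Record an explicit, user-initiated change away from the default."""
            if not hasattr(self, setting):
                raise ValueError(f"unknown setting: {setting}")
            setattr(self, setting, value)

    # A new account is privacy-protective without the user doing anything.
    settings = PrivacySettings()
    assert settings.location_sharing is False

    # Sharing more than the default requires a deliberate act by the user.
    settings.opt_in("profile_visibility", "friends")

The design point is simply that the burden of action falls on widening disclosure, never on narrowing it.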

I agree with the authors regarding the importance of user interface design. Principle 7 (Respect for User Privacy: Keep it User-Centric) deals with precisely this area and should not be dismissed as a mere "summing up of the earlier Principles."[16] Like the authors, I respect the value of front-end design and its importance in illustrating and satisfying the user's privacy expectations. It is critical to ensure that a "user-centred design seeks to develop software and software interfaces that are focused around end-user goals, needs, wants and constraints."[17] If being proactive represents the essence of PbD, then respect for users is among its primary motivations.

[10] Supra n. 1 at p. 6.
[11] Supra n. 1 at p. 6.
[12] Burton, C. (2012). "International OASIS Cloud Symposium," available online at http://blogs.kuppingercole.com/burton/2012/10/17/2012-international-oasis-cloud-symposium-2/
[13] Supra n. 1 at p. 21.
[14] Supra n. 1 at p. 6.
[15] Godin, S. (1999). Permission Marketing: Turning Strangers Into Friends and Friends Into Customers, Simon and Schuster.
[16] Supra n. 1 at p. 7.
[17] Supra n. 1 at p. 32.


Operationalizing PbD: A Broad Spectrum of Current Applications


The authors rightly point out that regulators "must do more than merely recommend the adoption and implementation of Privacy by Design."[18] I couldn't agree more. We have tried to do just that over the years. As useful as the Principles may be, I agree that much work remains to be done to make them more broadly actionable. The development of clear, specific guidelines for applying the Principles, as well as providing oversight of PbD-based implementations, is indeed necessary.

Application


Organizations have begun to undertake significant PbD-based implementations. The first step is to build a wide range of experience and then assess lessons learned; we have attempted to do just that. Working with a diverse range of organizations, we have documented PbD implementations in nine different application areas, including:

1. Surveillance cameras in mass transit systems[19]
2. Biometrics used in casinos and gaming facilities[20]
3. Smart Meters and the Smart Grid[21]
4. Mobile Communications[22]
5. RFIDs[23]
6. Near Field Communications[24]
7. Redesigning IP Geolocation[25]
8. Remote Home Health Care[26]
9. Big Data[27]

[18] Supra n. 1 at p. 67.
[19] Cavoukian, A. (2008). Privacy and Video Surveillance in Mass Transit Systems: A Special Investigation Report MC07-68, Office of the Information and Privacy Commissioner, Ontario, Canada, available online at http://www.ipc.on.ca/images/Findings/mc07-68-ttc_592396093750.pdf
[20] Cavoukian, A. (2010). Privacy-Protective Facial Recognition: Biometric Encryption Proof of Concept, Office of the Information & Privacy Commissioner of Ontario, available online at http://www.ipc.on.ca/English/Resources/Discussion-Papers/Discussion-Papers-Summary/?id=1000
[21] Cavoukian, A. (2011). Operationalizing Privacy by Design: The Ontario Smart Grid Case Study, Office of the Information & Privacy Commissioner of Ontario, available online at http://www.ipc.on.ca/english/Resources/Discussion-Papers/Discussion-Papers-Summary/?id=1037
[22] Cavoukian, A., and Prosch, M. (2010). The Roadmap for Privacy by Design in Mobile Communications: A Practical Tool for Developers, Service Providers, and Users, Office of the Information & Privacy Commissioner of Ontario, available online at http://www.ipc.on.ca/images/Resources/pbd-asu-mobile.pdf

Engineering


To emphasize the important role that engineering plays in privacy protection, I called 2011 "The Year of the Engineer." I spoke almost exclusively to groups of engineers, programmers, developers and code-writers around the world (2,000 engineers at Adobe's Annual Tech Summit alone) and was delighted by their response to Privacy by Design and the 7 Foundational Principles! Their affirmation of the value of PbD and its "doability" convinced me that we were proceeding in the right direction. No one said that PbD couldn't be done from an engineering perspective; data protection could clearly be embedded into the design of information technologies, systems and architecture.

I recently released a new paper, entitled "Operationalizing Privacy by Design: A Guide to Implementing Strong Privacy Practices," that outlines the process of "systematizing and summarizing the design principles of PbD."[28] In addition, I am co-chairing a Technical Committee (TC) of the standards body OASIS (the Organization for the Advancement of Structured Information Standards) with Dr. Dawn Jutla, professor of engineering at St. Mary's University and winner of the prestigious U.S. World Technology Award (IT Software). The TC is called PbD-SE (Privacy by Design Documentation for Software Engineers) and is intended to develop concrete standards for PbD in software engineering. Interested parties are invited to join.

[23] Office of the Information & Privacy Commissioner of Ontario. (2006). Practical Tips for Implementing RFID Information Systems (RFID Privacy Guidelines), available online at http://www.ipc.on.ca/images/Resources/up-rfidtips.pdf
[24] Cavoukian, A. (2011). Mobile Near Field Communications (NFC) "Tap 'n Go": Keep it Secure and Private, Office of the Information and Privacy Commissioner, Ontario, Canada, available online at http://www.ipc.on.ca/images/Resources/mobile-nfc.pdf
[25] Ibid.
[26] Cavoukian, A., and Rossos, P.G. (2009). Personal Health Information: A Practical Tool for Physicians Transitioning from Paper-Based Records to Electronic Health Records, Office of the Information & Privacy Commissioner of Ontario, available online at http://www.ipc.on.ca/images/Resources/phipa-toolforphysicians.pdf
[27] Cavoukian, A., and Jonas, J. (2012). Privacy by Design in the Age of Big Data, Information and Privacy Commissioner of Ontario, available online at http://www.ipc.on.ca/images/Resources/pbd-big_data.pdf
[28] Supra n. 3.

Education and training

I am also pleased to note the work of my valued colleagues, who are also operationalizing PbD through their development of educational programs and materials:

1. Professor Daniel Solove has just released an excellent new training module (www.teachprivacy.com), which discusses how to implement Privacy by Design and targets software engineers as well as the designers of programs and services.

2. At Carnegie Mellon University, Professors Lorrie Faith Cranor and Norman Sadeh have developed a new Master's program in "Privacy Engineering." A major element of the program is a PbD "learn-by-doing" component.

These initiatives represent concrete steps taken toward operationalizing PbD and making its implementation part of the default rules for the next generation of privacy professionals, who will be tasked with responding to the new privacy challenges that we will invariably be facing.

Conclusion

As a regulator, I am required to apply the law and regulations in a consistent manner. The principles that I advocate must necessarily be applicable to the broadest possible constituency, allowing organizations to find their own solutions. The research reflected in Rubinstein and Good's paper draws from a narrower selection of cases. The authors introduce interesting ideas through an analysis of ten cases drawn from two organizations. But this provides neither the insight that developers and application owners need, nor the broad set of guidelines that would be useful in a regulatory context.

The authors have made a valuable contribution insofar as they push the boundaries of traditional approaches to thinking about privacy. Issues such as cultural norms, social expectations and situational ethics are increasingly important considerations. While I welcome the authors' scholarly contribution to research on Privacy by Design, much more work is necessary. I urge readers to compare the Foundational Principles of Privacy by Design with the privacy principles contained in Fair Information Practices: they are not one and the same. While the latter are subsumed under PbD, Privacy by Design significantly raises the privacy bar, which will be needed in the future world of Web 2.0, 3.0, and beyond.