List of Streams - version 19 February 2013


Page 1 of 41 (version 19-02-2013)




List of Streams - version 19 February 2013 .......... 1
Stream 1 - SNS and data protection .......... 5
Data protection and SNSs - a tricky relationship .......... 5
Dariusz Adamski, University of Wroclaw, dadamski@prawo.uni.wroc.pl .......... 5
Illusion of personal data as no one's property: reframing the data protection discourse .......... 5
Nadezda Purtova, University of Groningen, n.n.purtova@rug.nl .......... 5
Nudging to privacy - The "Privacy by Default/Design" approach in relation to Social Network Services .......... 9
Max-R. Ulbricht, Technical University of Berlin, max-robert.ulbricht@tu-berlin.de .......... 9


Stream 2 - Online Privacy Across Borders .......... 11
Regulating online privacy across borders .......... 11
Anabela Susana de Sousa Gonçalves, Law School - University of Minho, asgoncalves@direito.uminho.pt .......... 11
Impacts of the proposed European General Data Protection Regulation (GDPR) to the Association of Southeast Asian Nations' (ASEAN) data privacy regime .......... 13
Noriswadi Ismail, Intellectual Property, Internet and Media Research Centre, Brunel Law School, Brunel University, London, United Kingdom, Noriswadi.ismail@brunel.ac.uk .......... 13
Tsunami, timidity or temerity? Recent developments in privacy and data protection law in ASEAN. .......... 15
Li Hua Chew, Centre for Legal Pluralism and Legal Affairs, Faculty of Law, University of Malaya, janetchew@um.edu.my .......... 15
Stream 3 - The Right to be forgotten and Internet privacy .......... 16
Right to Be Forgotten. The Fundamental Right of the Person or the Danger of the "Ministry of Truth" .......... 16
Wojciech Rafal Wiewiórowski, Inspector General for the Protection of Personal Data; University of Gdansk, Faculty of Law and Administration .......... 16
How Much Oblivion Is There to the Right of Oblivion? .......... 16
Oleksandr Pastukhov, University of Malta, oleksandr.pastukhov@um.edu.mt .......... 16
Stream 4 - Consent in data protection regulation (1) .......... 18
Consent: The Third Objective of EU Data Protection Law? .......... 18
Orla Lynskey, London School of Economics, o.lynskey@lse.ac.uk .......... 18
Personal Data and consent in the Big Data context: Issues today and challenges ahead .......... 18
Aaron Ceross, Rijksuniversiteit Groningen, a.ceross@rug.nl .......... 18
Lost in E-government: does the DPR provide adequate safeguards for civilians? .......... 19
A.M. Klingenberg, Rijksuniversiteit Groningen, a.m.klingenberg@rug.nl .......... 19


Dynamic Consent - A Model for Controlling the Use of Personal Information .......... 21
Jane Kaye, Edgar Whitley, Dave Lund, University of Oxford, jane.kaye@law.ox.ac.uk .......... 21
Stream 5 - Multiple Dimensions of internet privacy .......... 22
Understanding internet privacy: overall dynamics and socio-cultural differences .......... 22
Luciano d'Andrea, Laboratorio di scienze della cittadinanza (LSC), luciano.dandrea@scienzecittadinanza.org .......... 22
Employer and employee privacy related dilemmas in social media: reflections from Estonian practice .......... 22
Seili Suder and Andra Siibak, University of Tartu, andra.siibak@ut.ee .......... 22
Everyday I'm life-logging: privacy and data protection issues in ubiquitous individual data collecting and sharing .......... 24
Matěj Myška, Jaromír Šavelka, Masaryk University, School of Law, Institute of Law and Technology, matej.myska@law.muni.cz .......... 24
EU Cloud Flagships are sinking? Recent developments in the field of Cloud Computing at the level of the European Union. .......... 25
Alicja Gniewek, SnT, University of Luxembourg, alicja.gniewek@uni.lu .......... 25
Stream 6 - Surveillance and Social Networks .......... 27
The Surveillance of Social Networking and the Social Value of Privacy .......... 27
Colin Bennett, Christopher Parsons, Adam Molnar, University of Victoria, BC, Canada, cjb@uvic.ca .......... 27
Privacy of communication and the leviathanian technology advances .......... 28
Jonida Milaj-Weishaar, University of Groningen, j.milaj-weishaar@rug.nl .......... 28
Between the international conflicting priorities of Cyber Security, Copyright and Data Protection .......... 30
Philipp E. Fischer, SuiGeneris Consulting, pfischer@suigeneris-consulting.com .......... 30


Stream 7 - Children and Online Privacy .......... 31
Cyberbullying: a change in the concept of privacy? .......... 31
Albert Verheij, University of Groningen, a.j.verheij@rug.nl .......... 31
Privacy policies as tools to improve consent - Do they really work for children? .......... 32
Federica Casarosa, European University Institute, federica.casarosa@eui.eu .......... 32
Stream 8 - Consent in data protection regulation (2) .......... 35
Two proposals that go beyond the data subject's consent, respecting online privacy rights .......... 35
Luca Bolognini, Italian Institute for Privacy, lucabolognini@istitutoitalianoprivacy.it .......... 35
Paper on legitimate interests of the data controller (LIDC): is LIDC a viable alternative to data subject's consent? .......... 36
Paolo Balboni, Rosario Imperiali, Daniel Cooper, Milda Macenaite, European Privacy Association, pbalboni@europeanprivacy.eu .......... 36
Behavioural Targeting. How to Regulate? .......... 38
Frederik Zuiderveen Borgesius, Institute for Information Law, University of Amsterdam, f.j.zuiderveenborgesius@uva.nl .......... 38
Stream 9 - Alternatives to current approaches .......... 39
Information and Communication Flow and Deposit Control .......... 39
Filip Petrinec, Thomas Lenhard, Michal Gregus, Faculty of Management, Comenius University in Bratislava, michal.gregus@fm.uniba.sk .......... 39
TRUST-EX - object-oriented approach to on-line privacy .......... 39
Augustin Mrazik, Jan Domankus, Lukas A. Mrazik, eGov Systems s.r.o., Bratislava, Slovakia, augustin.mrazik@eGovSystems.sk .......... 39





Stream 1 - SNS and data protection

Data protection and SNSs - a tricky relationship

Dariusz Adamski, University of Wroclaw, dadamski@prawo.uni.wroc.pl

The European data protection system has never tried to reconcile regulation with market forces. Quite the opposite: it has assumed that markets tend to malfunction and must be corrected by administrative measures. The emergence of SNSs has largely been considered one more justification for (tightened) administrative intervention, pursuant to the basic paradigm according to which more powerful market processes may only be corrected by stiffer administrative action. This paper will, however, offer a different perspective on the relationship between data protection and SNSs. It will, first, use recent research (the 2012 Cisco Connected World Technology Report in particular) to highlight the significance of SNSs for the young generation. Then, partly on the basis of the same research and partly on the experience of the Polish market, it will explain the role of market forces behind SNSs and how destructive an impact the data protection regime may have on those forces. In the next step the author will discuss some findings of the Consent project, which make it rather clear that younger users of SNSs in particular are aware of the role market forces play in the evolution of SNSs and that they accept the concomitant trade-offs. More importantly, they rationally tend to draw the line between acceptable and unacceptable practices of SNSs at unfair commercial practices. A recent Polish case of a photographer questioning Facebook's terms on jurisdiction before the Polish competition authority will illustrate the argument. The paper will wrap up with the conclusion that none of these factors has been properly recognized by the extant (bureaucratic, costly and largely redundant) data protection systems. It may be even less so in case the Data Protection Regulation is adopted.


Illusion of personal data as no one's property: reframing the data protection discourse

Nadezda Purtova, University of Groningen, n.n.purtova@rug.nl

'I fear that the ghost exists,' said Lord Canterville, smiling [...]. 'It has been well known for three centuries, since 1584 in fact, and always makes its appearance before the death of any member of our family.'

'Well, so does the family doctor for that matter, Lord Canterville. But there is no such thing, sir, as a ghost, and I guess the laws of Nature are not going to be suspended for the British aristocracy.'[1]

Oscar Wilde, The Canterville Ghost

Background:

[1] Oscar Wilde, The Canterville Ghost: A Hylo-Idealistic Romance, Branden Books, 1970, p. 5.



As new technological, social, and market developments, such as cloud computing, social network sites, and the overall proliferation of personal data processing, call for yet another re-examination of the principal foundations of European data protection, this paper offers an alternative analytical framework instrumental to a better understanding of the options lying before policy-makers.

The main thesis of the paper is that the dichotomy 'individual ownership of personal data' vs. 'personal data as a public domain' is false. The alternative dichotomy, more consistent with the realities of data processing and the economic analysis of the concept of property, is 'individual ownership in personal data' or 'collective ownership of data controllers'. Put differently, if the policy-maker does not assign ownership in personal data and (a degree of) control over personal data disclosure and use to the data subject, this effectively results in granting collective ownership of personal data to, among others, the information industry.

The debate around property rights in personal data in both the European and American privacy discourse is much like the plot of a ghost novel: at the onset everybody talks about it, but few believe in it, until it is no longer possible to deny its presence. Proposals to introduce property rights in personal data emerged in the United States as early as the 1970s,[2] and have been the subject of academic discussion, at times less intensive, ever since, with European scholars joining the debate in the early 2000s. Roughly, one part of the 'propertization' camp reasoned that the law should acknowledge the de facto commodification of personal data that occurred as a result of the switch to behavioural marketing and made personal data the 'new oil' of the modern economy; the other part considered propertization mainly as a means of giving back to the individual control over data pertaining to him.[3] Some arguments have been made against propertization, a predominant anti-propertization argument being that informational privacy is a common good and that propertization facilitating market exchange would not be able to secure it.[4] In response, other scholars have offered property models consistent with, and arguably enhancing, informational privacy.[5]


Next to the information industry drawing its value from access to personal information[6] and claiming property rights in consumer profiles and databases,[7] other business models have emerged claiming to help individuals, for a fee, to manage and reassert their 'ownership of personal data.'[8] Personal data markets and property rights in personal data have become popular subjects of academic research. While data protection and informational privacy conferences previously included academic papers on propertization only occasionally, in 2012-2013 in Europe alone the growing interest in the topic is signified by several large conferences and workshops dedicated solely or in large part to the issues of data markets and the economic value of personal data.[9] Some of this research is EU-funded.[10]

[2] See, e.g., Alan F. Westin, Privacy and Freedom (London, Sydney, Toronto: The Bodley Head, 1967).
[3] See, e.g., Edward J. Janger, Muddy Property: Generating and Protecting Information Privacy Norms in Bankruptcy, 44 Wm. & Mary L. Rev. 1801, 1808 (2003), etc.
[4] E.g. Regan, Priscilla. 2002. Privacy as a common good in the digital world. Information, Communication & Society, Vol. 5, No. 3, and the literature referred to by Regan.
[5] In the US: Paul Schwartz, 2004. Property, Privacy, and Personal Data; James Rule, Privacy in Peril: How We Are Sacrificing a Fundamental Right in Exchange for Security and Convenience (Oxford: Oxford University Press, 2007); Lawrence Lessig, Code 2.0 (New York: Basic Books, 2006), a new edition of Lawrence Lessig, Code and Other Laws of Cyberspace (New York: Basic Books, 1999); Janger, Muddy Property, etc.
[6] E.g. Finger, Richard, 2012. Facebook: What It's Really Worth. Forbes, 30 October 2012, available online at <http://www.forbes.com/sites/richardfinger/2012/10/30/facebook-whats-it-really-worth/>.
[7] BBC News. Instagram seeks right to sell access to photos to advertisers, 18 December 2012, available online at <http://www.bbc.co.uk/news/technology-20767537?print=true>.
[8] E.g. personal.com (a US-based company offering privacy-respectful management of personal data on behalf of data subjects), commodify.us (a similar European start-up).


However, despite these developments, to the author's best knowledge, no jurisdiction in either the US or Europe has adopted or comprehensively considered the option of introducing property rights in personal data. The notions of economic value and ownership of personal data have become routine in the realities of data processing practices and in data protection and information systems scholarship.[11] Paradoxically, the preparatory documents released on 25 January 2012 in connection with the announced EU data protection reform (the Proposal for a Regulation on the protection of individuals with regard to the processing of personal data and the free movement of such data[12] and the accompanying Commission reports, Communication and Impact Assessment) do not contain any considerations regarding property in personal data. Reportedly, member states discussed the idea of propertization at some point, but the discussion stumbled over the unconventional nature of personal data as an object of property rights and led to no conclusive results mentioned in the Reform documents.[13]

The claim of this article is that one ought not to delay or avoid resolving the issue of property in personal data in the context of the current data processing realities. (Quasi-)property rights in personal data should be given to the individual. This follows from the following theses. Maintaining that personal data is res nullius, nobody's property in the 'public domain', is an illusion not viable in the information-driven economy. Even more so, maintaining a status quo where no ownership in personal data is formally assigned equals assigning ownership to the information industry and leaving the individual defenceless in the face of corporate power eroding his autonomy, privacy and right to informational self-determination. In this respect, this paper submits that the Proposal for a new data protection regulation is a missed chance to change the status quo and meaningfully strengthen the position of the individual against the information industry, and therefore favours the latter.

Relevance for the context of social network sites:

[9] APC2012 in October 2012 in Amsterdam; "For your eyes only", International privacy conference in Brussels in November 2012; Track on Personal data markets within the framework of the 2013 European Information Systems Conference, etc.
[10] See summary of the workshop "Economic Value of Personal Information" at <http://www.apc2012.org/content/economics-personal-information-0>.
[11] A Google Scholar search on the words "value of personal information" results in 256 hits, and on "ownership of personal data" and "ownership of personal information" about 300 hits.
[12] COM(2012)11 final (hereinafter referred to as "the Proposal").
[13] This follows from the (only) account given by prof. Dommering in a book review written after the Proposal had been published: "[D]e discussie over deze aanpak bij de voorgenomen wijziging van de privacyrichtlijn tussen de lidstaten is gevoerd, maar, naar verluidt, is verzand in de objectdiscussie." ("This discussion was conducted among the member states but, reportedly, got bogged down in the discussion on the object [of property rights - NP]") (E.J. Dommering, Property Rights in Personal Data: A European Perspective. Proefschrift van mr. N. Purtova. Tijdschrift Maandblad voor Vermogensrecht 1 2012). This book review is, to the author's best knowledge, the only publicly available account signifying that any discussion on possible propertization of personal data took place in the course of the reform deliberations. No further details are available on the timing of this discussion and on whether, and if so why, the idea of propertization was rejected by the EU policy maker.
rejected by the EU policy maker are available.



The context of social network sites illustrates the thesis of this paper well. Social network sites' business models[14] are aimed at extracting economic value from the personal data of their users.[15] The providers of online services claim ownership (or an alternative 'license') in the users' personal data either directly, in their privacy or user policies, or indirectly, by disposing of the data in question at their discretion: e.g. granting access to it to various advertising companies, leaving the individual user no option to exclude third parties (other than other network users) from access to and (commercial) use of their data.[16]



Roadmap of the argument:

First, the paper will specify what it understands by 'property rights' and ownership. The former will be defined in both economic and legal terms: Barzel explains that property rights in economics are 'the individual's ability, in expected terms, to consume the good (or the services of the asset) directly or to consume it indirectly through exchange';[17] and legal property rights are the economic property rights granted state protection.[18] The rationale of both legal and economic property rights is to exclude others from a resource. Simultaneously, if a holder of an entitlement can actually consume the 'good' in question, the absence of a legal property right in an object does not exclude the de facto existence of an economic property right. Economic property rights, even without the recognition of the law, may also be self-enforced.

Second, the paper will review the classification of 'goods' in law and economics offered by, among others, Ostrom and Hess:[19] commons, common pool resources, and common property goods (unexcludable) versus private and club goods (excludable). The argument will be made that although a single data collector cannot exclude other data collectors from obtaining the same pieces of personal data from data subjects or other sources, the information industry as a whole, in the context of the current data processing practices, excludes data subjects from the full enjoyment of their personal data, and therefore the latter cannot be regarded as a common good.

Subsequently, the article will review available empirical research on the allocation of property rights, e.g. a theory developed by Umbeck, to support the thesis that, where a resource has economic value and public assignment of entitlements in the resource is absent or unenforced, private actors will distribute entitlements in the resource proportionate to the ability of each actor to apply 'force', i.e. enforce his claimed property right.[20]

[14] With the exception of a small number of non-profit oriented social networks.
[15] The Facebook and Instagram cases referred to earlier.
[16] E.g. the most recent Facebook privacy settings account only for the possibility of denying access to individuals, via Facebook itself or search engines, rather than to businesses.
[17] Alchian & Allen, Exchange and Production, 2nd edn (1977), p. 114, cited in Barzel, Economic Analysis of Property Rights, p. 3.
[18] Alchian & Allen, Exchange and Production, 2nd edn (1977), p. 114, cited in Barzel, Economic Analysis of Property Rights, p. 3.
[19] Hess, Charlotte and Ostrom, Elinor. 2003. Ideas, Artifacts, and Facilities: Information as a Common Pool Resource. Law and Contemporary Problems. Vol. 66, Winter/Spring 2003, Nos. 1 and 2.
[20] Umbeck, John. 1981. Might makes rights: A theory of the formation and initial distribution of property rights. Economic Inquiry. Vol. XIX. January 1981.



As a final part of the argument, the paper will analyse the current 1995 Data Protection Directive and the 2012 proposal for the General Data Protection Regulation to conclude that a clear allocation of entitlements in personal data is absent in both documents. The implications of such a regulatory gap for the principle of informational self-determination will be analysed.


Nudging to privacy - The "Privacy by Default/Design" approach in relation to Social Network Services

Max-R. Ulbricht, Technical University of Berlin, max-robert.ulbricht@tu-berlin.de

In January 2012 the European Commission published a proposal for a legal regulation with regard to privacy and data protection. This regulation is intended to harmonize the different data protection laws of the member states, which are all different implementations of the rules of Directive 95/46/EC. Because these rules take the form of a directive, the member states have the freedom to implement them differently, according to their own needs but within a given frame. This results in divergences in the enforcement of these rules, so there is a need to harmonize them. Article 23 of the proposal contains a set of four rules entitled "data protection by design and default". These rules should provide a basis for requiring data controllers to meet the rules of the regulation right in the design and implementation phase of systems for data processing, and ensure that the rights of the data subject are protected. Furthermore, all implemented procedures have to be designed in such a way that only the data necessary for a specific purpose are processed. Moreover, these procedures have to ensure that by default personal data are not made accessible to an undefined number of individuals.

In relation to Social Network Services these rules have to be discussed critically. First of all, the purpose of the data processing on these platforms cannot be clearly specified. If the platform is used for the purpose of communication only, different data have to be processed and stored than if it is used for some kind of identity management or self-presentation. The users of these platforms thus have different requirements as to what the platform itself should do with their data to meet their needs, and from this perspective a specific purpose for data processing is hard to define. Moreover, if it is the intention of a user to communicate with a public audience, all of his/her data have to be accessible by an undefined number of individuals. So Social Network Services can be seen as an infrastructure where the user has to decide who can have access to his/her personal data stored on the platform, when, and under what circumstances. On the other hand, legal regulations are needed to ensure that the providers of those platforms process personal data only in the ways the user expects. To that end, some kind of design specification should be enacted to require the platform provider to build the infrastructure in a way that lets users easily understand and control the usage of their data. The design specification should follow the idea of "soft paternalism". This idea assumes that not every individual has the ability to recognize all possible options in a specific situation and to choose the one that best fits his or her needs. The user interface of Social Network Services should be designed in a manner that prompts the user to reflect on the accessibility of newly entered data and gives him all the instruments for making that decision right at the moment of entering new data. Against this background the term "privacy by default" in relation to Social Network Services shall be discussed. Furthermore, an idea of a framework for the technical implementation of that approach shall be given, describing possibilities for designing user interfaces in order to support the legal regulations.
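The "privacy by default" behaviour described above can be made concrete with a small sketch: newly entered data starts at the most restrictive visibility, and making it more widely accessible requires an explicit choice by the user at the moment of entry. This is an illustrative sketch only; the names (`Visibility`, `Post`, `publish`) are invented for this example and do not belong to any real SNS platform.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Visibility(Enum):
    PRIVATE = "private"   # only the data subject
    FRIENDS = "friends"   # explicitly connected users
    PUBLIC = "public"     # an undefined number of individuals


@dataclass
class Post:
    text: str
    # Privacy by default: the most restrictive setting applies
    # unless the user actively decides otherwise.
    visibility: Visibility = Visibility.PRIVATE


def publish(text: str, chosen: Optional[Visibility] = None) -> Post:
    """Create a post, applying the user's explicit choice if one was made.

    In a real interface, the absence of a choice would trigger a prompt
    at the moment of data entry ("soft paternalism"); here, absent a
    choice, the data simply stays PRIVATE instead of inheriting a
    permissive platform-wide default.
    """
    return Post(text=text, visibility=chosen or Visibility.PRIVATE)
```

Under this scheme, `publish("hello")` yields a post visible only to the data subject; exposing data to an undefined audience requires the explicit `publish("hello", Visibility.PUBLIC)`.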






Stream 2 - Online Privacy Across Borders

Regulating online privacy across borders

Anabela Susana de Sousa Gonçalves, Law School - University of Minho, asgoncalves@direito.uminho.pt


The right to privacy is a significant notion of European law, and the development of this area is related to decisive historical factors. The importance of the right to privacy to the Member States is clear in Article 8 of the European Convention on Human Rights, Article 8 of the Charter of Fundamental Rights of the European Union and Article 16(1) of the Treaty on the Functioning of the European Union. Seventeen years ago, Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data tried to ensure effective protection of the fundamental right to privacy. Directive 95/46/EC established a harmonized system based on three principles (transparency, legitimate purpose and proportionality), because it was clear back then that the differences in data protection legislation between Member States were an obstacle to the free flow of data and the development of the internal market.

Cross-border data transfer became regular and was facilitated by the evolution of technology. The advent of the internet and its features as a global system of interconnected networks, characterized by a diffuse and global nature, has made it easy to spread information widely across borders, and has simplified the establishment of contacts and the exchange of data.

The protection level of Directive 95/46/EC is applicable when the establishment of the controller is situated in the European Union. Article 4 establishes that the law applicable to the processing of personal data is the law of the Member State where the establishment of the controller is situated, for processing carried out in the context of that establishment's activities. Where the controller is established on the territory of several Member States, each establishment should comply with the obligations laid down by the national law of its location. The protection level of the Directive is also applicable where the controller is not established on Community territory but uses equipment situated on the territory of a Member State to process personal data: the applicable law will be the law of that country. However, the differences in personal data protection between the national laws of the Member States are a reality and have introduced distortions into the internal market.
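The Article 4 choice-of-law rule summarized above is, in structure, a small decision procedure. The following Python sketch is purely illustrative: the `Controller` class, the `applicable_laws` function and the abbreviated country list are invented for this example, and a real Article 4 analysis turns on the facts of each processing operation, not on a country-code lookup.

```python
from dataclasses import dataclass
from typing import List, Optional, Set

# Abbreviated for illustration; the real list contains all Member States.
EU_MEMBER_STATES = {"PT", "NL", "DE", "PL"}


@dataclass
class Controller:
    establishments: List[str]                 # countries where the controller is established
    equipment_location: Optional[str] = None  # where its processing equipment is situated


def applicable_laws(controller: Controller) -> Set[str]:
    """Return the national data protection laws the controller must observe,
    following the structure of Article 4 of Directive 95/46/EC."""
    eu_establishments = {c for c in controller.establishments if c in EU_MEMBER_STATES}
    if eu_establishments:
        # Law of each Member State where an establishment is situated:
        # each establishment complies with the law of its own location.
        return eu_establishments
    if controller.equipment_location in EU_MEMBER_STATES:
        # Controller outside Community territory but using equipment
        # situated in a Member State.
        return {controller.equipment_location}
    # The Directive's level of protection is not applicable.
    return set()
```

For instance, in this sketch a controller established only in the US but processing data on equipment in the Netherlands would fall under Dutch law, while a controller with establishments in Portugal and the Netherlands would have to comply with both national laws.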

The proposed European Union Data Protection Regulation tries to eliminate the differences in the
level of protection of the
right to privacy

inside the European Union, in order to implement legal
certainty, eliminate distortions
of competition and to create conditions of trust between economic
operators and individuals that may allow digital economy to develop. Simultaneously, the proposed
European Union Data Protection Regulation enlarges the scope of the European Union data
prot
ection law to extend the application of European Union protection standards. The Regulation
proposal maintains the application of the European Union law protection when the establishment of
a controller or a processor that process personal data in the cont
ext of their activities is situated in a
Member State. However the notion of establishment is defined in a broader sense, to encompass the
effective and real exercise of activity through stable arrangements, disregarding the legal form of
those arrangement
s. Therefore, it expands the application of the Regulation proposal regime.
Furthermore, Regulation proposal expands the application of the regulation legal protection to
foreign controllers (not established in the European Union) that process personal da
ta of data
subjects residing in a European Union, when the processing activities are related to the offering of
goods or services to such data subjects in the European Union or the monitoring of their behavior. Of

course, it is necessary to determine the concept of 'activities related to the offering of goods or services' or 'activities related to the monitoring of the behaviour of European Union residents', but this is a way of widening the protection of European Union residents in a globalized world. This is important because the Regulation proposal establishes a set of principles relating to personal data processing, rights of the data subjects, and obligations of controllers and processors that will be uniform across the European Union, as well as the right of any person to receive compensation if that person has suffered damage as a result of an unlawful processing operation or an action incompatible with the Regulation's rules.

To an economic agent or to an individual it is important to know their rights in situations of cross-border violation of data protection. While the Regulation proposal establishes the right to compensation for damages resulting from the violation of the rights and obligations it sets out, it has gaps: it contains no rules about the nature and assessment of damage or the remedy claimed, the measures which a court may take to prevent or terminate injury or damage or to ensure the provision of compensation, the transfer of a right to claim damages or a remedy, the persons entitled to compensation for damages sustained personally, the extinction of the obligation of compensation, or rules of prescription or limitation. Regulation (EC) No 864/2007 of 11 July 2007 on the law applicable to non-contractual obligations (Rome II) determines the applicable law in cross-border non-contractual obligations arising out of tort/delict, and the applicable law shall rule all those issues. However, according to Article 1 section 2 (g), non-contractual obligations arising out of violation of privacy and rights relating to personality, including defamation, are excluded from the material scope of the Rome II Regulation. A literal interpretation of the Article would mean that, despite the uniform rules of the proposed Data Protection Regulation, including the right to claim compensation for unlawful actions, all those issues for which the Regulation proposal does not provide would be ruled by national conflict-of-law rules, and similar situations could have different solutions. As a result, the same action incompatible with the Regulation proposal and done across Europe could be resolved differently through the application of different national conflict-of-law rules to issues absent from the Regulation proposal.

The easy solution is the revision of the Rome II Regulation and the inclusion of a rule about violations of privacy and personality rights, as the European Parliament has already pointed out. Until then, we can make a historical and systematic interpretation of the rules of the Rome II Regulation. It is clear from the preparatory works of the Rome II Regulation that the exclusion of «non-contractual obligations arising out of violation of privacy and rights relating to personality, including defamation» was the result of the lack of consensus between the Parliament and the Commission. The Commission's initial proposal had a special rule for personality rights that was criticized by the media lobby, which feared facing liability claims under alien legal systems and the limitation of freedom of the press. The final text went too far and beyond the intention of the legislator: instead of excluding only the torts committed by the media (the source of the lack of consensus), it excluded «non-contractual obligations arising out of violation of privacy and rights relating to personality, including defamation». But it is not clear that all personality rights are excluded. Disclosure of confidential data is a violation of personality rights and may be important in unfair competition (ruled by Article 6 of Rome II) and within intellectual property rights (ruled by Article 8 of Rome II): in that context Rome II should apply. Recitals 17, 20 and 33 make reference to personality rights. For example, Recital 20 states that one of the objectives of the conflict-of-law rule relating to product liability is the protection of consumers' health, which is the protection of a personality right. Because it is not clear that all personality rights are excluded from Rome II, we propose a restrictive interpretation of the exclusion of Article 1 section 2 (g) of the Rome II Regulation, applying it only to the torts committed by the media, according to a systematic argument and because we think that the text of the Article goes beyond the interest that the exclusion was aiming to protect: the media and the freedom of the press. As a consequence, for all the issues not ruled by the future Data Protection Regulation, the applicable law

should be determined by the uniform conflict-of-law rules of the Rome II Regulation, and that should lead to the application of the same law to similar questions.


Impacts of the proposed European General Data Protection Regulation (GDPR) to the Association of Southeast Asian Nations' (ASEAN) data privacy regime

Noriswadi Ismail, Intellectual Property, Internet and Media Research Centre, Brunel Law School, Brunel University, London, United Kingdom, Noriswadi.ismail@brunel.ac.uk

The proposed GDPR has been the 'eye-candy' and, arguably, the most colourful data protection topic of 2012, thanks to the strong leadership and effort of the European Commission (EC) and its members. Since its introduction, the European Union (EU) member states and stakeholders have had mixed views, reservations and oppositions. On the one hand, the GDPR is deemed to be prescriptive. On the other hand, the GDPR is highly principle-based, which leaves the stakeholders to self-lead, regulate and comply. Whilst this status quo is ongoing, there has been minimal discussion analysing and appraising the impacts of the GDPR on ASEAN. The author therefore attempts to set out the substantial impacts of the GDPR on the ASEAN data privacy regime along four strands. Firstly, the paper analyses the impact of data transfers to third countries (a category into which ASEAN falls) and to what extent the related Articles are considered better than the present Data Protection Directive (DPD) 95/46/EC. Although the reform aims to simplify data transfers outside the EU, this paper argues that such a 'simplified' approach to transfers to ASEAN should be made cautiously, through a coordinated approach. This means ASEAN member states should be able to establish a 'regional commitment approach' with the EU member states by acknowledging the proposed GDPR approach, to be mirrored in their omnibus data protection legislation and sector-specific legislation. In order to achieve this, the soon-to-be-formed ASEAN Economic Community (AEC) is the ideal platform that may be able to formalise such a commitment. This, however, can only be formalised by 2015, when the AEC is formed. As at 1st December 2012, Malaysia, the Philippines and Singapore are the ASEAN member states that have passed omnibus data protection legislation. The remaining member states are drafting similar legislation and reviewing their sector-specific legislation. Of these, Malaysia is the only ASEAN member state that offers a comprehensive provision on data transfer (almost similar to the DPD 95/46/EC). Given the embryonic and premature landscape of data protection laws in ASEAN, the author extends the argument by suggesting that such rules on data transfer from the European Economic Area (EEA) to ASEAN, and vice versa, should be regionally self-regulated, backed up by an independent oversight body, or, alternatively, follow the present approach led by the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules (CBPR). Secondly, the author appraises the concept of third countries' 'establishment' in the EEA (in this context, the ASEAN data controller, data processor and sub-data processor). Amongst others, the third countries' establishment shall also be subject to a fine of 2% of global turnover in the event of non-compliance. This proposed sanction regime reflects a mirror with two faces. The first face metaphorically refers to the 'no-choice option' for an ASEAN establishment to strengthen data protection compliance in the EEA. If the status quo (of the GDPR) remains, these establishments must be able to invest substantially by having a Data Protection Officer being the key contact that

leads the compliance aspects of the GDPR. The second face, however, refers to potential divestment by ASEAN establishments in the EEA. This shall only happen if cost, bureaucracy and excessive rules prevail. Although this may be speculative, the author nonetheless argues that, in the interest of the European Digital Economy agenda, a balanced test applicable to third-country establishments should be considered, particularly for ASEAN. This is because some ASEAN establishments are government-controlled and, due to the complex structure of their governance and shareholdings, certain ASEAN member states' data protection legislation exempts the government/public sectors. The Philippines is the only member state that extends its application to both the private and public sectors. If such an ASEAN establishment of the former kind is within the EEA, the author proposes that a higher standard of partial exemption should be considered. Thirdly, the GDPR sends a strong message to ASEAN member states to consider the issue of adequacy. Such a future adequacy determination by the EC should be appropriately prioritised. At this juncture, although it may be quite premature to determine the level of adequacy, the author nonetheless argues that ASEAN should devise a strategic blueprint to embark on this aim. There is no country in Asia (except for Australia, demographically within the Asia-Pacific) which has achieved the status of 'adequate' or of a 'safe country' in the eyes of the EC. On this ground, a region-based solution between the AEC and the EC should be considered. This, however, will take time for ASEAN, due to two contributing factors. First, ASEAN's hybrid legal system, which shapes the region's governance, is very complex. It has mixed influences of the English Common Law, French, German, Spanish and Dutch Civil Laws, Customary Laws, Syariah Laws, as well as special administrative state and regional laws. These subsets of the national private laws are segmented and arguably do not interact with data protection laws, regulations, instruments, guidance and best practices. A reform to modernise these laws takes time, subject to the country's leadership and governance. Although ASEAN is very strong in harmonisation via mutual cooperation, assistance and bilateral agreements between and amongst the member states, the author argues that the issue of adequacy of data protection between the GDPR and ASEAN's data privacy regime must not be left out of future negotiations. Second, there will be divided preference between the GDPR versus APEC and the Organisation for Economic Co-operation and Development (OECD). This is because certain ASEAN member states are also members of APEC, which adopts the APEC Privacy Principles. Some member states adopt the best practices recommended by the OECD's Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. These adopted principles, concomitantly, accelerate harmonisation not only within ASEAN, but also beyond. Whilst these data protection principles (generally) mirror the present DPD 95/46/EC, their highly influential, advisory and persuasive status may not be fully reflected within the ASEAN member states' legislation, due to political uncertainty in some member states. Lastly, the GDPR's regulatory tone aims to beef up coordinated enforcement between the Data Protection Authorities (DPAs) of the member states and beyond. In the long term, ASEAN data protection regulators and its sector-specific regulators will have no choice but to accommodate this tone. The enforcement does not only apply to commercial transactions, but also to matters pertaining to criminal justice and mutual assistance. This could be done through coordinated enforcement within the AEC, by way of a self-regulation and co-governance model backed up by an independent oversight body. On this strand, the author argues that such coordinated data protection enforcement between the DPAs of ASEAN member states should be explicitly mentioned in the omnibus data protection legislation and sector-specific legislation. If and when necessary, and

subject to exemptions' criteria, the DPAs and the sector-specific regulators should minimise bureaucracy towards expediting enforcement action within ASEAN and beyond. This is a very herculean task for ASEAN, but a worthy exercise of leadership towards data protection governance.

Tsunami, timidity or temerity? Recent developments in privacy and data protection law in ASEAN.

Li Hua Chew (Nur Jaanah Binti Abdullah), Centre for Legal Pluralism and Legal Affairs, Faculty of Law, University of Malaya, janetchew@um.edu.my

If 2012 was a remarkable year for data protection law in Europe, perhaps especially so because of the reform package published by the European Commission on 25 January 2012, it was just as memorable in the ten members of the Association of South East Asian Nations (ASEAN). Three of its members finalized their new data protection laws in 2012 and brought them into force in quick succession: Malaysia in June 2012, the Philippines on 1 January 2013 and Singapore on 2 January 2013. After more than ten years of internal debate on the subject, Malaysia was the first country in ASEAN to bring its data protection law into force, but it did not necessarily religiously follow an established pattern, nor did it provide a complete template for its ASEAN neighbours. The Personal Data Protection Act 2010 was gazetted on 10 June 2010 and came into full effect in autumn 2012. Quite unusually, when compared to the prevalent European and North American approaches, the law on personal data protection in Malaysia came into being to regulate the processing of personal data in commercial transactions only. Almost incredibly, by the standards of almost every other data protection law across the rest of the world, the single largest users of personal data, namely the Federal and State governments, are exempt from the Act. The scope of and framework set up under the Act create gaping holes in the protection of citizens. Furthermore, they do little to allay concerns as to how the extensive usage of social networks and other user-generated content services compromises the right to privacy of individuals. As in other countries, the numerous SNS/UGC platforms likewise appeal to Malaysian consumers with their free services and user-friendly features. In Malaysia too, the seemingly simple registration for an account and the continuous usage of the SNS/UGC inevitably involve the collection and storage of data, including personal data. The paper will open with a comparative analysis of the main features of the new data protection laws in Singapore, the Philippines and Malaysia. It then seeks to explore the concept of consent under the Malaysian Personal Data Protection Act 2010 and how it may translate in real terms when there may be a possible breach of privacy by the data user or a third party. An evaluation of possible available remedies for the data subject is then undertaken to assess the effectiveness of the Act in protecting the personal data of individuals, especially in the context of user-generated content in an online environment.





Stream 3 - The Right to be forgotten and Internet privacy

Right to Be Forgotten. The Fundamental Right of the Person or the Danger of the “Ministry of Truth”

Wojciech Rafal Wiewiórowski, Inspector General for the Protection of Personal Data; University of Gdansk, Faculty of Law and Administration

There is no doubt in the European doctrine of law that information autonomy, understood as the right to control information about the person revealed to the public domain, is one of the fundamental features of the right to privacy. There is also general agreement that freedom of speech and the transparency of public activities shall also be protected in Europe. Even if the coexistence of these principles is not called “a conflict”, everybody agrees that it is not easy to balance them. The right to be forgotten also intervenes in this coexistence, as it enlarges the notion of consent and withdrawal of consent to an extent which may cause the danger that George Orwell described as the Ministry of Truth.

The speech will focus on “historical truth” where the notion “history” is not written with a capital “H”. It will rather take into consideration the right to be forgotten as a possible danger to “local” history and “local” truth. “While God forgives and the Internet does not”, the speech will examine the possibilities given by Article 17 of the Draft Regulation, which provides the data subject's right to be forgotten and to erasure. Agreeing that the person should have the right to have personal data concerning him/her rectified, and a 'right to be forgotten' where the retention of such data is not in compliance with the Regulation, we will try to assess what it means that data are no longer necessary in relation to the purposes for which they were collected or otherwise processed. When should data subjects have the right to withdraw their consent? We will also check whether Article 80 is the required defence for freedom of speech, journalistic purposes (while it is harder and harder to distinguish a journalist from an Internet author) or the purpose of artistic or literary expression.

To make the picture clearer, the speech will depict the case of Wolfgang Werlé and Manfred Lauber and their fight with the Wikimedia Foundation (recalling the Hamburg, Karlsruhe and New York judgments). It will also examine the case of Max Mosley v. Google, asking whether a possible win for Mosley would change the Internet as a whole, and if so, whether it would be a change for the better. Finally, it will present the other approach to the right to oblivion shown by the Norwegian Data Protection Authority in its project Slettmeg.no, which offers advice and guidance to people of all ages who find offending material about themselves on the Internet. Offending material might be photos published without permission, fake profiles on different Internet services, incorrect personal information or harassment.


How Much Oblivion Is There to the Right of Oblivion?

Oleksandr Pastukhov, University of Malta, oleksandr.pastukhov@um.edu.mt


The paper analyses the origins, nature and scope of the newly proposed right to be forgotten and to
erasure (or, as it is often referred to, the right to oblivion) as provided for by the draft EU Regulation

that would replace the Data Protection Directive. Dissenting from those who interpret the new right as the right to erase history, the paper starts by drawing lines of demarcation between the three different legal rights that have become known under the name of the 'right to oblivion': the right to oblivion of the judicial (criminal) past, the right to oblivion that can be derived from the already existing EU personal data protection rules, and the new, digital right of oblivion of the type created by Spanish courts. It continues by examining, one by one, the duties of the data controller under the new rule and the exceptions thereto. The paper concludes that none of the draft Regulation's provisions in their current reading can possibly be seen as creating a meaningful right to oblivion of the third type, and that only the draft Regulation's new and, as the paper argues, fundamentally wrong definition of personal data as “any information relating to a data subject” would give ground to suggesting otherwise.






Stream 4 - Consent in data protection regulation (1)

Consent: The Third Objective of EU Data Protection Law?

Orla Lynskey, London School of Economics, o.lynskey@lse.ac.uk


According to the 1995 Data Protection Directive, EU data protection regulation pursues the dual aims of protecting fundamental rights, in particular privacy, and ensuring the creation and functioning of an internal market for personal data in the EU. The EU Charter of Fundamental Rights contains a right to data protection which sits alongside the right to privacy. The content of this right to data protection, in particular how it differs from the established right to privacy, has never been made explicit. This paper seeks to examine whether the protection offered by EU data protection regulation differs from that offered by the right to privacy as interpreted by the European Court of Human Rights. The conclusion is reached that although the rights to data protection and privacy overlap, data protection also pursues objectives additional to privacy protection and market integration. It is demonstrated, by reference to the case law of the Court of Justice of the EU, that there is some confusion regarding the precise scope and content of this newborn right to data protection. This paper therefore seeks to clarify its content. It argues that data protection regulation and the right to data protection pursue a third objective, independent of market integration and fundamental rights protection: the right to data protection also seeks to grant individuals control over their personal data. This third objective justifies the inclusion of rights such as the right to deletion in existing regulation, but also rights set forth in the 2012 Commission Proposal, such as the right to data portability. However, the precise consequences of recognising this aspect of the right to data protection remain potentially contentious. For instance, could it be claimed that, if data protection regulation aims to grant individuals control over their personal data, individuals should be able to do as they please with this data (including waive the rights granted to them by EU law)? The limits, as well as the manifestations, of this third objective will therefore be explored.


Personal Data and consent in the Big Data context: Issues today and challenges ahead

Aaron Ceross, Rijksuniversiteit Groningen, a.ceross@rug.nl

As society has embraced a more networked environment, individuals and organisations interact and perform tasks which create an immense pool of data, including personal details as well as behaviours. Previously, much of this data was either discarded or ignored due to maintenance costs and the technological limitations of analytical tools. Today, these limitations have been overcome, and the data can be utilised to provide intensely granular information about individuals or groups, thereby allowing for a degree of predictive analytics.

This Big Data concept has been hailed as a means by which organisations may gain deeper insight in order to make more efficient use of finances and resources, as well as better policy decisions. The tools

however collect an immense amount of personal and non-personal data in order to give these outcomes. This raises important questions as to where Big Data and predictive analytics fit in current European data protection and privacy models, especially in commercial and health applications. Commercial analytics have been used to market products and encourage customer loyalty by providing targeted ads and discounts, essentially turning personal data and habits into a commodity, unbeknownst to the individual in most cases. In healthcare, the drive not only to monitor habits but also to create large genetic databases to mine in order to better understand morbidity and mortality likewise treats personal data as a commodity.

The issue of informed choice thus takes on renewed importance, particularly when the data provided is being used to produce effects for individuals who may not be entirely aware of the process, as in commercial bargaining and in highly sensitive healthcare applications. This paper therefore focuses on examining this development and provides an overview of the European legal framework (both the current Directive and the proposed Regulation), with specific focus on scope and enforcement mechanisms, and addresses the fundamental question of where the individual's consent plays a role in this new analytical reality.


Lost in E-government: does the DPR provide adequate safeguards for civilians?

A.M. Klingenberg, Rijksuniversiteit Groningen, a.m.klingenberg@rug.nl

Digitalizing administrative procedures is a goal the government wants to achieve. The expectations are, among others, that digitalizing will increase efficiency and reduce costs for the government. While digitalizing, government bodies are processing personal data. This is necessary for compliance with either a legal obligation, or for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. At the same time, government is trying to improve its level of service by combining data which are collected for different purposes. On the one hand, it claims this is beneficial for citizens, as there is no need for them to provide information again and again to the government; this reduces the amount of time people have to spend sending information to government bodies. On the other hand, the benefit for government bodies is that, once they hold the data, the data can be used for all the public tasks the government body has. For data subjects, it is almost impossible to refuse to transfer personal data to government bodies. The consequence of a refusal to transfer data will be that applications for, e.g., a building permit will be denied, because the authority has too little information to make a well-balanced decision. This leads to the consequence that, in order to claim their rights, civilians are under an obligation to transfer their data. This makes the public sector different from the private sector, where consumers more or less have a choice whether to engage in business with a company. The Dutch government is drawing up a system of so-called 'basic registrations'. One of the basic assumptions in this system is that the data in the registries are 'right' and that all connected government bodies are obligated to use the data in the registration. This might be a good starting point; however, it can also lead to consequences as seen in the case of Romet v The Netherlands. In this case, the driving license of the applicant had been stolen in November 1995, and in March 1997 the applicant was issued with

a new driving license. In the intervening period, the Government Road Transport Agency registered 1,737 motor vehicles in the name of the applicant in the vehicle registration system. The registrations were effected upon presentation of the applicant's stolen driving license and without his consent. This resulted in large numbers of motor vehicle tax assessments, prosecutions under the Motor Liability Insurance Act, and fines in respect of offences committed with the cars. The Agency refused to retroactively cancel the registrations. According to the Agency, this would be detrimental to the reliability of the motor vehicle registration system, lead to legal uncertainty and entail the Agency's interference with the competencies of other authorities, e.g. the Public Prosecution Service or the Tax and Customs Administration, in that it could affect the legality of decisions which those authorities had made or might make on the basis of the contents of the motor vehicle registration. Dutch law at that time provided that a driving license could only be declared invalid once a replacement was requested. The Court took the view that the failure to invalidate the applicant's driving license as soon as he reported it missing, which made abuse of the applicant's identity by other persons possible, constituted an interference with the applicant's right to respect for his private life. The Court found that this interference was not necessary in a democratic society: from the day the applicant reported his driving license stolen, the authorities could not be unaware of the fact that whoever had the applicant's driving license was someone other than the applicant. Of course, the main problem for the applicant in this case is the result of an act that was not well considered (and was changed during this case). The main question in digitalizing government, however, is how to prevent this 'Kafkaesque' situation, where civilians can end up finding themselves lost in e-government registrations. In this paper, I am going to explore whether the draft General Data Protection Regulation (DPR) provides enough safeguards for citizens in this situation. Therefore, I first have to assess whether the DPR applies to this situation. Art. 2 DPR determines the material scope of the Regulation. Section 2, sub a, of this article provides that the Regulation does not apply to the processing of personal data in the course of an activity which falls outside the scope of Union law, in particular concerning national security. The same restriction on material scope is provided for in Art. 3(2) of Directive 95/46, which is at present the basis for data protection regulation within the EU member states. In case law based on this Directive, the ECJ decided, from § 42: 'Moreover, the applicability of Directive 95/46 to situations where there is no direct link with the exercise of the fundamental freedoms of movement guaranteed by the Treaty is confirmed by the wording of Article 3(1) of the directive, which defines its scope in very broad terms, not making the application of the rules on protection depend on whether the processing has an actual connection with freedom of movement between Member States (…).' Secondly, I am going to explore when processing is necessary in order to comply with a legal obligation or is carried out in the public interest. Are there any restrictions to these grounds? Or can governments decide to process data limitlessly, because government authorities are, by nature, always carrying out tasks in the public interest? Thirdly, and to a considerable extent, it is important that the rights of the data subjects are very well established. How far-reaching are these rights, and how far-reaching are the possibilities for governments to limit these rights on the ground of Art. 21 DPR? Finally, a conclusion will be drawn as to whether the rights of citizens are safeguarded in the DPR.




Dynamic Consent - A Model for Controlling the Use of Personal Information

Jane Kaye, Edgar Whitley, Dave Lund, University of Oxford, jane.kaye@law.ox.ac.uk

Broad consent has been adopted as a practical solution for biobanks, as it has been impossible to apply the requirements of informed consent as articulated in the Declaration of Helsinki. This is because consent to involvement in a biobank must be obtained before the biobank commences, when all of the researchers and research users are not yet known. Re-consent may be necessary, unless the information can be anonymised for use by researchers or a legal exemption is obtained. Broad consent is strongly contested in the bioethics literature. In the EnCoRe project we have built a patient IT interface which uses a 'dynamic consent' approach. In this model, consent is not a mere communication exercise but a bidirectional, ongoing, interactive process between patients and researchers. Through the interface, individuals can make and express preferences about the choices they are given concerning the use of their data and samples for research. The benefit of this interface is that it enables individuals to exercise their autonomy by giving informed consent for new types of research in real time, rather than being asked to give a broad consent at the beginning of the research process when they are recruited into a biobank. The benefits for the research process are that recruitment is easier, less costly and more efficient; the legal and ethical requirements of consent can be met with ease; there is greater transparency and accountability in the research process; and research findings can be returned to research participants as part of a personalised medicine approach. Dynamic consent has the potential to enhance patient confidence and enable long-term patient-researcher collaborations in research. This interface moves away from manual, paper-based processes to an e-governance system. We anticipate that the 'dynamic consent' interface will become an essential and sustainable component of research infrastructure and will further advance translational research initiatives. In this paper we present the dynamic consent model and show how it can be used as a tool for translational research and personalised medicine.







Stream 5 - Multiple Dimensions of internet privacy

Understanding internet privacy: overall dynamics and socio-cultural differences

Luciano d'Andrea, Laboratorio di scienze della cittadinanza (LSC), luciano.dandrea@scienzecittadinanza.org

The demand for protection and security in contemporary societies is strongly connected to the huge increase in social subjectivity, i.e., the capacity of people to generate new ideas, representations and models of action, freeing themselves, at least partially, from the influence of dominant social structures. While fostering social richness and diversification, social subjectivity also produces risks for people's identity and personal security. On the web, and especially on social networks, these broader dynamics involve both people's privacy orientation and their tendency towards self-disclosure, i.e., the propensity to express themselves and to reveal personal and even intimate details through words, images and videos. Perhaps for the first time, the mechanisms for the protection of privacy must deal not only with "outside" attempts to intrude on the privacy of individuals, but also with the tendency of these same individuals to self-disclose in more or less pronounced and intimate ways. The paper explores the relationships between privacy and self-disclosure, proposing a conceptual model and providing an analysis of the main trends in Europe based on both official statistics and data from an online survey carried out under the CONSENT project. National trends and differences by age and gender will also be analysed. Finally, some policy criteria aimed at balancing privacy and self-disclosure on the web will be discussed.


Employer and employee privacy related dilemmas in social media: reflections from Estonian practice

Seili Suder and Andra Siibak, University of Tartu, andra.siibak@ut.ee

This interdisciplinary presentation sets out to examine various ethical and legal dilemmas that present-day employers and employees face when communicating in "networked publics" (boyd 2008). We rely on the findings of a small-scale qualitative study with Estonian employees (Visamaa 2011) and real-life examples from practice for introducing the issues that have arisen on the topic. In recent years, the employment relationship has gained a new and pressing dimension: social networking site profiles, personal blogs and tweets have made private information easily accessible to the general public. In fact, the users of social media have still not quite grasped the idea of the omnopticon of social media (Jensen 2010), or the fact that our interactions on online platforms tend to be public-by-default and private-through-effort (boyd & Marwick 2011). Furthermore, due to the "context collapse" (Marwick & boyd 2010: 9) on social media, users are only slowly becoming aware of the fact that one's personal issues and statements made in a specific online context can now be visible not only to one's "ideal audience" (Marwick & boyd 2010), i.e. family and friends, but also to "nightmare readers" (Marwick & boyd 2010), i.e. employers, colleagues,




recruiters, clients, etc. Nevertheless, a writer of a personal blog or owner of a social media profile often still relies on the hope that the viewers of their private information share similar norms of contextual integrity (Nissenbaum 1998). These expectations, however, are not always met by businesses or guaranteed by law. In fact, present-day employees are more and more often disciplined, dismissed and discriminated against because of their behaviour in online social networks. Courts, lawmakers and law practitioners around the world are having trouble figuring out how to approach and solve the possible privacy issues concerning social media speech and the employment relationship. Semi-structured interviews with Estonian employers (N=9) indicate that employers have become accustomed to using social media, social networking sites in particular, for performing non-formal background checks on job applicants, for headhunting and in the context of employer branding (Visamaa 2011). Although the interviewed employers agreed that the use of online social networking sites for recruitment may have its threats and deficiencies, they were not very concerned about the legal and ethical effects such a practice may have on a company. In the USA, organizations generally have a legal right to access and monitor employees' online activities, especially for work-related reasons, and to evaluate job candidates on the basis of information gathered from social media profiles (Abril, Levin, Riego 2012). As a result of these rights, employee monitoring and surveillance has become a common practice in the USA. In recent years this practice has been hindered by the National Labor Relations Board, which guarantees employees the right to self-organization and to engage in concerted activities for the purpose of mutual aid or protection. Social media surveillance and broad social media policies can restrain employees from engaging in protected concerted activities and can therefore be unjustified and illegal. Matters are made more complex because the National Labor Relations Board does not protect employees whose conversation in social media does not relate to the terms and conditions of the employment and does not seek to involve other employees in issues related to employment. The European Union, on the other hand, tries to be more protective of employee privacy, but has yet to succeed when taking into account the development of the internet and the age of social media communication. As a result of rising awareness, the European Commission has focused its attention on modernizing the present legal framework, Directive 95/46/EC on the protection of individuals with regard to the processing of personal data. But here again lies the question of how to balance and implement employees' and employers' rights without going too far with the protection. In addition to employee rights and privacy, one also needs to acknowledge the fact that the security and reputation of a company can be hindered and jeopardized by one foolish comment made by an employee on social media. In recent years, there have been various cases in Estonia where inappropriate and insulting posts made by the employees of an organization on their personal SNS profiles have taken the form of a media scandal and have not only left a negative impact on the organization, but have also resulted in the employees getting fired (Wadowsky 2012) or being transferred to another job (Puuraid 2012). The Estonian Government Office (Government Communication Handbook 2011), together with various other organizations, has therefore started to form policies and compile guidebooks so as to regulate the situation. The presentation examines all the aforementioned dilemmas by exploring the practical issues that have arisen and analyzing how the interested parties are handling the situation.

References





Abril, P., Levin, A., Riego, A. (2012). Blurred Boundaries: Social Media Privacy and the Twenty-First-Century Employee. American Business Law Journal 49(1), 63-124.

Government Communication Handbook (2011). Tallinn: Government Office. https://valitsus.ee/UserFiles/valitsus/en/government-office/government-communication/Valitsuskommunikatsiooni%20k%C3%A4siraamat_ENG.pdf

Linaa Jensen, J. (2010). "The Internet omnopticon: Mutual surveillance in social media," paper presented at Internet Research 11.0: Sustainability, Participation, Action (Gothenburg, Sweden).

Marwick, A. E. & boyd, d. m. (2010). "I tweet honestly, I tweet passionately: Twitter users, context collapse, and the imagined audience," New Media & Society 13(1), pp. 114-133.

Nissenbaum, H. (1998). Protecting Privacy in the Information Age: The Problem of Privacy in Public. Law and Philosophy (17), 559-596.

Puuraid, P. (2012). Facebookis oma tööst rääkinud haiglaõde viidi teise osakonda. http://www.epl.ee/news/eesti/facebookis-oma-toost-raakinud-haiglaode-viidi-teise-osakonda.d?id=64666352

Visamaa, K. (2011). Veebipõhiste sotsiaalvõrgustike kasutamine töötajate värbamisel. Bakalaureusetöö. Ajakirjanduse ja kommunikatsiooni instituut. Tartu Ülikool.

Wadowsky, S. (2012). Anni Arro "Komeedi" töötajatest šokis: lõpetasin nendega töösuhte. http://publik.delfi.ee/news/inimesed/anni-arro-komeedi-tootajatest-sokis-lopetasin-nendega-toosuhte.d?id=64115091 (13.06.2012)


Everyday I'm life-logging: privacy and data protection issues in ubiquitous individual data collecting and sharing

Matěj Myška, Jaromír Šavelka, Masaryk University, School of Law, Institute of Law and Technology, matej.myska@law.muni.cz




Life-logging could be characterized as the ubiquitous capturing of the life of an individual. The idea of creating an organized, searchable, all-encompassing database of personal information was represented by Bush's famous Memex machine.[1] With the development of computing technology, the idea was further pursued from the year 1998 by Gordon Bell's Project MyLifeBits.[2] Nowadays, this rather time- and money-consuming experiment has become an everyday reality with the availability of self-tracking devices and applications that also enable the sharing of the amassed data via social networking services.[3] In this paper we firstly introduce and explore the historical development of life-logging. Next, based on the recent European Network and Information Security Agency report on life-logging, we briefly explore the psychological and sociological issues and risks relating to this activity.[4] Even though O'Hara et al. claim that "there is no rush into regulation",[5] the focus of this paper lies in the regulatory aspects of life-logging. We address the raised issues from the private and public perspective. The first subpart deals with the questions related to the protection of the individual and of other individuals that are being "drawn" into others'




lifelogs. In this context the proposed "right to be forgotten"[6] should be discussed, especially in relation to its practical feasibility. From the public perspective, life-logging also means a democratization of surveillance (and therefore it is sometimes dubbed "sousveillance"). Therefore the regulatory issues regarding the necessity of life-logging are sketched out. The created lifelog could potentially be an object of high interest for national law enforcement agencies, as it basically constitutes a sort of "private data retention".[7] Thus the questions of the obligatory creation of, and access of the state to, private lifelogs are presented. The article concludes with a general assessment of the concept of lifelogging against the contemporary EU data protection framework.

[1] BUSH, Vannevar. As we may think. Atlantic Monthly. 1945. Volume 176, p. 101-108. Available online at: http://www.theatlantic.com/magazine/archive/1945/07/as-we-may-think/303881/


[2] BELL, Gordon and GEMMELL, Jim. Foreword by Bill GATES. Total recall: how the E-memory revolution will change everything. 1. print. New York: Dutton, 2009. ISBN 05-259-5134-2.

[3] E.g. the GPS-enabled Memoto camera, which captures a photo every thirty seconds and enables sharing of it. http://www.memoto.com/. See also: Quantified Self. Available at: http://quantifiedself.com/about/


[4] EUROPEAN NETWORK AND INFORMATION SECURITY AGENCY. To log or not to log? Risks and benefits of emerging life-logging applications. Edited by DASKALA, Barbara. 2012. Available at: http://www.enisa.europa.eu/activities/risk-management/emerging-and-future-risk/deliverables/life-logging-risk-assessment/to-log-or-not-to-log-risks-and-benefits-of-emerging-life-logging-applications.

[5] O'HARA, Kieron, Mischa M. TUFFIELD and Nigel SHADBOLT. Lifelogging: Privacy and empowerment with memories for life. Identity in the Information Society. 2008, Vol. 1, No. 1, p. 155-172. ISSN 1876-0678. DOI: 10.1007/s12394-009-0008-4. Available at: http://www.springerlink.com/index/10.1007/s12394-009-0008-4. p. 169.

[6] As proposed in Art. 17 of the proposed Data protection regulation [2012/0011 (COD)].

[7] As currently effective in the EU pursuant to Directive 2006/24/EC.



EU Cloud Flagships are sinking? Recent developments in the field of Cloud Computing at the level of the European Union.

Alicja Gniewek, SnT, University of Luxembourg, alicja.gniewek@uni.lu


Cloud Computing (CC) is not commonly recognized as a new technology. It is a new paradigm that brings together and builds on previous developments such as grid computing and utility computing. CC has such unique qualities as location independence, virtualization and resource sharing. Services provided by means of Cloud Computing (e.g. SaaS, Software as a Service; PaaS, Platform as a Service; IaaS, Infrastructure as a Service) are available on a pay-as-you-go basis that allows for a decrease of ICT costs in public and private bodies and a faster time to market for SMEs and start-ups. Studies have shown that, on the one hand, the lack of security is the most common concern for Cloud users. On the




other hand, the lack of trust in Cloud services is the problem of Cloud providers. The European Commission has been working during the last two years towards a Cloud Strategy that would boost adoption of Cloud services in the European Union. It intends to realize the statement that Europe should be not only Cloud-friendly but also Cloud-active. As a consequence, the European Commission has recently published its Communication on its Cloud Computing Strategy, "Unleashing the Potential of Cloud Computing in Europe". Three key areas were recognized: contracts, standards and fragmentation of the digital market. The Commission has also adopted three key actions: in the field of contracts (model terms, standard contractual clauses as well as new rules on binding corporate rules and codes of conduct); standards (a map of the currently accessible standards and shaping of ICT technical specifications, EU-wide voluntary certification schemes and harmonized environmental metrics); as well as public procurement under the umbrella of the European Cloud Partnership. Arrangements in the fields of contracts and standards, as useful "technical" solutions, seem to address well the needs and expectations of the different actors unveiled in the Public Consultations conducted by the European Commission in 2011. The problem of fragmentation of the digital market is kept outside of the "key actions". The European Commission refers in this matter to the Digital Agenda Actions in the field of IP, e-commerce and finally the area of data protection law. This paper addresses the activities conducted by the European Commission in the field of European data protection law in the context of the Cloud Computing market. The current European Commission Communication has pointed to the Article 29 Working Party Opinion as the transitory document that should be used in interpreting the provisions of the Data Protection Directive. The paper compares the current Data Protection Directive (taking into account the above-mentioned Article 29 Working Party Opinion on Cloud Computing) with the Proposal for the General Data Protection Regulation. It examines the amendments of elements important for the Cloud Computing environment, such as basic definitions (e.g. data processor, data controller, and personal data), the balancing of the rights and obligations of the actors (e.g. assuring data security, notification requirements) as well as such concepts as privacy by design, the right to be forgotten and data portability. The paper compares the amendments with the opinions presented in the Public Consultations (e.g. lack of certainty, unclear rules on rights and obligations and diversified rules on data protection due to the different implementation measures adopted by the Member States). Moreover, it analyses the disadvantages as well as advantages of the draft Regulation recognized in the Opinion of the Committee of the Regions, the Opinion of the European Data Protection Supervisor and finally the Opinion of the European Economic and Social Committee. In conclusion, the paper sketches the trouble spots of the current as well as the proposed EU data protection framework. It provides an overview of the crucial elements of Cloud Computing and contrasts them with the legal proposals.






Stream 6 - Surveillance and Social Networks

The Surveillance of Social Networking and the Social Value of Privacy

Colin Bennett, Christopher Parsons, Adam Molnar, University of Victoria, BC, Canada, cjb@uvic.ca


This paper addresses the question of whether the evolution of an increasingly social web poses significant challenges to theories of informational privacy and, by extension, to the international and national legal systems based on such theories. The main objective is to determine how the practices of social networking websites and environments, whose raison d'être is the facilitation of the sharing of personal information about and by users, can be reconciled with prevailing and evolving definitions of the information privacy principles and with the existing policy regimes that are based on this framework. Social networking services have burgeoned in a relatively short time, but they are a complicated and poorly defined phenomenon, and vary according to a number of factors. Some are very broad in form and function (general social-networking environments) whereas others are more targeted and reflective of relationships in the non-virtual world (e.g. Classmates.com). Some sites are operated by discrete corporate entities such as Twitter, whereas others are integrated within larger corporate networks, as with Flickr and Yahoo! Some expect only interactions with real identities; others permit, or perhaps expect, pseudonymity. Social networking also varies according to the intensity of expected interactions: some services rely on constant networking and interactivity, while others are established for specific purposes and accessed only periodically. They also operate on a continuum that runs from social networking websites (Web 1.0) to social networking environments or hubs (Web 2.0) to the notion of the social semantic web or "Web-squared" that integrates socialization into the very infrastructure of the Internet through various standards and protocols. This chapter evaluates whether the broad distinction between data collectors and data users still holds. Specifically, it first investigates whether principles such as accountability, security, data quality, consent and individual access and correction are, and should be, implemented within social networking systems. It next looks at whether these principles, as applied through social networks' practices, maintain, blur, or dissolve the distinction between data collectors and users in the contemporary data mobilization economy. The chapter concludes by evaluating the strengths and limitations of approaches to privacy in the context of social networking: do traditional data protection principles still hold; how do more collectivist understandings of privacy (which are often presented as a solution to the limits of legal privacy instruments) fare; or are other theoretical approaches, such as those provided through critical analyses of surveillance, more appropriate to the inquiry at hand? The paper is based on ongoing documentary and interview research funded through the Social Sciences and Humanities Research Council of Canada and the Office of the Privacy Commissioner of Canada.






Privacy of communication and the leviathanian technology advances

Jonida Milaj-Weishaar, University of Groningen, j.milaj-weishaar@rug.nl


Access of households to computers and the internet is continuing to increase across Europe. According to Special Eurobarometer 381, by the end of the year 2011, 64% of the households on the old continent had access to a computer and the same percentage had access to the internet. In addition, 35% of EU citizens have access to the internet via their mobile phones. These percentages are quickly increasing from year to year (see for comparison the data from the year 2010 in Special Eurobarometer 335) and there is considerable country variation, from 89% in the Netherlands to 31% in Romania (Eurobarometer 381, p. 10). The increase in the use of the internet, the low prices and the easy access are slowly changing the ways in which individuals communicate with each other. Possibilities are created for audio or audio-video conversations via the internet, as well as for text-based communications via instant messaging (in chat rooms or social media) or via e-mails. The creation of new ways of communication creates new challenges for the national public authorities in charge of crime control. Since communication via the internet is increasing rapidly, the way investigations are carried out cannot lag behind the technology evolution, and we are going to face more and more cases of interception of internet communications. These interceptions must be done in conformity with the rules and principles for the protection of the privacy rights of individuals as regulated by article 8 ECHR and the case law of the European Court of Human Rights (the Court). When analyzing the provision of article 8 ECHR, it is apparent that the legislation is framed in a technology-neutral fashion. Such a legislative choice gives the law the possibility, via the means of interpretation, to cover new concrete situations that were not yet envisaged at the time the provision was drafted. Technology-neutral laws are especially important in those areas that have a direct link with the continuous development of technology, particularly since in those areas it is quite difficult, if not impossible, for the legal provisions to keep up to speed with the rapid technology developments. Although the Convention does not specifically mention types of surveillance systems, the Court's case law has dealt with different situations. Examples include telephone conversations (Klass v. Germany), telephone metering (Malone v. UK), voice recording (P.G. & J.H. v. UK), etc. The way the Court has adapted to new situations and brought them into the realm of application of article 8 ECHR shows the dynamic character of the Strasbourg case law and underlines its determination to interpret the Convention as a "living instrument" able to deal with new situations (Tyrer v. UK, para. 31; Dudgeon v. UK, para. 60; Soering v. UK, para. 102). It is not surprising therefore that the Court uses the same test in applying article 8 ECHR to interference by the public authorities with the internet communication of individuals (Copland v. UK, Liberty v. UK) as it has been applying in the classical cases of wiretapping of fixed-line telephone communications. Telephone tapping is considered by the Court in a number of judgments as an "interference by a public authority" with the exercise of an individual's right to respect for their "correspondence" and their "private life". Such interference is not allowed in accordance with article 8 ECHR, unless it is "in accordance with the law", pursues one or more of the legitimate aims referred to in paragraph 2 of the article and is necessary in a democratic society (Huvig v. France, para. 25). The right to privacy as established in article 8 ECHR is not an absolute right. The second paragraph of the article lists the limits that the legislator has designated for its application. The interference of the public authorities with the private life and the correspondence of individuals is possible and in accordance with the laws if it



is necessary in a democratic society and is counterbalanced by one of the following interests: national security, public safety, the economic well-being of the country, the prevention of disorder or crime, the protection of health or morals, or the protection of the rights and freedoms of others. The Court has further elaborated upon these criteria and has established a clear test to be used in cases of interference by a public authority with the rights of individuals. This paper looks at the criteria and conditions elaborated by the European Court of Human Rights on the lawfulness of wiretapping of telephone communications, and extends them to the new forms of communication via the internet. By focusing on the Court's interpretation of the legal rules, the paper assesses whether such an approach is sufficient to protect the right to privacy of individuals, despite current technology advances. The paper is divided into three parts. The first part is dedicated to a brief analysis of article 8 ECHR and the test developed by the Court for assessing the compatibility of situations of interference from public authorities with the provisions of the article. In the second part, the test is confronted with three special situations of interference: incidental recording of communications; tapping of communications by private individuals, party to the communication, with some form of involvement from the public authorities; and strategic surveillance. The first two situations have been chosen for being on the borderline between what is to be considered legal and illegal interference. The third case gives the possibility to assess the level of protection the Court has been giving to individuals in cases of mass surveillance. In the last part of the paper, conclusions will be drawn and an outlook will be given on the application of the findings in cases of interception of communications via the internet. By analyzing the case law on the interference of public authorities with private communications, the author analyses the legal test that the Court applies to distinguish between lawful interferences and violations of the right to privacy. The scope and extent of the legal test are examined in the context of two special situations: firstly, when an individual is instigated by the public authorities to record private communications without him or the authorities possessing a warrant (M.M. v. the Netherlands, A. v. France); and secondly, when the authorities incidentally record incriminating conversations of third parties without having due warrants (Kruslin v. France, Lambert v. France). In the age of internet communication, various forms of strategic surveillance have emerged. The potency of the new technology is awe-inspiring and easily surpasses the 'old' technology in terms of intrusiveness and efficacy. The question arises whether the legal test developed by the Court is adequate to safeguard our privacy in light of the ease with which internet communications can be intercepted and recorded and the speed with which these communications can be scanned for incriminating key words (Weber and Saravia v. Germany, Liberty v. UK, Kennedy v. UK). The paper finds that individuals are protected in cases in which their communication is recorded by individuals who acted in collaboration with the authorities. These cases are considered as interference without a mandate, and therefore fall within the prohibition of the first paragraph of article 8 ECHR. The protection is less strong in cases of incidental recording of communications. In these cases the risk is created that the interception of communication on the basis of a valid mandate for one person is extended to all persons making use of the same interception machinery. In cases of strategic monitoring, the protection of individuals is also found to be weak, since the Court does not enter into the merits of the national authorities' decision on the necessity of the intercepting measures and limits itself to checking the quality of the laws.






Between the international conflicting priorities of Cyber Security, Copyright and Data Protection

Philipp E. Fischer, SuiGeneris Consulting, pfischer@suigeneris-consulting.com


PIPA and SOPA are shelved for now, but what does the future hold? By vote of 4 July 2012 the European Parliament decided not to ratify ACTA. But IPRED2, TPPA, CETA and other four-letter regulation initiatives show that the end of ACTA is far from being the end of the contents of ACTA. Some apparently are still of the opinion that the failure of ACTA was an unhappy accident, but that its contents were actually correct. A legitimate concern of ensuring the enforcement of IP rights through these initiatives has to be acknowledged, but the right balance must be found between demands for the protection of IP rights and the rights to privacy and data protection. The means envisaged for strengthening the enforcement of IP rights must not come at the expense of the fundamental rights and freedoms of individuals to privacy and data protection, and of other rights such as the presumption of innocence and effective judicial protection. The aim of this paper is to bring some light into the contents of and relationships between the mentioned initiatives, starting with an analysis of the privacy-relevant provisions in ACTA. It will then describe the essential contents of IPRED2, CETA, TPPA and similar agreements. Although information about these contents is sparse and politicians are still undertaking secret negotiations, which made the European Parliament see red, this paper will then try to find a common denominator of their threats to privacy and data protection. The final section draws attention to future developments and achievable solutions. Until then, what I have named applies: "Copyright protection in the digital environment, please, yes, but not like this!"







Stream 7 - Children and Online Privacy

Cyberbullying: a change in the concept of privacy?

Albert Verheij, University of Groningen, a.j.verheij@rug.nl

Introduction. Cyberbullying is a general term for the harassment of someone by use of electronic media (email, SMS, social media). It encompasses both situations where cyberbully and victim know each other and where they are strangers to one another. Third-party participation in or knowledge of the bullying varies.

Research questions. This contribution focuses mainly on a specific category of cyberbullying, namely cyberbullying (i) among children that takes place (ii) in public (e.g. by means of social media such as Facebook). One child sending degrading messages (SMS, email) to another child without involving other persons falls outside the scope of this contribution. The question to be answered is twofold:

A. To what extent does the concept of privacy change as a result of cyberbullying (in general)?

B. What resources does private law (including self-regulation) provide against cyberbullying among children on social media?

In answering Question B use will be made of the Dutch experience so far. The aim is, however, to formulate tentative conclusions and lessons to be learned that are relevant beyond the Dutch context.

Cyberbullying and privacy (Question A). It will be investigated to what extent cyberbullying blurs the traditional boundaries between the private and the public sphere. Whereas in the past the victim was safe in his own house, nowadays he will be confronted with the bullying every time he is online. It could be argued that being online is a personal choice, but the question arises whether that is presently true for children. How big is the peer-group pressure to be active online? Can cyberbullying be regarded as a violation of privacy and, if so, to what extent does this notion of privacy correspond with the legal notion of privacy that developed in the offline world? See e.g. ECtHR 24 June 2004, no. 59320/00 (Von Hannover v. Germany), in which harassment in the offline world by reporters amounted to a violation of privacy (Art. 8 ECHR).

Private law resources against cyberbullying among children (Question B). Private law resources are to be broadly understood and include preventive action (e.g. in the general terms and conditions of social media, guidelines and codes of conduct of schools), remedial action (actions aimed at mitigating the damage caused by the bully), a legal obligation to pay damages, and court orders. The view is taken that, just as with offline bullying, many actors besides the bully play a role in the bullying with various degrees of activity or passivity: offline friends and classmates of the bully, online friends of the bully, parents of the bully, schoolteachers and ISPs. The aim is to analyze what preventive and remedial measures have been tested in practice and to what extent actors are under a legal duty to take such measures or to pay damages when they refrain from doing so. Codes of conduct and anti-bullying programs of schools and the general terms and conditions of ISPs and social media will be examined.





Privacy policies as tools to improve consent - do they really work for children?

Federica Casarosa, European University Institute, federica.casarosa@eui.eu

One increasing class of users of ICTs, and in particular of Internet-related services, is children and young people. Although youngsters are attracted to the Internet as an ideal playground, their attention is limited and they easily switch from one service to a new one as soon as they get bored. This has pushed Internet service and content providers to differentiate their products and services so as to retain as many users as possible for as long as possible. As a matter of fact, minors leave traces of their passage and provide information about themselves. Given the expanded possibilities to collect, organize and store thousands of data points, this can provide the material for databases of detailed profiles which can be used (directly or indirectly) for marketing purposes. Thus, websites can build in systems to help them monitor and understand children's preferences, so as to be able to tailor content and services to minors' identified interests, and to push those interests into specific buying trends through the way in which the content and the services are provided. In order to help individuals (in this case minors) understand how information will be used and what the consequences are for them, legislation both in Europe and in the US has imposed the inclusion of a privacy policy (or notice) that should provide information about the data processing and the data controller, but may also include further information such as data subjects' access rights or security arrangements.

The presentation will provide an analysis of some of the privacy policies available online, provided by companies that focus specifically on children and by social networking sites, to which young people are generally more and more attracted. The selection of privacy policies to be analyzed in this study has been based on two main criteria: the type of website and the legal system they refer to. On the one hand, the websites selected fall into three categories: websites targeting children directly (CW); websites for a general audience, such as online broadcasting channels, where children and young people are addressed only in a limited part of the website (GAW); and social networking websites (SNS). Although these categories are not so different in terms of services provided, their objectives are not exactly the same. In the first two categories brand building is the underlying objective. In this case, participation of minors, through personal data and on-site behavior, provides information about (young) consumers' tastes and preferences, which can help, for instance, in defining customized marketing techniques or different broadcasting strategies. In the third category, instead, the objective is the creation of an online identity which can be shared and enriched by the communication and participation of other users. In this latter case, the dissemination of personal data by users, be they adults or minors, is at the very basis of the website model, including not only personal data but also those data that are defined as 'sensitive' by the European Data Protection Directive. On the other hand, in order to emphasize differences and similarities among privacy policies, the selection of websites was also based on the nationality of the enterprises that created them, taking their main seat as the point of reference. The analysis addresses two types of elements: descriptive elements, which take into account the formal characteristics of the privacy policies available on the selected websites; and substantial elements, which are requirements put into effect on the basis of the Data Protection Directive.

The findings of the research show that the inclusion of a privacy policy on the website, whether or not the latter is focused on children and young people, is the rule. However, the differences among the




results push for a more detailed evaluation; in particular, it is not common practice for websites to clarify the main elements of the privacy policy in a more open way. In terms of readability, only part of the policies analyzed tries to attract the reader's attention through a more user-friendly visual presentation of the text, such as a question-and-answer style or paragraphed text. A low level of 'legalese' terminology is generally applied on all CW and SNS; however, European websites usually tend to reference data protection legislation and its provisions, which does not encourage comprehension or readership. In the majority of cases, the privacy policy is a stand-alone text, complete and comprehensive, including in the cases where minors' protection clauses form part of the more general privacy policy. However, in one case, the cross-reference between the general and the children-focused privacy policy limited the understanding of the text, given also the extraordinary length of the general privacy policy. Prominence of the privacy policy tab is usually interpreted by enterprises as the mere inclusion of the tab among the corporate ones at the end of the page, without any further emphasis on the document. On the one hand, this positioning on the webpage is a common practice that users are well aware of and generally used to; on the other hand, it implicitly communicates to users and parents that the privacy policy is not a fundamental part of the site. Indeed, in all these cases the privacy policy tab is not differentiated from the other features of the page. Contact points are always provided for questions and information on redress mechanisms. Usually contact points are postal addresses, but in a few cases more options were provided, such as email addresses and toll-free telephone numbers. On the one hand, this can make it harder for websites to handle users' requests quickly; on the other hand, it can increase the possibility of accommodating different users' preferences. Opt-in is the main consent system, in particular for CW, given the obligations arising from the COPPA regulation for users under 13 on US-based websites. In case of denial of data processing, however, only a few websites permit access to games, videos, etc. without registration. In the case of SNS, refusal to provide consent leads to denial of service, given that the website is obviously based on the free flow of users' data. However, lack of registration does not mean that websites do not collect information of the kind that does not personally identify users. In terms of clarity and conspicuousness, only in a few cases can the privacy policy be considered highly understandable; in particular, our evaluation was based on the purpose-specification items. Except for SNS, the set of data collected and the corresponding purposes are not always clearly defined or described; in two cases the types of data collected are included in an open list, with only a few of them being explicitly justified through examples. Moreover, the same websites that acknowledge a data-minimization approach in their privacy policy do not put it into practice when stating the types of data collected and their purposes.

What is evident from the points listed above is that privacy policies can be widely differentiated documents while still complying with accepted guidelines and legal requirements, at least at a basic level. Such an evaluation, however, is based on the perceptions and level of understanding of an adult who is educated, used to legal terminology, and reading adult-targeted privacy policies. As a matter of fact, none of the analyzed privacy policies were directed at children, neither in their style and structure nor in their explicit addressee. Only in the case of SNS, where youngsters’ participation is more than an exception, are the terminology and design clearer, more concise and, at the same time, more understandable for a younger and less educated reader. Improvements in this direction could take into account two possible suggestions: child-targeted design and multi-layered privacy policies. The multi-layered privacy policy can be




useful also to improve minors' understanding of privacy policies, as the three layers could be framed not only in terms of the amount of information included, but also in terms of different levels of user understanding. Statistics show that children and young people are, in the majority of cases, alone in front of the computer and decide autonomously on the provision of their data. Given this framework, providing minors with a more easily readable, and consequently more understandable, privacy policy could improve their capability to engage in critical analysis and reach a truly 'informed consent' to provide their data. Probably the outcome would still be a flow of their data, but in this case their decisions about privacy would be directly linked to disclosure. In this direction, helping children to read and understand the privacy policies of websites could be a way to improve minors’ awareness of data protection rights and, at the same time, to provide them with the tools to appreciate when and how to offer their personal information for access to games and social networking sites.








Stream 8 - Consent in data protection regulation (2)


Two proposals that go beyond the data subject's consent, respecting online privacy rights

Luca Bolognini, Italian Institute for Privacy, lucabolognini@istitutoitalianoprivacy.it

Consent is an obsolete juridical concept. In the field of privacy and data protection the concept is no longer useful or effective. This is increasingly true in the digital environment. The mere click is too simple, too easy to perform, and, importantly, users often do not understand the complexity and extent of their click, the depth of the consequences of data collection. Fortunately, however, in the new General Data Protection Regulation proposed by the European Commission on 25 January 2012, consent is considered only one of several different ways to legitimately process data. Principles and solutions regarding the lawfulness of data processing are included in the Regulation. In Article 6(1)(f) we find that the processing of personal data is lawful where “processing is necessary for the purposes of the legitimate interests pursued by a controller, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child. This shall not apply to processing carried out by public authorities in the performance of their tasks.” Thanks to the interpretation of legitimate interest, one can conclude that data processing can be lawful even without the consent of the data subject. But what is a true legitimate interest? How can we define legitimate interest, and who will define it? Over the years the concept of legitimate interest, already present in EU privacy legislation before the Regulation, has generally been established on the basis of precedent. Its definition is, however, not in any way certain or fixed. In the absence of a strict definition of legitimate interest, decisions can therefore vary from judge to judge and from DPA to DPA. The definition is extremely confusing and vague, and the current challenge is to make it clearer and more precise.

BEYOND THE DATA SUBJECT’S CONSENT, FIRST SOLUTION: TO LINK LEGITIMATE INTEREST TO CERTIFICATION AND CODES OF CONDUCT. One solution to the problem of consent is to specify the definition of legitimate interest in Article 6(1)(f). This solution would link the norm found in Article 6(1)(f) to other specific norms contained in the Regulation. Through the linked interpretation of the two norms, we could obtain a better and more specific definition of legitimate interest. Articles 38 and 39, the articles that regard codes of conduct and certification, written as two separate juridical instruments, are rather useless on their own. The adoption of a code of conduct and third-party certification currently have little actual impact and prove to be rather ornamental. However, in the text of the proposed Regulation, Article 6(1)(f) and Articles 38 and 39 could be linked together in order to provide a simple and effective solution to the current dilemma of legitimate interest. The harmonization of the two concepts would allow legitimate interest to be presumed where one is certified for specific data processing purposes.

BEYOND THE DATA SUBJECT’S CONSENT, SECOND SOLUTION: EXCHANGE COMMERCE. The EU risks missing the boat of digital advertising, which concerns not only websites, but also the publishing and digital television industries,




telecommunications and, consequently, all sectors digitalizing the objects used in the relationship between users and consumers. Recent changes to cookie laws all over the EU, in the transposition of Directive 2009/136/EC, can undermine the ICT market as a whole, precisely at the moment when it would be most important to support it as a way out of the crisis. Under the “opt-in” model, each user would have to say “yes” to cookies by means of an active action. The conversion rate of users accepting cookies would then fall dramatically, and online behavioral advertising would consequently fail. Even an economy based on advertising can benefit from a new interpretation of the rules regarding personal data protection or content in private communication tools: the old Europe risks lagging behind, powerless, before the advent of new business models in which services are provided without monetary payment (yet not for free, despite appearances) in exchange for the possibility of processing the user’s personal data for the purpose of customized marketing. A rigid “opt-in” regime, such as the one adopted by the EU towards European and international Internet players, can be fair in abstract conceptualization, but in practice reveals all of its limits. It therefore makes sense to find new legal solutions that are suitable for the web environment and its dynamics, rather than vainly trying to impose strict obligations and rules on the world of the bit. One solution, which many believe can already be deduced by interpreting the rules of consent for the processing of common personal data, could be to consider the mandatory exchange between services and profiling for the purpose of online marketing as both possible and lawful (we could call it “E(xchange)-Commerce”, in short, “EX-Commerce”). We are talking about those contracts in which the processing of data, for purposes other than those strictly necessary for the execution of the performance provided by the professional, constitutes the object of an obligation for the user who makes use of services without paying with money. The use of common (not sensitive) user data as a medium of exchange for the delivery of services does not seem to be wrong if it is completely transparent and clear that the e-commerce service is not free but instead requires compensation “in obligations of being profiled and therefore receiving customized advertising.” Such an approach would not at all change the rule of consent for profiling and online behavioral advertising through cookies, presently at the stage of delayed transposition in Italy and other EU countries, as it currently refers, and would continue to refer, to cases in which the contract does not explicitly provide for such an obligation and in which the service indeed continues to be presented as “free” to users.


Paper on legitimate interests of the data controller (LIDC): is LIDC a viable alternative to the data subject's consent?

Paolo Balboni, Rosario Imperiali, Daniel Cooper, Milda Macenaite, European Privacy Association, pbalboni@europeanprivacy.eu

This paper considers whether, under both the current data protection framework (Directive 95/46/EC) and that foreseen under the proposed EU Regulation (the “Regulation”), solutions may be envisaged that foster data subjects’ rights and safeguards and, at the same time, smooth the progress of the use and flow of information, as well as granting economic opportunity and growth in a way that is both compatible with privacy interests and in line with societal expectations. The paper first provides a broad overview of the context of the current legislation and the legal bases available for the processing of personal data. It then describes the limitations of the consent legal basis, and




proposes a theory for a balancing test (sufficiency test) that provides for sufficiency of protection. There follows an examination of the way the Legitimate Interests of the Data Controller (LIDC) is interpreted, implemented, and invoked across Europe. It is finally argued that the draft Regulation proposed by the European Commission contains a set of requirements and obligations that can be described as a “Data Protection Compliance Program” (DPCP), which itself provides for an appropriate balance between data protection and the free flow of information/data. After a brief introduction on the topic and the aim of the paper, Section 2 seeks to explain the core principle of “appropriate balance” between the protection of individuals with regard to the processing of their personal data and the free movement of such data. In Section 3 the rationale of the actual requirements for legitimate processing of personal data is checked against the “appropriate balance” principle. The analysis then focuses specifically on the LIDC as one of the criteria for lawful processing of personal data, digging into the LIDC's history, its local implementation by Member States and industry use of it, to better understand and define its boundaries (Section 4). Data subjects’ consent, as another main instrument to legitimize data processing, is touched upon in Section 5; and it is concluded, on the basis of what has previously been argued, that both the LIDC as implemented today and the data subject’s consent are equally unsatisfactory in ensuring an appropriate balance between the free flow of information/data and effective protection of the data subject. In Section 6 the shift of the “axis of legitimacy” for personal data processing introduced by the Regulation is presented. In Section 7, the DPCP is specifically dealt with. In Section 8 it is explained that the DPCP, together with transparent, easily accessible and intelligible information to the data subject, can be seen as a new legitimacy requirement “per se”. In fact, it ensures actual privacy protection, it improves information/data flows and, at the same time, grants the flexible implementation that an accountable organization needs. Conclusions and recommendations are provided in Section 9. More precisely, the paper is based on the following arguments:

(i) Data protection should provide for an appropriate balance between data subjects’ rights and safeguards, and the free flow of information.

(ii) The criteria for lawful processing should be subject to the sufficiency/balancing test.

(iii) The sufficiency test evaluates/establishes the ultimate level of protection that can effectively be guaranteed with respect to the processing.

(iv) Data subjects’ consent and the other legitimacy requirements are not necessarily evidence of sufficient protection, and instruments other than consent appear to offer a more flexible implementation of the user control paradigm.

(v) A data protection compliance program, such as the one set forth under the proposed Regulation, which must be complied with by all controllers under any circumstances, is evidence of effective and sufficient personal data protection and properly responds to the needs of user control.

(vi) The utility of traditional legal bases such as consent, at least for data processing with a lesser impact, seems to have faded away.

(vii) Once a comprehensive privacy program has been fully implemented, the LIDC might be the principal instrument to balance data subjects’ rights and safeguards, on the one hand, and the free use of information, on the other.

(viii) Finally, companies should be allowed to process personal data which do not present a significant risk to data protection upon the sole condition of having rightly implemented the data protection compliance program, of which they bear the burden of proof, because the legitimacy of their processing stands on the compliance program “per se”.






Behavioural Targeting. How to Regulate?

Frederik Zuiderveen Borgesius, Institute for Information Law, University of Amsterdam, f.j.zuiderveenborgesius@uva.nl


The paper concerns the following question: in the context of behavioural targeting, how could regulation be improved to protect privacy without unduly restricting the freedom of choice of Internet users? Many marketing companies monitor the online behaviour of Internet users in order to build profiles of these users and target them with customized advertising. This practice, known as behavioural targeting, can have benefits for marketers and consumers, but it also raises privacy concerns. Using cookies or other techniques, companies compile detailed profiles based on what Internet users read, what videos they watch, what they search for, etc. Behavioural targeting forms the core of many privacy-related questions on the Internet. It is an early example of ambient intelligence: technology that senses and anticipates people’s behaviour in order to adapt the environment to their needs. This makes behavioural targeting a good case study for examining some of the difficulties that privacy law faces in the twenty-first century. The paper explores two approaches to privacy protection. The first focuses on empowering the individual, for example by requiring companies to obtain the informed consent of the individual before data processing takes place. In Europe, for instance, personal data “must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law” (Article 8 of the Charter of Fundamental Rights of the European Union). The phrase “on the basis of the consent of the person” seems to be a loophole in the regime, as many Internet users click “I agree” to any statement that is presented to them. Insights from behavioural economics cast doubt on the effectiveness of the empowerment approach as a privacy protection measure. The second approach focuses on protecting rather than empowering the individual. If aiming to empower people is not the right tactic to protect privacy, maybe specific prohibitions could be introduced. Some might say that the tracking of Internet users is not proportional to the purposes of marketers and should therefore be prohibited altogether. But less extreme measures can be envisaged. Perhaps different rules could apply to different circumstances. Are data gathered while Internet users are looking to buy shoes less sensitive than data that reveal which books they consider buying or which online newspapers they read? Do truly innocent data exist? One of the most difficult issues would be how to balance such prohibitions with personal autonomy, since prohibitions appear to limit people’s freedom of choice. The paper draws inspiration from consumer law, where similar problems arise. When should the law protect rather than empower the individual? A careful balance would have to be struck between protecting people and respecting their freedom of choice. The paper concludes with recommendations to improve privacy protection without unduly restricting people’s freedom of choice.







Stream 9 - Alternatives to current approaches


Information and Communication Flow and Deposit Control

Filip Petrinec, Thomas Lenhard, Michal Gregus, Faculty of Management, Comenius University in Bratislava, michal.gregus@fm.uniba.sk

This article is dedicated to the topic of the right to control the flow of data, including the deposit of communication. The authors focus on the exercise of this right under the Slovak Electronic Communications Act. The main point of the article is to trace a probable constitutional-law violation entailed by this kind of data retention in relation to the right to personal privacy. Since this national act was the result of the implementation of EU Directive 2006/24/EC on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks, this topic is gaining a kind of global European dimension.


TRUST-EX - an object-oriented approach to on-line privacy

Augustin Mrazik, Jan Domankus, Lukas A. Mrazik, eGov Systems s.r.o., Bratislava, Slovakia, augustin.mrazik@eGovSystems.sk


TRUST-EX is a concept for enabling on-line privacy and trust on the Internet, based on a few basic principles known from OOP (object-oriented programming). The functionality is highly decentralized and based on the roles and responsibility of the users, who build a “social network of mutual trust”. Instead of technological solutions based on PKI, it is more understandable to users and much easier to use. The major aim is to enable well-known and respected principles of the Civil Code and legal certainty (from the real world) to be guaranteed also in cyberspace. This paper discusses only the base of TRUST-EX: the identity of natural and legal persons, their mutual representation and the approach to guaranteeing them in cyberspace. TRUST-EX as a whole also covers acts of persons and legal relationships between persons (e.g. contracts).

Encapsulation and responsibility of the owner. The major principle used is “encapsulation”: every user is represented in cyberspace by a sole object (proxy), his/her electronic identity (e-ID), instead of dozens of registrations on diverse portals. Within the e-ID the user maintains all his/her personal data (name, address, photo, mobile phone number, SSN, scan of diploma etc.) and authorizes third parties to access these data. Data is disclosed to authorized parties “just-in-time” by “late binding” upon request, using the identifier of the user and that of the particular data item. Hence, there are no redundant and obsolete copies of private data stored in different databases, and all use of the private data by third parties is controlled by the owner. Every owner has all his/her data under his/her control and is responsible for all its use; there is no more “misused” data or data leaked from some database.

Natural and legal persons, representation, minors. Any person may mandate a representative (some other person) to act in his/her name and on his/her responsibility. Legal persons act through their representatives. The acts of minors can be performed (or supervised) by their parents.

Identification of persons and data. For identifying persons in TRUST-EX (e.g. from a portal requesting the personal




data) alphanumerical identifiers are used. Identifiers are randomly generated by the systems without
any relation to data of the owner (i.e. the

identifiers are anonymous). However, the user has the
opportunity to get an “identifier on request”


e.g. based on his/her phone number or containing
his/her name. This may be used e.g. for marketing purposes by companies. Identifiers may change in
time,

i.e. the owner may revoke some of his/her old identifiers and get new ones


just like with
phone numbers. However, at a certain time the particular identifier can belong solely to one person.
In a similar way, each data item of a person is identified. A particular data item may be accessed (disclosed) by “person-ID.data-ID”, or simply by “person-ID” by default (e.g. when accessing a commonly used attribute such as the address).

Protection of the identity and data
Login to the identity server and maintenance of the personal data can be protected by two-phase authentication. Users may use a GRID card, a smartphone application, an OTP (one-time password sent by SMS) or some other means (smart cards, biometric methods, third-party authentication) for these purposes. No data can be modified by any other means (e.g. from another server or application). On the other hand, this trusted identity is a secure way of logging in to other places.

Trusted identity and trusted data
In order to offer trusted information, any data of a user may be verified by other parties (e.g. by presenting original papers face-to-face); it can then no longer be modified by the owner without losing its verification. Verifying the basic data (name, address, date of birth etc.) actually means verifying the identity of the owner. Trusted parties (e.g. a municipal office, a notary, a bank) denote the level of trust of such verified data. In this way a “network of trust” is created, similar to trust in the real world. People trust certain other people (e.g. notaries and lawyers) and organizations (e.g. the municipal office or the post office); hence, they will also trust the data that has been verified by them. The verifiers’ real identity may in turn be checked by checking the verification of their own identification data, and so on. Any data item may be verified by several trusted persons: the more of them, the more trusted the particular information.

One password, one identity and no leaking data
The user uses only one
set of credentials, and optionally additional authentication means for higher security (GRID, SMS...), when logging in to any of the associated portals via the e-ID identity server. The e-ID identity server offers all of its functionality through an SSO (single sign-on) approach to all associated portals, enabling them to use secure authentication and authorization of operations, to access private data and to use all associated functionality (e.g. micropayments for services). Portals do not need to store any personal data of their users (not even their login names), yet they may still store their own specific data associated with the particular user.

Security: smartphone application, GRID cards and more
The
whole approach has been designed to be as practical and easy to use as possible. Therefore GRID cards, OTP (SMS) and a smartphone application are used as the main security means. However, the system is open to any other, even highly sophisticated, authentication system: smart cards, biometric means, or the operational authentication systems of third parties (e.g. banks, Bloomberg etc.). Using the GRID card or the smartphone application, users are able not only to authenticate securely online, but also to identify themselves when physically accessing a controlled space (e.g. an office or a shop), or to pay from their credit (virtual wallet) in a fast and simple way using a widely available bar-code reader. Companies using Trust-Ex on their portals may issue their loyalty cards as Trust-Ex GRID cards, enabling their users to hold only one card for many registrations and purposes, while the card still bears the logo of the issuing company. Still, for maximum security and trust, users may revoke their old GRID card and get a new one, downloaded from the server or sent as a plastic card by s-mail, anytime they assume the old one has been compromised.

“Right to be forgotten” and death
The
user has the right to be forgotten in the system. This means that he/she will no longer be reachable via his/her former identifier. There are two possible situations: the user revokes his/her old identifier(s) and gets new one(s), or he/she revokes all of his/her identifiers. Nevertheless, the user retains his/her identity, and later, at his/her wish, he/she may obtain a new identifier. The system also covers the death of a natural person (as well as the cancellation of a legal person): this has to be reported and verified by a trusted person, and will be recorded in the system. Such an extinction of a person is an event with which some actions may be associated (e.g. publishing some documents).
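Taken together, the “person-ID.data-ID” late-binding access, the revocation of identifiers, and the recorded death event form one lifecycle. The following is a minimal toy model of that lifecycle under assumed names (it is not the Trust-Ex implementation): data lives only in one authoritative store, portals hold bare identifiers, and values are fetched by late binding at request time.

```python
class PersonalDataStore:
    """Toy model of the lifecycle described in this abstract."""

    def __init__(self):
        self.persons = {}      # person_id -> {data_id: value}
        self.identifiers = {}  # public identifier -> person_id
        self.events = []       # recorded events, e.g. verified death

    def register(self, person_id, identifier, data):
        self.persons[person_id] = dict(data)
        self.identifiers[identifier] = person_id

    def late_bind(self, identifier, data_id):
        # "person-ID.data-ID" access: nothing is copied to the portal;
        # the value is resolved from the single authoritative store.
        person_id = self.identifiers.get(identifier)
        if person_id is None:
            return None  # revoked or unknown: the user is "forgotten"
        return self.persons[person_id].get(data_id)

    def forget(self, identifier):
        # Right to be forgotten: the former identifier stops resolving,
        # while the person's identity and data remain intact.
        self.identifiers.pop(identifier, None)

    def record_death(self, person_id, verified_by):
        # Death must be reported and verified by a trusted person; other
        # actions (e.g. publishing documents) could hang off this event.
        self.events.append(("death", person_id, verified_by))


store = PersonalDataStore()
store.register("person-42", "id-abc", {"address": "Main St 1"})
assert store.late_bind("id-abc", "address") == "Main St 1"
store.forget("id-abc")
assert store.late_bind("id-abc", "address") is None  # no longer reachable
```

Note that `forget` removes only the mapping from the public identifier, not the person record itself, which mirrors the abstract's point that the user retains his/her identity and may later obtain a new identifier.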