
Privacy policy referencing

TL;DR: This article describes a new approach called Privacy Policy Referencing, and outlines the technical and the complementary legal framework that needs to be established to support it.
Abstract: Data protection legislation was originally defined for a context where personal information is mostly stored on centralized servers with limited connectivity and openness to 3rd party access. Currently, servers are connected to the Internet, where a large amount of personal information is continuously being exchanged as part of application transactions. This is very different from the original context of data protection regulation. Even though there are rather strict data protection laws in an increasing number of countries, it is in practice rather challenging to ensure an adequate protection for personal data that is communicated on-line. The enforcement of privacy legislation and policies therefore might require a technological basis, which is integrated with adequate amendments to the legal framework. This article describes a new approach called Privacy Policy Referencing, and outlines the technical and the complementary legal framework that needs to be established to support it.

Summary

1 Introduction

  • Data protection law regulates the processing of information related to individual persons, including its collection, storage, dissemination, etc.
  • A brief historical overview of privacy regulation and PET is given in [15]: starting in the 1970s, regulatory regimes were imposed on computers and networks.
  • Starting with government data processing, and following the computerization of communication and workflows, explicit rules like the European Data Protection Directive [7] have been put in place.
  • Most of these criteria, including schemes like the Datenschutz-Gütesiegel [16], provide checklists with questions for the auditors.
  • Privacy policies are sometimes used by organizations that collect and process personal information.

2.1 Business decision-making and privacy technology

  • For any deployment of PET into information systems, the effectiveness of the PET measure against threats is important [15].
  • In the computer science field, several contributions provide information-theoretic models for anonymity, identifiability or the linkability of data, e.g. in [27] or in [10] (see the sketch after this list).
  • Both papers build mathematical models that are rather impractical for use in the evaluation of large-scale information systems.
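To illustrate what such an information-theoretic model looks like, the following is a minimal sketch in the spirit of the entropy-based anonymity metrics cited above: the attacker assigns a probability to each member of the anonymity set, and the entropy of that distribution measures the anonymity achieved. The function names and example probabilities are our own illustration, not taken from the paper.

```python
import math

def anonymity_entropy(probabilities):
    """Shannon entropy (in bits) of the attacker's probability
    distribution over the anonymity set. Higher entropy means
    better anonymity."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def degree_of_anonymity(probabilities):
    """Entropy normalized by the maximum achievable entropy
    (a uniform distribution), yielding a value in [0, 1]."""
    n = len(probabilities)
    if n <= 1:
        return 0.0
    return anonymity_entropy(probabilities) / math.log2(n)

# Hypothetical example: an attacker narrows 4 candidate senders down
# to a skewed distribution; the degree of anonymity drops accordingly.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed  = [0.70, 0.20, 0.05, 0.05]
print(degree_of_anonymity(uniform))  # 1.0  (perfect anonymity)
print(degree_of_anonymity(skewed))   # ~0.63 (partially deanonymized)
```

Even this toy version hints at why such metrics are hard to apply to large-scale systems: the attacker's probability distribution is rarely known in practice.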

2.2 Inadequacy of Technical Privacy Strategies

  • Public surveys indicate that privacy is a major concern for people using the Internet [6].
  • One attempt to address privacy concerns and thereby increase user trust in the Web is the W3C’s Platform for Privacy Preferences (P3P) Project [8].
  • Detractors say that P3P does not go far enough to protect privacy.
  • Originally the iPrivacy software generated a one-off credit card number for each transaction.
  • The user accesses the Web via a Lumeria proxy server, which shields their identity from merchants and marketing companies whilst enabling marketing material that matches their profile to be sent to them.

2.3 Inadequacy of Specifying Privacy Policies

  • Many data controllers specify privacy policies that can be accessed from the interface where personal information is being collected or where consent to do so is given.
  • Users are normally required to accept the policies by ticking a box, which all but a very few do in a semi-automatic fashion.
  • This will be explained in further detail below.
  • A privacy policy may fulfill several different functions [4] (p. 239).
  • Nevertheless, for most people it is challenging to assess whether they should consent to the processing of their personal data under a given privacy policy, particularly if the policy is ambiguous and permits a wide range of forms of processing, possibly exceeding what would be permitted under the applicable data protection law.

3 An Infrastructure for Privacy Policy Referencing

  • The fundamental principle of Privacy Policy Referencing is that all personal information must be tagged or associated with metadata that relates it to the applicable privacy policy, and possibly to the point and time of collection.
  • This would enable users or authorities to audit systems and applications where personal information is being processed, and to determine whether they adhere to applicable privacy policies.
  • By making it mandatory to always have policy metadata associated with personal information, it becomes a universal principle for referencing privacy policies.
  • A related approach, however, assumes that the underlying hardware platform, and the software running on it, are so-called trustworthy systems based on the Trusted Computing specification.

3.1 The Technical Framework

  • Privacy policy metadata will require the definition of a common metadata language in XML style (see the sketch after this list).
  • This means that each privacy policy must be uniquely identifiable, so organizations must keep records of the identifiable privacy policies that have been used.
  • The metadata does not need to contain any additional personal information, as such information would be irrelevant for potential audits of policy adherence.
  • The organizations must then find a solution for associating the personal information with metadata stored elsewhere.
  • It can be noted that the proposed scheme has similarities with the scheme for electronic signature policies described in [25], where a specific signature policy has a globally unique reference which is bound to the signature by the signer as part of the signature calculation.
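As an illustration of what such policy-reference metadata could look like, here is a minimal sketch. The element names (policy identifier, policy hash, collection point and time) are our own assumptions for illustration; the paper does not define a concrete schema.

```python
import hashlib
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def make_policy_reference(policy_id: str, policy_text: str,
                          collection_point: str) -> ET.Element:
    """Build a hypothetical policy-reference metadata element that an
    application would attach to a personal-information record."""
    ref = ET.Element("PolicyReference")
    ET.SubElement(ref, "PolicyId").text = policy_id
    # The hash binds the reference to one exact policy version.
    ET.SubElement(ref, "PolicyHash", algorithm="SHA-256").text = \
        hashlib.sha256(policy_text.encode("utf-8")).hexdigest()
    ET.SubElement(ref, "CollectionPoint").text = collection_point
    ET.SubElement(ref, "CollectionTime").text = \
        datetime.now(timezone.utc).isoformat()
    return ref

# Hypothetical usage: tag a record collected at a web shop's sign-up form.
ref = make_policy_reference(
    policy_id="urn:example:prp:standard-retail-v1",   # invented identifier
    policy_text="...full text of the applicable policy...",
    collection_point="https://shop.example.com/signup",
)
print(ET.tostring(ref, encoding="unicode"))
```

Because the reference, not the policy text, travels with the data, an auditor can later resolve the identifier against the organization's policy records and check adherence.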

3.2 The Policy Framework

  • It is very difficult for users to understand privacy policies when each organization specifies a different policy and when typical policies are 10 pages or more.
  • The combination of rules into specific profiles can be denoted as the PRP (Privacy Rules Profile) framework.
  • In international trade law, the Incoterms [17] offer a widely used catalogue of specific contract terms that can be quoted when buying or selling goods.
  • A number of IPR licensing issues regarding open source software can be easily regulated by referring to specific predefined licenses.
  • By having a limited set of standardized policies, it would be possible for users to become educated and familiar with what the respective policies actually mean, and the level of protection they provide; a small sketch of such a profile registry follows this list.
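To make the analogy with Incoterms and predefined open-source licenses concrete, the following is a minimal sketch of a profile registry. The profile identifiers and rule attributes are invented for illustration; the paper does not standardize any concrete PRP profiles.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyRulesProfile:
    """A standardized, reusable privacy policy profile that services
    can reference by identifier, analogous to quoting an Incoterm."""
    profile_id: str
    retention_days: int          # maximum retention period
    third_party_sharing: bool    # may data be shared with third parties?
    marketing_use: bool          # may data be used for marketing?

# Hypothetical catalogue of predefined profiles.
PRP_REGISTRY = {
    "PRP-STRICT": PrivacyRulesProfile("PRP-STRICT", 30, False, False),
    "PRP-RETAIL": PrivacyRulesProfile("PRP-RETAIL", 365, False, True),
    "PRP-OPEN":   PrivacyRulesProfile("PRP-OPEN", 3650, True, True),
}

def lookup(profile_id: str) -> PrivacyRulesProfile:
    """Resolve a profile reference found in policy metadata."""
    return PRP_REGISTRY[profile_id]

# A user who has once learned what "PRP-STRICT" means can recognize
# it at any service that references it.
print(lookup("PRP-STRICT"))
```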

3.3 The Management Framework

  • Organizations would need to manage their privacy policies according to strict criteria, and define a way of guaranteeing their integrity and authenticity.
  • This can, e.g., be achieved by letting independent third parties sign hashes of each particular policy or policy profile, which would allow changes in policies or profiles to be noticed, or by depositing the privacy policies with independent third parties such as national information commissioners and data protection inspectorates.
  • Privacy policy repositories that are suitable for long-term archival of verified policies might be necessary with respect to long-term legal validity.
  • Organizations will also need to define processes for creating metadata, and to adapt applications where personal information is being processed so that the metadata can be appropriately handled during storage, transfer and processing. A sketch of hash-based policy integrity checking follows this list.
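As a minimal sketch of the integrity mechanism, the following computes and verifies a policy digest; in a full deployment the digest would additionally be signed by the independent third party, which we only indicate in comments. All names are our own.

```python
import hashlib

def policy_digest(policy_text: str) -> str:
    """SHA-256 digest of the exact policy text. An independent third
    party (e.g. a data protection authority) would sign this digest,
    so that any later change to the policy becomes detectable."""
    return hashlib.sha256(policy_text.encode("utf-8")).hexdigest()

def verify_policy(policy_text: str, registered_digest: str) -> bool:
    """Check a retrieved policy against the digest deposited with the
    third party at registration time."""
    return policy_digest(policy_text) == registered_digest

# Hypothetical flow: a policy is registered, later retrieved and verified.
registered = policy_digest("We retain data for 30 days ...")
assert verify_policy("We retain data for 30 days ...", registered)
assert not verify_policy("We retain data for 10 years ...", registered)
```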

4 Conclusion

  • The current approach to ensuring personal information privacy on the Internet is ineffective in providing privacy protection in the age of distributed, networked services.
  • The approach described in this paper changes the way privacy policies can be specified by service providers, and the way compliance can be verified by auditors or users.
  • By providing certified template policies, users gain oversight of policies that have been verified.
  • At the same time, auditors can verify system states against policy claims.
  • Introducing this framework might also require the introduction of incentives, for example by making it mandatory to include privacy policy metadata with personal information.


Privacy Policy Referencing

Audun Jøsang¹, Lothar Fritsch² and Tobias Mahler³,²

¹ UNIK University Graduate Center - University of Oslo, josang@unik.no
² Norwegian Computing Center, Lothar.Fritsch@NR.no
³ Norwegian Research Center for Computers and Law - University of Oslo, tobias.mahler@jus.uio.no

In the proceedings of the 7th International Conference on Trust, Privacy & Security in Digital Business (TRUSTBUS'10), Bilbao, August-September 2010.
Abstract. Data protection legislation was originally defined for a context where personal information is mostly stored on centralized servers with limited connectivity or openness to 3rd party access. Currently, servers are connected to the Internet, where large amounts of personal information are continuously being exchanged as part of application transactions. This is very different from the original context of data protection regulation. Even though there are rather strict data protection laws in an increasing number of countries, it is in practice rather challenging to ensure an adequate protection for personal data that is communicated on-line. The enforcement of privacy legislation and policies therefore might require a technological basis, which is integrated with adequate amendments to the legal framework. This article describes a new approach called Privacy Policy Referencing, and outlines the technical and the complementary legal framework that needs to be established to support it.
1 Introduction

Data protection law regulates the processing of information related to individual persons, including its collection, storage, dissemination etc.

Privacy concerns exist wherever personally identifiable information is collected and stored, in digital form or otherwise. Some forms of processing personal information can be against the interests of the person the data is associated with (called the data subject). Data privacy issues can arise with respect to information from a wide range of sources, such as: healthcare records, criminal justice investigations and proceedings, financial institutions and their transactions, private sector customer databases, social communities, mobile phone services with context awareness, residence and geographic records, and ethnicity information. Amongst the challenges in data privacy is to share selected personal data and permit the processing thereof, while inhibiting unwanted or unlawful use, including further dissemination. The IT and information security disciplines have made various attempts at designing and applying software, hardware, procedures, policies and human resources in order to address this issue. National and regional privacy protection laws are to a large extent based on the OECD data privacy principles defined in 1980 [21], e.g. the EU Data Protection Directive [13]. The legal framework for data protection has been adapted to take into account some of the changes in technology, but the constant technological change has been difficult to keep up with. In the 70s and 80s personal information was stored on mainframe computers, on punch cards or on tape rolls with limited connectivity. The Internet only existed in the form of the experimental ARPANET, and no commercial applications had been conceived. It is natural that the principles defined by the OECD in 1980 reflected the computing infrastructure at that time, and the principles can be judged as relatively adequate from that perspective. Since then, the legal framework has struggled to keep up with changes in technology.
On the technological side, there is a long track record of information security research. Its focus is the development of privacy-enhancing technology (PET) in support of the - mostly legally derived - requirements for personal information handling. A brief historical overview of privacy regulation and PET is given in [15]:

Starting in the 1970s, regulatory regimes were imposed on computers and networks. Starting with government data processing, along the lines of computerization of communication and workflows, explicit rules like the European Data Protection Directive [7] have been put in place. With the adoption of the Internet and mobile telephony in society in the past decade, the privacy challenges of information technology came to everyday life. The PET research perspective focused to a certain degree on the legal foundations of privacy protection, determined by constitutional and fundamental human rights that should be protected using technology. This view is shown in an analysis of the PET vocabulary in [18]. As rights are granted to individuals, much of the research has focused on the user side, e.g. visible in Pfitzmann/Hansen's well-quoted terminology paper [23]. The legal view is propagated into contemporary frameworks like the Canadian [22] and Dutch [28] privacy legislation, which both define privacy audit schemes with detailed procedural definitions and responsibilities, but neglect to provide a decision support method for managers that would enable them to make feasible decisions about privacy needs based on quantifiable risks. Most of these criteria, including schemes like the Datenschutz-Gütesiegel [16], provide checklists with questions for the auditors. They inherently call for competent and well-paid external experts when they are used by a company, but are rarely based on empirical data or metrics. The PET-award-winning taxonomy of privacy [26] is very visibly structured along the legal view on privacy.
Many assumptions underlying traditional PETs (Privacy Enhancing Technologies) are no longer valid. Users have little control over information they provide to service providers, which e.g. exposes them to various profiling risks [14]. Peter Hustinx, the European Data Protection Supervisor, said in his keynote talk "Privacy in the Internet Age" at NordSec 2009 (NordSec2009.unik.no) that the EU and OECD have recognized the erosion of the adequacy of the classic privacy principles after the emergence of the Internet. In 2009, these organizations therefore initiated a process for defining new and more adequate privacy principles for networked environments. Similarly, in a keynote speech at the Data Protection Day on 28 January 2010 at the European Parliament, Brussels, Viviane Reding (Member of the European Commission responsible for Information Society and Media) expressed the intention to present a legislative proposal for reforming the European Privacy Directive before the end of the year (2010), and launched the concept of "privacy by design" [24], which specifies that privacy requirements must always be included in the design of new Internet technologies. In her speech she said that the new legal framework should address new challenges of the information age, such as globalisation, development of information technologies, the Internet, online social networking, e-commerce, cloud computing, video surveillance, behavioural advertising, data security breaches, etc.

Privacy policies are sometimes used by organizations that collect and process personal information. However, users often pay little or no attention to these privacy policies, and once the personal information has been collected, it is practically impossible to verify that the specified privacy policies are being adhered to. There is also scientific evidence that user-side reading of privacy policies is in conflict with basic market economic principles [30].
It can also be mentioned that the protection of personal data is sometimes in conflict with other interests of individuals, organizations or society at large. On several occasions, for example in the 'war on terrorism', the European Union has delivered passenger flight databases, SWIFT financial transactions, and telecommunications data to authorities outside the EU's legislation. In such cases, no consent is necessary if such disclosure is lawful under the applicable law.

From this brief survey it seems timely to rethink how information privacy should be defined and enforced in the online environment. This paper looks at the inadequacy of the current approach to information privacy protection, and proposes a new approach based on attaching policy metadata to personal information. By requiring that the metadata follows personal information, it becomes easy to verify whether the policies are being adhered to. In addition, one should consider standardizing privacy policies in the form of a limited set of easily recognizable rules to improve the usability of privacy protection.
2 The Inadequacy of the Current Approach

2.1 Business decision-making and privacy technology

For any deployment of PET into information systems, the effectiveness of the PET measure against threats is important [15]. While the cost of PET installation and operation could be assessed with experiments, the efficiency of their deployment remains unknown. In the computer science field, several contributions provide information-theoretic models for anonymity, identifiability or the linkability of data, e.g. in [27] or in [10]. Both papers build mathematical models that are rather impractical for use in the evaluation of large-scale information systems. Another suggestion comes from an article on intrusion detection by user context modeling [19], where the author tries to identify attacks by classification of untypical user behavior. Such behavioral analysis can be developed into a tool to measure the effectiveness of PET. From some experiments on profiling people with publicly available data from the Internet [9], one might try to use profiling output as a measure of the quality of PET systems. But the definition of the information that counts as part of a profile, as well as the question of how to distinguish leaked information from intentionally published personal information, make profiling a rather impractical metric. With these difficulties in measuring the effectiveness of PET, how will we judge efficiency? Also, for the deployment of PET on the business side, or the acceptance of some extra effort by users adapting to PETs, there are more questions to ask:

– Which PET will remove or reduce a particular risk?
– At what cost will a particular PET remove a particular risk?
– How much effort (instruction, change of system usage habits, change of behavior, self-control) has to be spent on the user side for the PET to be effective?
– Is there a cheaper or more convenient alternative for dealing with a particular risk instead of PET deployment?
2.2 Inadequacy of Technical Privacy Strategies

Public surveys indicate that privacy is a major concern for people using the Internet [6]. Privacy related complaints that are made to the US Federal Trade Commission include complaints about unsolicited email, identity theft, harassing phone calls, and selling of data to third parties [20]. One attempt to address privacy concerns and thereby increase user trust in the Web is the W3C's Platform for Privacy Preferences (P3P) Project [8]. P3P enables Web sites to express their privacy practices in a standardized, XML-based format that can be automatically interpreted by user agents such as a Web browser. The aim is that discrepancies between a site's practices and the user's preferences can be automatically flagged. Nine aspects of online privacy are covered by P3P, including five that cover data being tracked by the site: who is collecting the data; what information is being collected; for what purposes is it being collected; which information is being shared with others; and who are the data recipients. Four topics explain the site's internal privacy policies: can users make changes in how their data is used; how are disputes resolved; what is the policy for retaining data; and where can the detailed policies be found in a 'human readable' form?
It would be fair to say that P3P has been a failure, because users and industry have not adopted it. One of the reasons might be that P3P is unable to guarantee or enforce the privacy claims made by Web sites. Despite its potential, detractors say that P3P does not go far enough to protect privacy. They believe that the aim of privacy technology should be to enable people to transact anonymously [11]. Private privacy service providers or anonymisers have been proposed [29]. One example is iPrivacy, a New York based company that around 2002 professed on its Web site, "not even iPrivacy will know the true identity of the people who use its service". To utilize the technology, users first had to download software from the Web site of a company they trusted, for example a bank or credit card company. When they wished to purchase a product online, they used the software to generate a one-off fictitious identity (name, address and email address). Users were given the choice of collecting the goods from their local post office (their post or zip code is the only part of the address which is correct) or having the goods delivered by a delivery company or postal service that has been sent a decoded address label. Originally the iPrivacy software generated a one-off credit card number for each transaction. The credit card issuer matched the credit card number it received from the merchant with the user's real credit card number and then authorized payment. However, this proved to be a major job for banks to integrate, and is no longer offered by iPrivacy. There are still other companies, such as Orbiscom.com and Cyota.com (acquired by RSA), that do offer one-off credit card numbers, but these have captured limited use to date. Another type of privacy provider or infomediary is emerging which sells aggregated buyer data to marketers, but keeps individual identifying information private [29]. One example of this is Lumeria, a Berkeley based company that provides royalties to people who participate. In the Lumeria system, users download free software that encrypts their profile and stores it on Lumeria's servers. The user accesses the Web via a Lumeria proxy server, which shields their identity from merchants and marketing companies whilst enabling marketing material that matches their profile to be sent to them. However, none of these initiatives have been a success, and many privacy providers have gone out of business. This is quite understandable, as the anonymity solutions result in significant additional complexity and cost.
2.3 Inadequacy of Specifying Privacy Policies

Many data controllers specify privacy policies that can be accessed from the interface where personal information is being collected or where consent to do so is given. Such policies are sometimes 10 pages or longer, and can be written in a jargon that makes them inaccessible to most people. Users are normally required to accept the policies by ticking a box, which all but a very few do in a semi-automatic fashion. Users quickly learn that reading such policies is very frustrating. In addition, a user who might be opposed to some clauses in the policy faces the organization alone, although many others might be of the same opinion. It is difficult for users to organize themselves and exercise pressure on organizations to change their privacy policies, but both data protection authorities and consumer ombudsmen have succeeded in pressuring some organizations to change their policies. Once personal information has been collected, users have no practical way of verifying whether the policies are being adhered to. In practice, it would also be difficult to trace personal information back to the point where it was collected. Once inside the network or system of an organization, it often becomes very difficult to trace personal information back to the point of origin and the applicable privacy policy. This is precisely where our proposal offers a solution, whereby the applicable privacy policy is always referenced by the metadata associated with any personal information. This will be explained in further detail below.

The privacy policy interpretation and specification troubles are illustrated in a survey article that provides a taxonomy of 'privacy-supporting' and 'privacy-consuming' privacy clauses from real policies [1]. The survey clearly shows that most privacy policies on web pages are carefully drafted to lure consumers into accepting privacy-consuming clauses.

A privacy policy may fulfill several different functions [4] (p. 239). First, it can be used to provide information about how personal data is processed by the data controller, and such information may be mandatory according to the law. Second and somewhat

References

– Solove, D.J.: A Taxonomy of Privacy. University of Pennsylvania Law Review 154(3), 477-564 (2006)
– Reagle, J., Cranor, L.F.: The Platform for Privacy Preferences. Communications of the ACM 42(2), 48-55 (1999)
– Pfitzmann, A., Köhntopp, M.: Anonymity, Unobservability, and Pseudonymity - A Proposal for Terminology. In: Designing Privacy Enhancing Technologies, LNCS 2009, Springer (2001)
– Directive 95/46/EC of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (EU Data Protection Directive) (1995)
– W3C: The Platform for Privacy Preferences 1.0 (P3P1.0) Specification. W3C Recommendation (2002)
Frequently Asked Questions (2)
Q1. What have the authors contributed in "Privacy policy referencing" ?

This article describes a new approach called Privacy Policy Referencing, and outlines the technical and the complementary legal framework that needs to be established to support it. 

Remaining challenges, such as the international synchronization of policy templates; the reliable, auditable and secure implementation of personal data handling with policies; and the creation of the default policies and their supervision and archival, need to be further researched.