TL;DR: This article describes a new approach called Privacy Policy Referencing, and outlines the technical and the complementary legal framework that needs to be established to support it.
Abstract: Data protection legislation was originally defined for a context where personal information was mostly stored on centralized servers with limited connectivity and limited openness to third-party access. Today, servers are connected to the Internet, where a large amount of personal information is continuously exchanged as part of application transactions. This is very different from the original context of data protection regulation. Even though an increasing number of countries have rather strict data protection laws, it is in practice challenging to ensure adequate protection for personal data that is communicated on-line. The enforcement of privacy legislation and policies may therefore require a technological basis, integrated with adequate amendments to the legal framework. This article describes a new approach called Privacy Policy Referencing, and outlines the technical and the complementary legal framework that needs to be established to support it.
Data protection law regulates the processing of information related to individual persons, including its collection, storage, dissemination etc.
A brief historical overview of privacy regulation and PETs is given in [15]: starting in the 1970s, regulatory regimes were imposed on computers and networks.
Beginning with government data processing, and following the computerization of communication and workflows, explicit rules such as the European Data Protection Directive [7] were put in place.
Most of these criteria, including schemes like Datenschutz-Gütesiegel [16], provide checklists with questions for the auditors.
Privacy policies are sometimes used by organizations that collect and process personal information.
2.1 Business decision-making and privacy technology
For any deployment of PET into information systems, the effectiveness of the PET measure against threats is important [15].
In the computer science field, several contributions provide information-theoretic models for anonymity, identifiability or the linkability of data, e.g. in [27] or in [10].
Both papers build mathematical models that are rather impractical for usage in the evaluation of large-scale information systems.
2.2 Inadequacy of Technical Privacy Strategies
Public surveys indicate that privacy is a major concern for people using the Internet [6].
One attempt to address privacy concerns and thereby increase user trust in the Web is the W3C’s Platform for Privacy Preferences (P3P) Project [8].
Detractors say that P3P does not go far enough to protect privacy.
Originally the iPrivacy software generated a one-off credit card number for each transaction.
The user accesses the Web via a Lumeria proxy server, which shields their identity from merchants and marketing companies whilst enabling marketing material that matches their profile to be sent to them.
2.3 Inadequacy of Specifying Privacy Policies
Many data controllers specify privacy policies that can be accessed from the interface where personal information is being collected or where consent to do so is given.
Users are normally required to accept the policies by ticking a box, which all but a very few do in a semi-automatic fashion.
This will be explained in further detail below.
A privacy policy may fulfill several different functions [4] (p.239).
Nevertheless, for most people it is challenging to assess whether they should consent to the processing of their personal data under a given privacy policy. This is particularly true if the policy is ambiguous and permits a wide range of forms of processing personal data, possibly exceeding what would be permitted under the applicable data protection law.
3 An Infrastructure for Privacy Policy Referencing
The fundamental principle of Privacy Policy Referencing is that all personal information must be tagged or associated with metadata that relates it to the applicable privacy policy, and possibly to the point and time of collection.
This would enable users or authorities to audit systems and applications where personal information is being processed, and to determine whether they adhere to applicable privacy policies.
By making it mandatory to always have policy metadata associated with personal information, it becomes a universal principle for referencing privacy policies.
Their approach, however, assumes that the underlying hardware platform, and the software running on it, are so-called trustworthy systems based on the Trusted Computing specification.
3.1 The Technical Framework
Privacy policy metadata will require the definition of a common metadata language in XML style.
This means that each privacy policy must be uniquely identifiable, and organizations must keep records of the identifiable privacy policies they have used.
The metadata need not contain any additional personal information; such information would be irrelevant for potential audits of policy adherence.
The organizations must then find a solution for associating the personal information with metadata stored elsewhere.
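As an illustration of what such policy metadata could look like, the following Python sketch builds a minimal XML policy-reference element for a personal-data record. The element names, the URN scheme for the policy identifier, and the controller name are all hypothetical assumptions, not part of any existing standard:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def make_policy_metadata(policy_id: str, controller: str,
                         collected_at: datetime) -> str:
    """Build an XML policy-reference element for a personal-data record.

    The metadata links the record to a uniquely identifiable privacy
    policy and to the point and time of collection; it contains no
    additional personal information itself.
    """
    meta = ET.Element("PolicyReference")
    ET.SubElement(meta, "PolicyId").text = policy_id          # globally unique policy id
    ET.SubElement(meta, "Controller").text = controller       # collecting organization
    ET.SubElement(meta, "CollectedAt").text = collected_at.isoformat()
    return ET.tostring(meta, encoding="unicode")

xml_fragment = make_policy_metadata(
    "urn:example:prp:retail-basic:v2",   # hypothetical identifier scheme
    "Example Shop Ltd",
    datetime(2024, 1, 15, tzinfo=timezone.utc),
)
```

The metadata could either be embedded directly with the stored personal information or, as noted above, kept elsewhere and associated with the data through a record key.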
It can be noted that their scheme has similarities with the scheme for electronic signature policies described in [25] where a specific signature policy has a globally unique reference which is bound to the signature by the signer as part of the signature calculation.
3.2 The Policy Framework
It is very difficult for users to understand privacy policies when each organization specifies a different policy and when typical policies are 10 pages or more.
The combination of rules into specific profiles can be denoted as the PRP (Privacy Rules Profile) framework.
In international trade law, the Incoterms [17] offer a widely used catalogue of specific contract terms that can be quoted when buying or selling goods.
A number of IPR licensing issues regarding open source software can be easily regulated by referring to specific predefined licenses.
By having a limited set of standardized policies, it would be possible for users to become educated and familiar with what the respective policies actually mean, and the level of protection they provide.
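A minimal Python sketch of how such a catalogue of standardized profiles might be referenced, analogous to quoting an Incoterms code in a sales contract. The profile codes and their meanings below are entirely hypothetical placeholders:

```python
# Hypothetical catalogue of standardized Privacy Rules Profiles (PRP).
# In practice such a catalogue would be defined and maintained by a
# standards body or data protection authority, not by each controller.
PRP_CATALOGUE = {
    "PRP-MIN": "Data used only for the requested service and deleted afterwards.",
    "PRP-RET": "Data retained for the statutory period; no third-party sharing.",
    "PRP-MKT": "Data may be used for the controller's own marketing; opt-out available.",
}

def describe(profile_code: str) -> str:
    """Resolve a quoted profile code to its standardized meaning."""
    try:
        return PRP_CATALOGUE[profile_code]
    except KeyError:
        raise ValueError(f"Unknown privacy rules profile: {profile_code}")
```

Because the set of profiles is small and fixed, a user who has learned once what "PRP-MIN" means can recognize it in any policy reference, instead of reading a new ten-page document per organization.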
3.3 The Management Framework
Organizations would need to manage their privacy policies according to strict criteria, and define a way of guaranteeing their integrity and authenticity.
This can be achieved, for example, by letting independent third parties sign hashes of each particular policy or policy profile, which would allow changes in policies or profiles to be noticed, or by depositing the privacy policies with independent third parties such as national information commissioners and data protection inspectorates.
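To illustrate the hash-based integrity mechanism, here is a minimal Python sketch (the policy text is invented). An independent third party would sign the digest; since any later edit to the policy text produces a different digest, silent modification of a deposited policy becomes detectable:

```python
import hashlib

def policy_digest(policy_text: str) -> str:
    """SHA-256 digest of the policy text; a third party would sign this value."""
    return hashlib.sha256(policy_text.encode("utf-8")).hexdigest()

original = "The controller processes e-mail addresses solely for order confirmation."
digest = policy_digest(original)

# Even a one-word change to the policy yields a completely different digest,
# so a signed digest over the original text exposes the modification.
tampered = original.replace("solely", "primarily")
assert policy_digest(tampered) != digest
```

In a full deployment the digest would be bound to the policy's unique identifier, so that the metadata reference, the deposited text, and the signed hash can all be checked against each other during an audit.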
Privacy policy repositories that are suitable for long-term archival of verified policies might be necessary with respect to long-term legal validity.
Organizations will also need to define processes for creating metadata and to adapt applications where personal information is being processed so that the metadata can be appropriately handled during storage, transfer and processing.
3.4 The Legal Framework
This approach could also be complemented with corresponding changes to the legal framework, e.g. through [24], in order to provide incentives for its adoption.
This could be seen as an extension of the purpose specification principle mentioned above, according to which personal data can only be collected for specified, explicit and legitimate purposes and not further processed for other purposes.
An additional element might be that certain classes of privacy policies would have to be deposited with the respective national or regional data protection authority, which might also assess a policy's compliance with the applicable law, and that the metadata would point to the deposited copies of the privacy policies.
This might enhance the possibilities for auditors to review data controllers with regard to the personal information that they process.
In case the audited organization is different from the organization specified in the metadata, the auditor will have an indication that the privacy policy has been infringed.
4 Conclusion
The current approach to ensuring personal information privacy on the Internet is ineffective in providing privacy protection in the age of distributed, networked services.
The approach described in this paper changes the way privacy policies can be specified by service providers, and compliance be verified by auditors or users.
By providing certified template policies, users gain an overview of policies that have been verified.
At the same time, auditors can verify system states against policy claims.
Introducing this framework might also require the introduction of incentives, for example by making it mandatory to include privacy policy metadata with personal information.
TL;DR: The objectives of the European Community, as laid down in the Treaty, as amended by the Treaty on European Union, include creating an ever closer union among the peoples of Europe, fostering closer relations between the States belonging to the Community, ensuring economic and social progress by common action to eliminate the barriers which divide Europe, encouraging the constant improvement of the living conditions of its peoples, preserving and strengthening peace and liberty and promoting democracy on the basis of the fundamental rights recognized in the constitution and laws of the Member States and in the European Convention for the Protection of Human Rights and Fundamental Freedoms
Abstract: (1) Whereas the objectives of the Community, as laid down in the Treaty, as amended by the Treaty on European Union, include creating an ever closer union among the peoples of Europe, fostering closer relations between the States belonging to the Community, ensuring economic and social progress by common action to eliminate the barriers which divide Europe, encouraging the constant improvement of the living conditions of its peoples, preserving and strengthening peace and liberty and promoting democracy on the basis of the fundamental rights recognized in the constitution and laws of the Member States and in the European Convention for the Protection of Human Rights and Fundamental Freedoms;
TL;DR: The evolution of the IoT, its various definitions, and some of its key application areas are discussed, both generally and in the context of these applications.
Abstract: The internet of things (IoT) is a technology that has the capacity to revolutionise the way that we live, in sectors ranging from transport to health, from entertainment to our interactions with go...
TL;DR: In this paper, the authors present a solution to assist the user by providing a structured way to browse the policy content and by automatically assessing the completeness of a policy, i.e. the degree of coverage of privacy categories important to the user.
Abstract: A privacy policy is a legal document, used by websites to communicate how the personal data that they collect will be managed. By accepting it, the user agrees to release his data under the conditions stated by the policy. Privacy policies should provide enough information to enable users to make informed decisions. Privacy regulations support this by specifying what kind of information has to be provided. As privacy policies can be long and difficult to understand, users tend not to read them. Because of this, users generally agree with a policy without knowing what it states and whether aspects important to him are covered at all. In this paper we present a solution to assist the user by providing a structured way to browse the policy content and by automatically assessing the completeness of a policy, i.e. the degree of coverage of privacy categories important to the user. The privacy categories are extracted from privacy regulations, while text categorization and machine learning techniques are used to verify which categories are covered by a policy. The results show the feasibility of our approach; an automatic classifier, able to associate the right category to paragraphs of a policy with an accuracy approximating that obtainable by a human judge, can be effectively created.
TL;DR: The concept of partial commitment and its possible applications from both the data subject and the data controller perspective in Big Data and Machine Learning are discussed.
Abstract: The concept of partial commitment is discussed in the context of personal privacy management in data science. Uncommitted, promiscuous or partially committed user’s data may either have a negative impact on model or data quality, or it may impose higher privacy compliance cost on data service providers. Many Big Data (BD) and Machine Learning (ML) scenarios involve the collection and processing of large volumes of person-related data. Data is gathered about many individuals as well as about many parameters in individuals. ML and BD both spend considerable resources on model building, learning, and data handling. It is therefore important to any BD/ML system that the input data trained and processed is of high quality, represents the use case, and is legally processes in the system. Additional cost is imposed by data protection regulation with transparency, revocation and correction rights for data subjects. Data subjects may, for several reasons, only partially accept a privacy policy, and chose to opt out, request data deletion or revoke their consent for data processing. This article discusses the concept of partial commitment and its possible applications from both the data subject and the data controller perspective in Big Data and Machine Learning.
TL;DR: A taxonomy of privacy violations can be found in this article, where the authors provide a framework for how the legal system can come to a better understanding of privacy problems and propose a taxonomy that focuses specifically on the different kinds of activities that impinge upon privacy.
Abstract: incantations of “privacy” are not nuanced enough to capture the problems involved. The 9/11 Commission Report, for example, recommends that, as government agencies engage in greater information sharing with each other and with businesses, they should “safeguard the privacy of individuals about whom information is shared.” But what does safeguarding “privacy” mean? Without an understanding of what the privacy problems are, how can privacy be addressed in a meaningful way? Many commentators have spoken of privacy as a unitary concept with a uniform value, which is unvarying across different situations. In contrast, I have argued that privacy violations involve a variety of types of harmful or problematic activities. Consider the following examples of activities typically referred to as privacy violations: 8 Judith Jarvis Thomson, The Right to Privacy, in PHILOSOPHICAL DIMENSIONS OF PRIVACY: AN ANTHOLOGY 272, 272 (Ferdinand David Schoeman ed., 1984). 9 See James Q. Whitman, The Two Western Cultures of Privacy: Dignity Versus Liberty, 113 YALE L.J. 1151, 1154 (2004) (“[T]he typical privacy article rests its case precisely on an appeal to its reader’s intuitions and anxieties about the evils of privacy violations.”). 10 NAT’L COMM’N ON TERRORIST ATTACKS UPON THE U.S., THE 9/11 COMMISSION REPORT 394 (2004). 11 Daniel J. Solove, Conceptualizing Privacy, 90 CAL. L. REV. 1087, 1130 (2002) [hereinafter Solove, Conceptualizing Privacy]. In contrast to attempts to conceptualize privacy by isolating one or more common “essential” or “core” characteristics, I concluded that there is no singular essence found in all “privacy” violations. See id. at 1095-99 (concluding that “the quest for a common denominator or essence . . . can sometimes lead to confusion”). 2006] A TAXONOMY OF PRIVACY 481 A newspaper reports the name of a rape victim. Reporters deceitfully gain entry to a person’s home and secretly photograph and record the person. 
New X-ray devices can see through people’s clothing, amounting to what some call a “virtual strip-search.” The government uses a thermal sensor device to detect heat patterns in a person’s home. A company markets a list of five million elderly incontinent women. Despite promising not to sell its members’ personal information to others, a company does so anyway. These violations are clearly not the same. Despite the wide-ranging body of law addressing privacy issues today, commentators often lament the law’s inability to adequately protect privacy. Courts and policymakers frequently have a singular view of privacy in mind when they assess whether or not an activity violates privacy. As a result, they either conflate distinct privacy problems despite significant differences or fail to recognize a problem entirely. Privacy problems are frequently misconstrued or inconsistently recognized in the law. The concept of “privacy” is far too vague to guide adjudication and lawmaking. How can privacy be addressed in a manner that is non-reductive and contextual, yet simultaneously useful in deciding cases and making sense of the multitude of privacy problems we face? In this Article, I provide a framework for how the legal system can come to a better understanding of privacy. I aim to develop a taxonomy that focuses more specifically on the different kinds of activities that impinge upon privacy. I endeavor to shift focus away from the vague term “privacy” 12 See Florida Star v. B.J.F., 491 U.S. 524, 527 (1989). 13 See Dietemann v. Time, Inc., 449 F.2d 245, 246 (9th Cir. 1971). 14 See Beyond X-ray Vision: Can Big Brother See Right Through Your Clothes?, DISCOVER, July 2002, at 24; Guy Gugliotta, Tech Companies See Market for Detection: Security Techniques Offer New Precision, WASH. POST, Sept. 28, 2001, at A8. 15 See Kyllo v. United States, 533 U.S. 27, 29 (2001). 16 See Standards for Privacy of Individually Identifiable Health Information, 65 Fed. Reg. 82,461, 82,467 (Dec. 
28, 2000) (codified at 45 C.F.R. pts. 160 & 164). 17 See In re GeoCities, 127 F.T.C. 94, 97-98 (1999). 18 See, e.g., Joel R. Reidenberg, Privacy in the Information Economy: A Fortress or Frontier for Individual Rights?, 44 FED. COMM. L.J. 195, 208 (1992) (“The American legal system does not contain a comprehensive set of privacy rights or principles that collectively address the acquisition, storage, transmission, use and disclosure of personal information within the business community.”); Paul M. Schwartz, Privacy and Democracy in Cyberspace, 52 VAND. L. REV. 1609, 1611 (1999) (“At present, however, no successful standards, legal or otherwise, exist for limiting the collection and utilization of personal data in cyberspace.”). 482 UNIVERSITY OF PENNSYLVANIA LAW REVIEW [Vol. 154: 477 and toward the specific activities that pose privacy problems. Although various attempts at explicating the meaning of “privacy” have been made, few have attempted to identify privacy problems in a comprehensive and concrete manner. The most famous attempt was undertaken in 1960 by the legendary torts scholar William Prosser. He discerned four types of harmful activities redressed under the rubric of privacy: 1. Intrusion upon the plaintiff’s seclusion or solitude, or into his private affairs. 2. Public disclosure of embarrassing private facts about the plaintiff. 3. Publicity which places the plaintiff in a false light in the public eye. 4. Appropriation, for the defendant’s advantage, of the plaintiff’s name or likeness. Prosser’s great contribution was to synthesize the cases that emerged from Samuel Warren and Louis Brandeis’s famous law review article, The Right to Privacy. However, Prosser focused only on tort law. American privacy law is significantly more vast and complex, extending beyond torts to the constitutional “right to privacy,” Fourth Amendment law, evidentiary privileges, dozens of federal privacy statutes, and hundreds of state privacy statutes. 
19 In 1967, Alan Westin identified four “basic states of individual privacy”: (1) solitude; (2) intimacy; (3) anonymity; and (4) reserve (“the creation of a psychological barrier against unwanted intrusion”). ALAN F. WESTIN, PRIVACY AND FREEDOM 31-32 (1967). These categories focus mostly on spatial distance and separateness; they fail to capture the many different dimensions of informational privacy. In 1992, Ken Gormley surveyed the law of privacy. See generally Ken Gormley, One Hundred Years of Privacy, 1992 WIS. L. REV. 1335. His categories-–tort privacy, Fourth Amendment privacy, First Amendment privacy, fundamentaldecision privacy, and state constitutional privacy-–are based on different areas of law rather than on a more systemic conceptual account of privacy. Id. at 1340. In 1998, Jerry Kang defined privacy as a union of three overlapping clusters of ideas: (1) physical space (“the extent to which an individual’s territorial solitude is shielded from invasion by unwanted objects or signals”); (2) choice (“an individual’s ability to make certain significant decisions without interference”); and (3) flow of personal information (“an individual’s control over the processing—i.e., the acquisition, disclosure, and use—of personal information”). Jerry Kang, Information Privacy in Cyberspace Transactions, 50 STAN. L. REV. 1193, 1202-03 (1998). Kang’s understanding of privacy is quite rich, but the breadth of the categories limits their usefulness in law. The same is true of the three categories identified by philosopher Judith DeCew: (1) “informational privacy”; (2) “accessibility privacy”; and (3) “expressive privacy.” JUDITH W. DECEW, IN PURSUIT OF PRIVACY: LAW, ETHICS, AND THE RISE OF TECHNOLOGY 75-77 (1997). 20 William L. Prosser, Privacy, 48 CAL. L. REV. 383, 389 (1960). 21 Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 HARV. L. REV. 193, 195-96 (1890). 22 See Anita L. 
Allen, Privacy in American Law, in PRIVACIES: PHILOSOPHICAL EVALUATIONS 19, 26 (Beate Rossler ed., 2004) (“American privacy law is impressive in its 2006] A TAXONOMY OF PRIVACY 483 The Freedom of Information Act contains two exemptions to protect against an “unwarranted invasion of personal privacy.” Numerous state public records laws also contain privacy exemptions. Many state constitutions contain provisions explicitly providing for a right to privacy. Moreover, Prosser wrote over forty years ago, before the breathtaking rise of the Information Age. New technologies have given rise to a panoply of different privacy problems, and many of them do not readily fit into Prosser’s four categories. Therefore, a new taxonomy to address privacy violations for contemporary times is sorely needed. The taxonomy I develop is an attempt to identify and understand the different kinds of socially recognized privacy violations, one that hopefully will enable courts and policymakers to better balance privacy against countervailing interests. The purpose of this taxonomy is to aid in the development of the law that addresses privacy. Although the primary focus will be on the law, this taxonomy is not simply an attempt to catalog existing laws, as was Prosser’s purpose. Rather, it is an attempt to understand various privacy harms and problems that have achieved a significant degree of social recognition. I will frequently use the law as a source for determining what privacy violations society recognizes. However, my aim is not simply to take stock of where the law currently stands today, but to provide a useful framework for its future development.
TL;DR: It is believed users' confidence in online transactions will increase when they are presented with meaningful information and choices about Web site privacy practices, and P3P is not a silver bullet; it is complemented by other technologies as well as regulatory and self-regulatory approaches to privacy.
Abstract: nternet users are concerned about the privacy of information they supply to Web sites, not only in terms of personal data, but information that Web sites may derive by tracking their online activities [7]. Many online privacy concerns arise because it is difficult for users to obtain information about actual Web site information practices. Few Web sites post privacy policies, 1 and even when they are posted, users do not always find them trustworthy or understandable. Thus, there is often a one-way mirror effect: Web sites ask users to provide personal information, but users have little knowledge about how their information will be used. Understandably, this lack of knowledge leads to confusion and mistrust. The WorldWide Web Consortium (W3C)'s Platform for Privacy Preferences Project (P3P) provides a framework for informed online interactions. The goal of P3P is to enable users to exercise preferences over Web site privacy practices at the Web sites. P3P applications will allow users to be informed about Web site practices , delegate decisions to their computer agent when they wish, and tailor relationships with specific sites. We believe users' confidence in online transactions will increase when they are presented with meaningful information and choices about Web site privacy practices. P3P is not a silver bullet; it is complemented by other technologies as well as regulatory and self-regulatory approaches to privacy. Some technologies have the ability to technically preclude practices that may be unacceptable to a user. For example, digital cash, anonymizers, and encryp-tion limit the information the recipient or eaves-droppers can collect during an interaction. Laws and industry guidelines codify and enforce expectations regarding information practices as the default or baseline for interactions. 
A compelling feature of P3P is that localized decision making enables flexibility in a medium that encompasses diverse preferences, cultural norms, and regulatory jurisdictions. However, for P3P to be effective, users must be willing and able to make meaningful decisions when presented with disclosures. This requires the existence of easy-to-use tools that allow P3P users to delegate much of the information processing and decision making to their computer agents when they wish, as well as a framework promoting the use …
TL;DR: In this article, the authors propose a set of terminology which is both expressive and precise, and define anonymity, unlinkability, unobservability, and pseudonymity (pseudonyms and digital pseudonyms, and their attributes).
Abstract: Based on the nomenclature of the early papers in the field, we propose a set of terminology which is both expressive and precise. More particularly, we define anonymity, unlinkability, unobservability, and pseudonymity (pseudonyms and digital pseudonyms, and their attributes).
We hope that the adoption of this terminology might help to achieve better progress in the field by avoiding that each researcher invents a language of his/her own from scratch. Of course, each paper will need additional vocabulary, which might be added consistently to the terms defined here.
TL;DR: This document, along with its normative references, includes all the specification necessary for the implementation of interoperable P3P applications.
Abstract: This is the specification of the Platform for Privacy Preferences (P3P). This document, along with its normative references, includes all the specification necessary for the implementation of interoperable P3P applications.
Q1. What have the authors contributed in "Privacy policy referencing" ?
This article describes a new approach called Privacy Policy Referencing, and outlines the technical and the complementary legal framework that needs to be established to support it.
Q2. What are the future works mentioned in the paper "Privacy policy referencing" ?
Remaining challenges, such as the international synchronization of policy templates, the reliable, auditable and secure implementation of personal data handling with policies, and the creation of the default policies and their supervision and archival, need to be further researched.