Nonprofit • Washington, D.C., United States
About: Future of Privacy Forum is a nonprofit organization based in Washington, D.C., United States. It is known for its research contributions on topics including the Data Protection Act 1998 and information privacy. The organization has 11 authors who have published 37 publications receiving 1003 citations.
29 Oct 2019
TL;DR: The General Data Protection Regulation (GDPR) is grounded in a rich policy foundation, and many of the law's provisions are a direct continuation of the European data protection regime set forth in the 1995 Data Protection Directive.
Abstract: The General Data Protection Regulation (GDPR) is grounded in a rich policy foundation. For the most part, its principles are not new. Many of the law's provisions are a direct continuation of the European data protection regime set forth in the 1995 Data Protection Directive (DPD). The DPD, in turn, came to harmonize then-existing European Member State data protection legislation, some of which dated to the 1970s and 1980s. France, for instance, passed its data protection law in 1978; Sweden legislated its Data Act in 1973.
TL;DR: In this article, the authors present a survey of the potential for both major innovations in the games industry and major risks to player privacy and trust, highlighting the need for developers to better understand players' privacy expectations.
Abstract: Advances in technology – particularly in the field of online communications – have revolutionized the way modern videogames are made and experienced. The evolution of many games from standalone products to constantly updating online services has all but upended the industry, creating new game features, new types of interactivity, and new monetization strategies. Mining player data has incredible potential to benefit developers and players alike. Nevertheless, the shift to games as a service also means that players must put their faith in developers to consistently respect their personal privacy. Today, videogames collect and generate enormous amounts of information about their players, much of which may be considered highly sensitive. This data includes information relating to the real world, ranging from a player's voice or physical appearance to their location or social network. It also includes detailed information from the player's actions within the game world, which may be analyzed to create in-depth profiles of a player's cognitive abilities and personality. Information collected within a game has many uses both within and outside the gaming ecosystem. Among other things, a player's psychographic information can be used to create personalized gaming experiences, drive educational games, and dynamically adjust a game's difficulty or mechanics to keep players engaged (and spending money). This paper surveys some of these applications, revealing the potential for both major innovations in the games industry and major risks to player privacy and trust. The game industry must confront and address the privacy issues raised by player data collection, lest it become the latest scandal to draw the ire of policymakers, parents, and players.
This paper briefly surveys the many laws, agreements, and regulations that affect data collection and use by games, such as the Children's Online Privacy Protection Act (COPPA), the Fair Credit Reporting Act (FCRA), intellectual property laws, international privacy law, the Federal Trade Commission's Section 5 authority, and other relevant frameworks. Privacy guidelines for developers remain underdeveloped when it comes to fully capturing players' privacy expectations. Rather than proposing strict rules or attempting to balance benefits to players against harms, this paper simply aims to show where users are most likely to be unpleasantly surprised by data use. By better understanding players' privacy expectations, developers will be better able to reduce surprise and foster player trust.
13 Jun 2018
TL;DR: A workshop on the deployment, content and design of the GDPR that brought together academics, practitioners, civil-society actors, and regulators from the EU and the US is reported on.
Abstract: The EU’s General Data Protection Regulation is poised to present major challenges in bridging the gap between law and technology. This paper reports on a workshop on the deployment, content, and design of the GDPR that brought together academics, practitioners, civil-society actors, and regulators from the EU and the US. Discussions aimed to advance current knowledge on applying the regulation’s abstract legal terms to deployed technologies, together with best practices reflecting the state of the art. Five themes were discussed: state of the art, consent, de-identification, transparency, and development and deployment practices. Four cross-cutting conflicts were identified, and research recommendations were outlined to reconcile them.
TL;DR: Global firms that gather, use or store GDPR personal data should consider the possibility that Controlled Linkable Data as described in this White Paper enables secondary uses of data while ensuring compliance with GDPR requirements.
Abstract: The new obligations imposed by the General Data Protection Regulation (GDPR) do not prohibit the use of personal data for analytics or other beneficial secondary uses. But they do require the adoption of new technical and organizational measures to protect that data. The GDPR explicitly points to pseudonymization as one such measure that can help meet the requirements of several of its provisions. The GDPR further recognizes differing levels of de-identification in a way that provides incentives for organizations to adopt the optimal type and level of de-identification that can help them use personal data for beneficial purposes while meeting their compliance obligations and protecting the privacy of individuals. By enabling the use of “Controlled Linkable Data” (as described in this White Paper) that retains the utility of personal data while helping to meet organizations’ compliance obligations and to significantly reduce their risk of liability, Anonos® BigPrivacy® technology can help organizations navigate and meet these new GDPR requirements. Thus, Anonos BigPrivacy technology can ease regulatory burdens and be a key component of an overall GDPR compliance program. The body of this paper describes in detail the regulatory background, technological innovations, and practical applications of Controlled Linkable Data, leading to the maximization of data value and individual privacy in a GDPR-compliant manner. First, in Section III, we introduce the concept of Controlled Linkable Data in the context of the GDPR. Next, in Section IV, we describe the GDPR’s new requirements, focusing on the distinction between privacy by design and data protection by default, and noting that the former is merely a subset of the latter, making it insufficient to satisfy the GDPR’s stringency. We also introduce the essential concept of Controlled Linkable Data.
In Section V, we explain how Controlled Linkable Data enables a more powerful form of de-identification, one encouraged by the GDPR, but which has previously not been achievable by technical methods. This leads to the conclusion that “data protection over the full lifecycle of data by leveraging technical and organizational measures, including pseudonymisation, [ensures] that, by default, personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.” Next, Section VI analyzes numerous relevant sections of the GDPR (specifically, Articles 5, 6, 11(2), 12(2), 15-22, 32-36, 40, 42, 82 and 88), showing how Controlled Linkable Data helps satisfy the specific GDPR requirements. Last, in light of this understanding of the requirements, limitations, exclusions and overall principles of the GDPR, Section VII explains the technical basis of Anonos BigPrivacy technology, how it implements Controlled Linkable Data, and how this solution addresses GDPR compliance concerns for all parties: data controllers, regulators and data subjects. Global firms that gather, use or store GDPR personal data should consider the possibility that Controlled Linkable Data as described in this White Paper enables secondary uses of data while ensuring compliance with GDPR requirements.
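The core idea behind pseudonymization as the GDPR describes it – data that can no longer be attributed to a specific individual without additional information kept separately – can be illustrated with a minimal sketch. This is not Anonos's actual BigPrivacy implementation, only an assumed illustration using keyed hashing: identifiers are replaced with tokens that remain linkable across records, while re-identification requires a secret key held separately under controls.

```python
import hmac
import hashlib
import secrets


class Pseudonymizer:
    """Illustrative keyed pseudonymization (not a real product's design).

    Direct identifiers are replaced with HMAC-SHA256 tokens. The same
    identifier always maps to the same token, so records about one person
    stay linkable for analytics; reversing or recreating tokens requires
    the secret key, which would be stored separately from the data.
    """

    def __init__(self, key: bytes = None):
        # In practice the key would live in a separate, access-controlled store.
        self.key = key if key is not None else secrets.token_bytes(32)

    def pseudonym(self, identifier: str) -> str:
        digest = hmac.new(self.key, identifier.encode("utf-8"), hashlib.sha256)
        return digest.hexdigest()[:16]


p = Pseudonymizer()
record = {"user_id": "alice@example.com", "playtime_hours": 412}
shared = {**record, "user_id": p.pseudonym(record["user_id"])}
# The shared record carries a stable token instead of the raw identifier,
# so datasets pseudonymized under the same key remain joinable...
assert shared["user_id"] == p.pseudonym("alice@example.com")
# ...but the token alone, without the separately held key, does not
# reveal the underlying identity.
```

A different key yields entirely different tokens, which is one way the "controlled" part of controlled linkability can be enforced: linkage works only for parties granted the key.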
TL;DR: The data protection obligations on organizations that purchase and deploy products and services that collect and transmit data to a third-party provider are described, along with the similarities and differences (if any) in those obligations between cases where the data is collected by a data processor and cases where it is collected by a data controller, focusing on the General Data Protection Regulation (GDPR).
Abstract: Modern enterprises increasingly purchase and deploy products and services from third parties that collect data as part of providing the services. In this context, there is a common belief that the enterprise must be the “data controller” (in the terminology used in European data protection law), and the third-party provider must be a “data processor” acting on behalf of the enterprise. However, such a blanket rule is neither required by the law nor reflective of reality. There are many instances in which a third-party provider acts in whole, or in part, as a data controller. While the characterization of the third-party provider as a controller or a processor has certain legal ramifications, the difference may be less significant under the General Data Protection Regulation (GDPR) than under prior European data protection law. Legal compliance, risk mitigation, and appropriate protection of personal data can be achieved whether using products and services provided by data controllers or data processors, and there are pros and cons to each approach. This paper describes the data protection obligations on organizations that purchase and deploy products and services that collect and transmit data to a third-party provider. For each obligation, it discusses the similarities and differences (if any) between cases where the data is collected by a data processor and cases where it is collected by a data controller. This paper focuses on the obligations imposed by the European GDPR, but because many of the same principles and obligations occur in other privacy laws around the world, many of the conclusions can be generalized for global approaches to compliance.