Institution

Future of Privacy Forum

NonprofitWashington D.C., District of Columbia, United States
About: Future of Privacy Forum is a nonprofit organization based in Washington D.C., District of Columbia, United States. It is known for research contributions in the topics: Data Protection Act 1998 & Information privacy. The organization has 11 authors who have published 37 publications receiving 1003 citations.

Papers
Journal ArticleDOI
TL;DR: In this paper, the authors point out the usual confusions that Romanian courts make with regard to the use of the preliminary reference procedure and the relation between national and EU law, reflected in the sections dedicated to the three main legal issues that were the catalyst for the judicial dialogue between the national courts and the CJEU: the European arrest warrant, the pollution tax for second-hand vehicles, and the consumer protection provisions relating to unfair terms in contracts.
Abstract: One of the greatest challenges of Romania’s accession to the European Union was faced by the judiciary. The courts – of first instance and last instance, along with the Supreme Court and the Constitutional Court – had just started to act as European courts for the purpose of applying the European Convention on Human Rights (ECHR) and, beginning on 1 January 2007, they had to face a completely new and different supranational legal system, which came with a bigger and more complex set of rules than the ECHR system. Some courts realized very soon that the key to making this complex system work is judicial dialogue, and the Jipa judgment of the Court of Justice of the European Union (CJEU) is proof of that. Their example was followed by other courts with hesitant steps, while, at the same time, the judiciary struggled to become familiar with the principle of primacy of EU law. The study will point out the usual confusions that Romanian courts make with regard to the use of the preliminary reference procedure and the relation between national and EU law, reflected in the sections dedicated to the three main legal issues that were the catalyst for the judicial dialogue between the national courts and the CJEU: the European arrest warrant, the pollution tax for second-hand vehicles, and the consumer protection provisions relating to unfair terms in contracts. The study leads to the conclusion that Romanian courts are still confused about their status in the EU law system. This paper will also show that the Romanian Constitutional Court has contributed to this confusion, firstly by taking insufficient steps to guarantee that EU law, lato sensu, is properly observed in the national legal system, even challenging, in certain cases, the principle of primacy of EU law, and secondly by completely refusing to address preliminary references to the CJEU.

2 citations

Posted Content
TL;DR: In this paper, the authors discuss the multiple purposes of privacy statements, including the legal obligations they are designed to fulfil, and argue that efforts to make privacy statements significantly shorter and simpler are optimizing for the one audience least likely to read them, rather than the audiences in the best position to police privacy statements and the practices they describe.
Abstract: Size matters. In fact, when it comes to privacy statements, there is an obsession with size. Much scholarship and commentary on privacy statements bemoans the fact that consumers rarely read them and places the blame on the length of those statements. The solution? Shorten and simplify! Proposals for standardized short-form notices, “nutrition label” notices, icons, and other attempts to replace long privacy statements abound. But none have proven to be a satisfactory substitute for a full, detailed description of what data an organization collects and how it is used, shared, retained, and protected. These short-form approaches inevitably leave out important details, gloss over critical nuances, and simplify technical information in a way that dramatically reduces transparency and accountability. This article discusses the multiple purposes of privacy statements, including the legal obligations they are designed to fulfil. It recognizes that there are many audiences for privacy statements, including consumers, regulators, policy makers, academics, researchers, investors, advocates, and journalists. And it argues that efforts to make privacy statements significantly shorter and simpler are optimizing for the one audience least likely to read them – consumers – rather than the audiences in the best position to police privacy statements and the practices they describe. Whatever the audience, having a detailed (long) privacy statement provides a single place where an interested reader can find the “full story” of the organization’s privacy practices. Unlike many alternate methods of providing notice, the detailed privacy statement makes the full range of privacy information available at any time, and to any person before, during or after the time an individual may be using the organization’s products or services. Long privacy statements also create organizational accountability. 
The exercise of drafting them requires organizations to do the detailed investigation to understand and document what data is being collected and how it is processed. And although few consumers beyond a small number of highly motivated individuals will read the statements, those who act on behalf of consumers do – including advocates, regulators, and journalists. It is mainly those individuals who ask the hard questions and are in a position to raise public awareness and create consequences for inadequate or problematic practices. And it is that kind of accountability that leads to positive change. To be clear, this article is not defending poorly drafted privacy statements. Writing that is unclear, poorly organized, or needlessly complex or legalistic has no place in a privacy statement. Nor is this article suggesting that a privacy statement should be long simply for the sake of being long. A statement for a simple app that collects one type of information and uses it for one purpose can be quite short. But a privacy statement for an organization that offers a range of more complex, interrelated, and data-intensive services often must be quite long in order to provide all the relevant details. How long should a privacy statement be? A privacy statement should be as long as it needs to be in order to meet legal requirements and provide full descriptions of the pertinent data practices. Long privacy statements are often essential to achieving true transparency. But given that most consumers will not read them (regardless of the length), if we want to achieve transparency for all audiences, long privacy statements alone are not sufficient. This article should not be taken to suggest detailed privacy statements are the only way of creating transparency. And we should not write off consumers because they rarely read these privacy statements.
Efforts should still be made to help consumers understand what is being done with their data and to give them meaningful control. Doing that well often involves measures in addition to a privacy statement, such as contextual privacy disclosures. But those measures almost always will be inadequate and incomplete unless provided in conjunction with a full, detailed privacy statement.

2 citations

Posted Content
TL;DR: In this article, the authors reveal the characteristics of data portability as a legal concept in the modern world of privacy and data protection; they focus on a few aspects of the reform proposed by the European Commission, highlighting the detailed provisions of the right to Data portability in the larger context of the data protection reform.
Abstract: The proposal for reform of the data protection legal framework, recently made public by the European Commission (EC), specifically enshrines a ‘right to data portability’, which is designed to reduce the difficulties individuals face in staying in control of their personal data, along with the provision of a ‘right to be forgotten’ and a ‘right to rectification’. The debates surrounding these privacy concerns, especially in relation to cloud computing (another ubiquitous concept), have not enjoyed a clear, unitary discourse, as IT developments have almost outrun the rhythm at which legal scholars were analysing recent threats to privacy, and especially internet privacy. This paper aims to reveal the characteristics of data portability as a legal concept in the modern world of privacy and data protection; it focuses on a few aspects of the reform proposed by the Commission, highlighting the detailed provisions of the right to data portability in the larger context of the reform. It also discusses data portability’s impact on competition and its links to international data transfers, as they will be regulated in the new EU data protection law. It concludes that data portability is a right of the data subject strongly connected with a fundamental right to the free development of human personality, and is also a function that cloud computing services worldwide will have to provide in order to increase users’, customers’ or consumers’ trust.

2 citations

Proceedings ArticleDOI
21 Jun 2021
Abstract: The ‘AI in My Life’ project will engage 500 Dublin teenagers from disadvantaged backgrounds in a 15-week (20-hour) co-created, interactive workshop series encouraging them to reflect on their experiences in a world shaped by Artificial Intelligence (AI), personal data processing and digital transformation. Students will be empowered to evaluate the ethical and privacy implications of AI in their lives, to protect their digital privacy and to activate STEM careers and university awareness. It extends the ‘DCU TY’ programme for innovative educational opportunities for Transition Year students from underrepresented communities in higher education. Privacy and cybersecurity researchers and public engagement professionals from the SFI Centres ADAPT and Lero will join experts from the Future of Privacy Forum and the INTEGRITY H2020 project to deliver the programme to the DCU Access 22-school network. DCU Access has a mission of creating equality of access to third-level education for students from groups currently underrepresented in higher education. Each partner brings proven training activities in AI, ethics and privacy. A novel blending of material into a youth-driven narrative will be the subject of initial co-creation workshops and supported by pilot material delivery by undergraduate DCU Student Ambassadors. Train-the-trainer workshops and a toolkit for teachers will enable delivery. The material will use a blended approach (in person and online) for delivery during COVID-19. It will also enable wider use of the material developed.
An external study of programme effectiveness will report on participants’ enhanced understanding of AI and its impact, improved data literacy skills in terms of their understanding of data privacy and security, empowerment to protect privacy, growth in confidence in participating in public discourse about STEM, increased propensity to consider STEM subjects at all levels, and the greater capacity of teachers to facilitate STEM interventions. This paper introduces the project, presents more details about the co-creation workshops, a particular step in the proposed methodology, and reports some preliminary results.

2 citations

Book ChapterDOI
TL;DR: The Cambridge Handbook of Consumer Privacy critically explores core issues that will determine how the future is shaped, and asks contributors to address as many parts and perspectives of the consumer privacy debate as possible.
Abstract: In the course of a single day, hundreds of companies collect massive amounts of information from individuals. Sometimes they obtain meaningful consent. Often, they use less than transparent means. By surfing the web, using a cell phone and apps, entering a store that provides Wi-Fi, driving a car, passing cameras on public streets, wearing a fitness device, watching a show on a smart TV or ordering a product from a connected home device, people share a steady stream of information with layers upon layers of hardware devices, software applications, and service providers. Almost every human activity, whether it is attending school or a workplace, seeking healthcare or shopping in a mall, driving on a highway or watching TV in the living room, leaves behind data trails that build up incrementally to create a virtual record of our daily lives. How companies, governments, and experts should use this data is among the most pressing global public policy concerns. Privacy issues, which are at the heart of many of the debates over data collection, analysis, and distribution, range extensively in both theory and practice. In some cases, conversations about privacy policy focus on marketing issues and the minutiae of a website’s privacy notices or an app’s settings. 
In other cases, the battle cry for privacy extends to diverse endeavors, such as the following: calls to impose accountability on the NSA’s counterterrorism mission; proposals for designing safe smart toys; plans for enabling individuals to scrub or modify digital records of their pasts; pleas to require database holders to inject noise into researchers’ queries to protect against leaks that disclose an individual’s identity; plans to use cryptocurrencies or to prevent criminals and terrorists from abusing encryption tools; proposals for advancing medical research and improving public health without sacrificing patients’ control over their data; and ideas for how scientists can make their data more publicly available to facilitate replication of studies without, at the same time, inadvertently subjecting entire populations to prejudicial treatment, including discrimination. At a time when fake news influences political elections, new and contentious forms of machine-to-machine communications are emerging, algorithmic decision-making is calling more of the shots in civic, corporate, and private affairs, and ruinous data breaches and ransomware attacks endanger everything from financial stability to patient care in hospitals, “privacy” has become a potent shorthand. Privacy is a boundary, a limiting principle, and a litmus test for identifying and adjudicating the delicate balance between the tremendous benefits and dizzying assortment of risks that insight-filled data offers. Consequently, far from what a first glance at the title of this volume might lead readers to expect, the Cambridge Handbook of Consumer Privacy critically explores core issues that will determine how the future is shaped. To do justice to the magnitude and complexity of these topics, we have asked contributors to address as many parts and perspectives of the consumer privacy debate as possible.
How we, all of us, collectively grapple with these issues will determine the fate of technology and the course of humanity.

2 citations


Authors
Network Information
Related Institutions (5)
Fortify Software
11 papers, 1.1K citations

84% related

Azul Systems
96 papers, 3.7K citations

83% related

Zero Knowledge Systems
11 papers, 2.4K citations

82% related

MCI Inc.
12 papers, 1.7K citations

81% related

Annenberg Center for Communication
11 papers, 1K citations

81% related

Performance
Metrics
No. of papers from the Institution in previous years
Year	Papers
2022	1
2021	2
2020	2
2019	3
2018	5
2017	4