Other affiliations: Ryerson University
Bio: Ann Cavoukian is an academic researcher affiliated with the Office of the Information and Privacy Commissioner. She has contributed to research in topics including Privacy by Design and information privacy. She has an h-index of 12 and has co-authored 30 publications receiving 923 citations. Previous affiliations of Ann Cavoukian include Ryerson University.
TL;DR: In this article, the authors argue that we must take great care not to sacrifice consumer privacy amidst an atmosphere of unbridled enthusiasm for electricity reform, and they advocate the adoption of Dr. Ann Cavoukian's conceptual model "SmartPrivacy" to prevent potential invasions of privacy while ensuring full functionality of the Smart Grid.
Abstract: The 2003 blackout in the northern and eastern U.S. and Canada, which caused an estimated $6 billion economic loss, is one of many indicators that the current electrical grid is outdated. Not only must the grid become more reliable, it must also become more efficient, reduce its impact on the environment, incorporate alternative energy sources, allow for more consumer choices, and ensure cyber security. In effect, it must become “smart.” Significant investments in the billions of dollars are being made to lay the infrastructure of the future Smart Grid. However, the authors argue that we must take great care not to sacrifice consumer privacy amidst an atmosphere of unbridled enthusiasm for electricity reform. Information proliferation, lax controls and insufficient oversight of this information could lead to unprecedented invasions of consumer privacy. Smart meters and smart appliances will constitute a data explosion of intimate details of daily life, and it is not yet clear who will have access to this information beyond a person’s utility provider. The authors of this paper urge the adoption of Dr. Ann Cavoukian’s conceptual model ‘SmartPrivacy’ to prevent potential invasions of privacy while ensuring full functionality of the Smart Grid. SmartPrivacy represents a broad arsenal of protections, encapsulating everything necessary to ensure that all of the personal information held by an organization is appropriately managed. These include: Privacy by Design; law, regulation and independent oversight; accountability and transparency; market forces, education and awareness; audit and control; data security; and fair information practices. Each of these elements is important, but the concept of Privacy by Design represents its sine qua non.
When applying SmartPrivacy to the Smart Grid, not only will the grid be able to, for example, become increasingly resistant to attack and natural disasters—it will be able to do so while also becoming increasingly resistant to data leakage and breaches of personal information. The authors conclude that SmartPrivacy must be built into the Smart Grid during its current nascent stage, allowing for both consumer control of electricity consumption and consumer control of personal information, which must go hand in hand. Doing so will ensure that consumer confidence and trust are gained, and that consumer participation in the Smart Grid contributes to the vision of creating a more efficient and environmentally friendly electrical grid, as well as one that is protective of privacy. This will result in a positive-sum outcome, where both environmental efficiency and privacy can coexist.
TL;DR: Four fundamental technological approaches to help assure widespread and enduring online participation, confidence and trust in the information society are outlined.
Abstract: Informational self-determination refers to the right or ability of individuals to exercise personal control over the collection, use and disclosure of their personal data by others. The basis of modern privacy laws and practices around the world, informational privacy has become a challenging concept to protect and promote in a world of ubiquitous and unlimited data sharing and storage among organizations. The paper advocates a “user-centric” approach to managing personal data online. However, user-centricity can be problematic when the user—the data subject—is not directly involved in transactions involving the disclosure, collection, processing, and storage of their personal data. Identity data is increasingly being generated, used and stored entirely in the networked “Cloud”, where it is under the control of third parties. The paper explores possible technology solutions to ensure that individuals will be able to exercise informational self-determination in an era of network grid computing, exponential data creation, ubiquitous surveillance and rampant online fraud. The paper describes typical “Web 2.0” use scenarios, suggests some technology building blocks to protect and promote informational privacy online, and concludes with a call to develop a privacy-respecting information technology ecosystem for identity management. Specifically, the paper outlines four fundamental technological approaches to help assure widespread and enduring online participation, confidence and trust in the information society.
TL;DR: In November, 2009, a prominent group of privacy professionals, business leaders, information technology specialists, and academics gathered in Madrid to discuss how the next set of threats to privacy could best be addressed.
Abstract: In November 2009, a prominent group of privacy professionals, business leaders, information technology specialists, and academics gathered in Madrid to discuss how the next set of threats to privacy could best be addressed. The event, Privacy by Design: The Definitive Workshop, was co-hosted by my office and that of the Israeli Law, Information and Technology Authority. It marked the latest step in a journey that I began in the 1990s, when I first focused on enlisting the support of technologies that could enhance privacy. Back then, privacy protection relied primarily upon legislation and regulatory frameworks—in an effort to offer remedies for data breaches, after they had occurred. As information technology became increasingly interconnected and the volume of personal information collected began to explode, it became clear that a new way of thinking about privacy was needed. Privacy-Enhancing Technologies (PETs) paved the way for that new direction, highlighting how the universal principles of fair information practices could be reflected in information and communication technologies to achieve strong privacy protection. While the idea seemed radical at the time, it has been very gratifying over the past 15 years to see it come into widespread usage as part of the vocabulary of both privacy and information technology professionals. But the privacy landscape continues to evolve. So, like the technologies that shape and reshape the world in which we live, the privacy conversation must continue to evolve.
TL;DR: In this paper, the essential elements of accountability identified by the Galway Accountability Project, with scholarship from the Centre for Information Policy Leadership at Hunton & Williams LLP, are discussed, as well as an example of an organizational control process that uses the principles to implement them.
Abstract: An accountability-based privacy governance model is one where organizations are charged with societal objectives, such as using personal information in a manner that maintains individual autonomy and which protects individuals from social, financial and physical harms, while leaving the actual mechanisms for achieving those objectives to the organization. This paper discusses the essential elements of accountability identified by the Galway Accountability Project, with scholarship from the Centre for Information Policy Leadership at Hunton & Williams LLP. Conceptual Privacy by Design principles are offered as criteria for building privacy and accountability into organizational information management practices. The authors then provide an example of an organizational control process that uses the principles to implement the essential elements. Initially developed in the ‘90s to advance privacy-enhancing information and communication technologies, Dr. Ann Cavoukian has since expanded the application of Privacy by Design principles to include business processes.
TL;DR: An interdisciplinary review of privacy-related research is provided in order to enable a more cohesive treatment and recommends that researchers be alert to an overarching macro model that is referred to as APCO (Antecedents → Privacy Concerns → Outcomes).
Abstract: To date, many important threads of information privacy research have developed, but these threads have not been woven together into a cohesive fabric. This paper provides an interdisciplinary review of privacy-related research in order to enable a more cohesive treatment. With a sample of 320 privacy articles and 128 books and book sections, we classify previous literature in two ways: (1) using an ethics-based nomenclature of normative, purely descriptive, and empirically descriptive, and (2) based on their level of analysis: individual, group, organizational, and societal. Based upon our analyses via these two classification approaches, we identify three major areas in which previous research contributions reside: the conceptualization of information privacy, the relationship between information privacy and other constructs, and the contextual nature of these relationships. As we consider these major areas, we draw three overarching conclusions. First, there are many theoretical developments in the body of normative and purely descriptive studies that have not been addressed in empirical research on privacy. Rigorous studies that either trace processes associated with, or test implied assertions from, these value-laden arguments could add great value. Second, some of the levels of analysis have received less attention in certain contexts than have others in the research to date. Future empirical studies, both positivist and interpretive, could profitably be targeted to these under-researched levels of analysis. Third, positivist empirical studies will add the greatest value if they focus on antecedents to privacy concerns and on actual outcomes. In that light, we recommend that researchers be alert to an overarching macro model that we term APCO (Antecedents → Privacy Concerns → Outcomes).
30 Nov 2010
TL;DR: This paper assesses how security, trust and privacy issues occur in the context of cloud computing and discusses ways in which they may be addressed.
Abstract: Cloud computing is an emerging paradigm for large scale infrastructures. It has the advantage of reducing cost by sharing computing and storage resources, combined with an on-demand provisioning mechanism relying on a pay-per-use business model. These new features have a direct impact on IT budgeting, but also affect traditional security, trust and privacy mechanisms. Many of these mechanisms are no longer adequate, but need to be rethought to fit this new paradigm. In this paper we assess how security, trust and privacy issues occur in the context of cloud computing and discuss ways in which they may be addressed.
TL;DR: In order to build a reliable smart grid, an overview of relevant cyber security and privacy issues is presented and several potential research fields are discussed at the end of this paper.
Abstract: Smart grid is a promising power delivery infrastructure integrated with communication and information technologies. Its bi-directional communication and electricity flow enable both utilities and customers to monitor, predict, and manage energy usage. It also advances energy and environmental sustainability through the integration of vast distributed energy resources. Deploying such a green electric system has enormous and far-reaching economic and social benefits. Nevertheless, increased interconnection and integration also introduce cyber-vulnerabilities into the grid. Failure to address these problems will hinder the modernization of the existing power system. In order to build a reliable smart grid, an overview of relevant cyber security and privacy issues is presented. Based on the current literature, several potential research fields are discussed at the end of this paper.
02 Feb 2012
TL;DR: In this paper, the authors propose a private stream aggregation (PSA) system that contributes a user's data to a data aggregator without compromising the user's privacy: the aggregator can decrypt an aggregate value without decrypting individual data values associated with the set of users, and without interacting with the users while decrypting the aggregate value.
Abstract: A private stream aggregation (PSA) system contributes a user's data to a data aggregator without compromising the user's privacy. The system can begin by determining a private key for a local user in a set of users, wherein the sum of the private keys associated with the set of users and the data aggregator is equal to zero. The system also selects a set of data values associated with the local user. Then, the system encrypts individual data values in the set based in part on the private key to produce a set of encrypted data values, thereby allowing the data aggregator to decrypt an aggregate value across the set of users without decrypting individual data values associated with the set of users, and without interacting with the set of users while decrypting the aggregate value. The system also sends the set of encrypted data values to the data aggregator.
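The key-cancellation idea behind this abstract can be illustrated with a toy additive-masking sketch. This is a simplification, not the patent's actual scheme (which uses cryptographic encryption rather than plain modular masking), and all names below are illustrative:

```python
import random

M = 2**32  # modulus; assumed large enough that the true sum never wraps

def keygen(n_users):
    """Generate n user keys plus one aggregator key that sum to 0 mod M."""
    user_keys = [random.randrange(M) for _ in range(n_users)]
    agg_key = (-sum(user_keys)) % M
    return user_keys, agg_key

def encrypt(value, key):
    """A user masks its data value with its private key."""
    return (value + key) % M

def aggregate(ciphertexts, agg_key):
    """The aggregator recovers only the sum: all keys cancel to zero."""
    return (sum(ciphertexts) + agg_key) % M

user_keys, agg_key = keygen(3)
values = [5, 7, 11]
cts = [encrypt(v, k) for v, k in zip(values, user_keys)]
assert aggregate(cts, agg_key) == sum(values)  # 23, without seeing any value
```

Because each individual ciphertext is masked by a uniformly random key, the aggregator learns nothing about any single user's value, yet the zero-sum key constraint lets it recover the exact total.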
TL;DR: An overview of current and next-generation methods for federated, secure and privacy-preserving artificial intelligence with a focus on medical imaging applications, alongside potential attack vectors and future prospects in medical imaging and beyond are presented.
Abstract: The broad application of artificial intelligence techniques in medicine is currently hindered by limited dataset availability for algorithm training and validation, due to the absence of standardized electronic medical records, and strict legal and ethical requirements to protect patient privacy. In medical imaging, harmonized data exchange formats such as Digital Imaging and Communication in Medicine and electronic data storage are the standard, partially addressing the first issue, but the requirements for privacy preservation are equally strict. To prevent patient privacy compromise while promoting scientific research on large datasets that aims to improve patient care, the implementation of technical solutions to simultaneously address the demands for data protection and utilization is mandatory. Here we present an overview of current and next-generation methods for federated, secure and privacy-preserving artificial intelligence with a focus on medical imaging applications, alongside potential attack vectors and future prospects in medical imaging and beyond. Medical imaging data is often subject to privacy and intellectual property restrictions. AI techniques such as federated learning can help bridge the gap between personal data protection and data utilization for research and clinical routine, but these tools must themselves be secure.
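The core federated-learning pattern the abstract refers to, where model updates rather than raw patient data leave each site, can be sketched in a few lines. This is a minimal FedAvg-style illustration on a toy linear model, not the paper's method; the data and function names are invented for the example:

```python
def local_step(w, data, lr=0.1):
    """One least-squares gradient step for y = w*x on a site's private data."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_round(global_w, site_datasets):
    """Each site trains locally; only the updated weights are averaged
    centrally, so raw records never leave the site."""
    local_ws = [local_step(global_w, d) for d in site_datasets]
    return sum(local_ws) / len(local_ws)

# Two hospitals' private datasets, both consistent with y = 2x.
sites = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(200):
    w = federated_round(w, sites)
print(round(w, 3))  # converges to 2.0
```

Real deployments add the security layers the paper surveys (secure aggregation, differential privacy) on top of this basic exchange, since even the shared weight updates can leak information about the training data.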