
Information privacy

About: Information privacy is a research topic. Over its lifetime, 25,412 publications have been published within this topic, receiving 579,611 citations. The topic is also known as: data privacy & data protection.


Papers
Journal ArticleDOI
TL;DR: Professor Solove illustrates that conceptualizing the problem with the Kafka metaphor has profound implications for the law of information privacy as well as which legal approaches are taken to solve the problem.
Abstract: Journalists, politicians, jurists, and legal academics often describe the privacy problem created by the collection and use of personal information through computer databases and the Internet with the metaphor of Big Brother - the totalitarian government portrayed in George Orwell's Nineteen Eighty-Four. Professor Solove argues that this is the wrong metaphor. The Big Brother metaphor as well as much of the law that protects privacy emerges from a longstanding paradigm for conceptualizing privacy problems. Under this paradigm, privacy is invaded by uncovering one's hidden world, by surveillance, and by the disclosure of concealed information. The harm caused by such invasions consists of inhibition, self-censorship, embarrassment, and damage to one's reputation. Privacy law has developed with this paradigm in mind, and consequently, it has failed to adapt to grapple effectively with the database problem. Professor Solove argues that the Big Brother metaphor merely reinforces this paradigm and that the problem is better captured by Franz Kafka's The Trial. Understood with the Kafka metaphor, the problem is the powerlessness, vulnerability, and dehumanization created by the assembly of dossiers of personal information where individuals lack any meaningful form of participation in the collection and use of their information. Professor Solove illustrates that conceptualizing the problem with the Kafka metaphor has profound implications for the law of information privacy as well as which legal approaches are taken to solve the problem.

197 citations

Journal ArticleDOI
01 Apr 2011
TL;DR: It is shown that good private social recommendations are feasible only for a small subset of the users in the social network or for a lenient setting of privacy parameters, and lower bounds are proved on the minimum loss in utility for any recommendation algorithm that is differentially private.
Abstract: With the recent surge of social networks such as Facebook, new forms of recommendations have become possible -- recommendations that rely on one's social connections in order to make personalized recommendations of ads, content, products, and people. Since recommendations may use sensitive information, it is speculated that these recommendations are associated with privacy risks. The main contribution of this work is in formalizing trade-offs between accuracy and privacy of personalized social recommendations.We study whether "social recommendations", or recommendations that are solely based on a user's social network, can be made without disclosing sensitive links in the social graph. More precisely, we quantify the loss in utility when existing recommendation algorithms are modified to satisfy a strong notion of privacy, called differential privacy. We prove lower bounds on the minimum loss in utility for any recommendation algorithm that is differentially private. We then adapt two privacy preserving algorithms from the differential privacy literature to the problem of social recommendations, and analyze their performance in comparison to our lower bounds, both analytically and experimentally. We show that good private social recommendations are feasible only for a small subset of the users in the social network or for a lenient setting of privacy parameters.

197 citations
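The differential privacy guarantee studied in the paper above is most often achieved with the Laplace mechanism: add noise scaled to a query's sensitivity divided by the privacy parameter ε. The sketch below is a generic illustration of that mechanism for a sensitivity-1 counting query, not the paper's recommendation algorithm; the function names are hypothetical.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one record
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller ε means stronger privacy but noisier answers, which is exactly the utility/privacy trade-off the paper quantifies for social recommendations.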

Book
07 Jun 2004
TL;DR: In this book, the authors discuss privacy and safety in electronic communications, DNA testing and individual rights, and cyberspace and democracy, beginning by asking whether particularistic obligations are justified.
Abstract: Introduction. 1. Are Particularistic Obligations Justified? 2. Privacy as an Obligation. 3. Children and Free Speech. 4. Privacy and Safety in Electronic Communications. 5. DNA Testing and Individual Rights. 6. What is Political? 7. On Ending Nationalism. 8. Cyberspace and Democracy.

196 citations

Proceedings Article
01 Jan 2017
TL;DR: Gaps in threat models are identified, arising from limited technical understanding of smart homes; awareness of some security issues but limited concern; ad hoc mitigation strategies; and a mismatch between the concerns and power of the smart home administrator and other people in the home.
Abstract: The Internet of Things is becoming increasingly widespread in home environments. Consumers are transforming their homes into smart homes, with internet-connected sensors, lights, appliances, and locks, controlled by voice or other user-defined automations. Security experts have identified concerns with IoT and smart homes, including privacy risks as well as vulnerable and unreliable devices. These concerns are supported by recent high profile attacks, such as the Mirai DDoS attacks. However, little work has studied the security and privacy concerns of end users who actually set up and interact with today’s smart homes. To bridge this gap, we conduct semi-structured interviews with fifteen people living in smart homes (twelve smart home administrators and three other residents) to learn about how they use their smart homes, and to understand their security and privacy related attitudes, expectations, and actions. Among other findings, we identify gaps in threat models arising from limited technical understanding of smart homes, awareness of some security issues but limited concern, ad hoc mitigation strategies, and a mismatch between the concerns and power of the smart home administrator and other people in the home. From these and other findings, we distill recommendations for smart home technology designers and future research.

196 citations

Journal ArticleDOI
TL;DR: Progress on differentially private machine learning and signal processing algorithms for privacy-preserving data analysis is described.
Abstract: Private companies, government entities, and institutions such as hospitals routinely gather vast amounts of digitized personal information about the individuals who are their customers, clients, or patients. Much of this information is private or sensitive, and a key technological challenge for the future is how to design systems and processing techniques for drawing inferences from this large-scale data while maintaining the privacy and security of the data and individual identities. Individuals are often willing to share data, especially for purposes such as public health, but they expect that their identity or the fact of their participation will not be disclosed. In recent years, there have been a number of privacy models and privacy-preserving data analysis algorithms to answer these challenges. In this article, we will describe the progress made on differentially private machine learning and signal processing.

196 citations
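A common building block in the differentially private machine learning the article above surveys is gradient perturbation (as in DP-SGD): clip each per-example gradient to bound any individual's influence, then add noise calibrated to that clipping bound. The sketch below is a minimal, generic illustration of one such aggregation step, not the article's specific method; the function names are hypothetical.

```python
import math
import random

def clip(vec, bound: float):
    """Scale vec down so its L2 norm is at most bound."""
    norm = math.sqrt(sum(x * x for x in vec))
    if norm > bound:
        return [x * bound / norm for x in vec]
    return list(vec)

def noisy_gradient(per_example_grads, clip_bound: float, noise_scale: float):
    """One gradient-perturbation step: clip each per-example gradient,
    sum, add Gaussian noise scaled to the clipping bound, and average."""
    n = len(per_example_grads)
    dim = len(per_example_grads[0])
    clipped = [clip(g, clip_bound) for g in per_example_grads]
    total = [sum(g[i] for g in clipped) for i in range(dim)]
    noisy = [t + random.gauss(0.0, noise_scale * clip_bound) for t in total]
    return [x / n for x in noisy]
```

Because clipping bounds each example's contribution, the added noise masks the presence or absence of any single individual, which is the core idea behind the privacy models the article reviews.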


Network Information
Related Topics (5)
- The Internet: 213.2K papers, 3.8M citations (88% related)
- Server: 79.5K papers, 1.4M citations (85% related)
- Encryption: 98.3K papers, 1.4M citations (84% related)
- Social network: 42.9K papers, 1.5M citations (83% related)
- Wireless network: 122.5K papers, 2.1M citations (82% related)
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023       562
2022     1,226
2021     1,535
2020     1,634
2019     1,255
2018     1,277