Topic
Information privacy
About: Information privacy is a research topic. Over its lifetime, 25,412 publications have been published within this topic, receiving 579,611 citations. The topic is also known as: data privacy & data protection.
Papers published on a yearly basis
Papers
TL;DR: This work explores privatization algorithms that maintain class boundaries in a dataset to enable effective defect prediction from shared data while preserving privacy, and finds that CLIFF+MORPH performs significantly better than the state-of-the-art privacy algorithms.
Abstract: Background: Cross-company defect prediction (CCDP) is a field of study where an organization lacking enough local data can use data from other organizations for building defect predictors. To support CCDP, data must be shared. Such shared data must be privatized, but that privatization could severely damage the utility of the data. Aim: To enable effective defect prediction from shared data while preserving privacy. Method: We explore privatization algorithms that maintain class boundaries in a dataset. CLIFF is an instance pruner that deletes irrelevant examples. MORPH is a data mutator that moves the data a random distance, taking care not to cross class boundaries. CLIFF+MORPH is tested in a CCDP study on 10 defect datasets from the PROMISE data repository. Results: We find: 1) the CLIFFed+MORPHed algorithms provide more privacy than the state-of-the-art privacy algorithms; 2) in terms of utility measured by defect prediction, CLIFF+MORPH performs significantly better. Conclusions: For the OO defect data studied here, data can be privatized and shared without a significant degradation in utility. To the best of our knowledge, this is the first published result where privatization does not compromise defect prediction.
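The MORPH step described above is algorithmically simple: each instance is shifted a random fraction of its distance to the nearest instance of the other class, so it never crosses the class boundary. A minimal sketch of a MORPH-style mutator (the `alpha`/`beta` range and the direction of the shift are assumptions for illustration, not the paper's exact parameters; the CLIFF pruning step is omitted):

```python
import numpy as np

def morph(X, y, alpha=0.15, beta=0.35, seed=0):
    """MORPH-style mutation: move each instance a random fraction
    (between alpha and beta) of the distance to its nearest
    unlike-class neighbor, in the direction *away* from that
    neighbor, so the point stays on its own side of the boundary."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    out = X.copy()
    for i in range(len(X)):
        unlike = X[y != y[i]]                        # instances of the other class
        z = unlike[np.argmin(np.linalg.norm(unlike - X[i], axis=1))]
        r = rng.uniform(alpha, beta)                 # random mutation distance
        out[i] = X[i] + (X[i] - z) * r               # shift away from the boundary
    return out
```

Because the shift points away from the nearest unlike neighbor, every mutated instance ends up strictly farther from the class boundary than it started, which is what preserves the dataset's utility for defect prediction.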
134 citations
TL;DR: A privacy-preserving decentralized key-policy ABE scheme where each authority can issue secret keys to a user independently without knowing anything about his GID; it is the first privacy-preserving decentralized ABE scheme based on standard complexity assumptions.
Abstract: Decentralized attribute-based encryption (ABE) is a variant of a multiauthority ABE scheme where each authority can issue secret keys to the user independently, without any cooperation or a central authority. This is in contrast to previous constructions, where multiple authorities must be online and set up the system interactively, which is impractical. Hence, a decentralized ABE scheme eliminates the heavy communication cost and the need for collaborative computation in the setup stage. Furthermore, every authority can join or leave the system freely without the necessity of reinitializing the system. In contemporary multiauthority ABE schemes, a user's secret keys from different authorities must be tied to his global identifier (GID) to resist the collusion attack. However, this compromises the user's privacy: multiple authorities can collaborate to trace the user by his GID, collect his attributes, and then impersonate him. Therefore, constructing a privacy-preserving decentralized ABE scheme remains a challenging research problem. In this paper, we propose a privacy-preserving decentralized key-policy ABE scheme where each authority can issue secret keys to a user independently without knowing anything about his GID. Therefore, even if multiple authorities are corrupted, they cannot collect the user's attributes by tracing his GID. Notably, our scheme only requires standard complexity assumptions (e.g., decisional bilinear Diffie-Hellman) and does not require any cooperation between the multiple authorities, in contrast to the previous comparable scheme that requires nonstandard complexity assumptions (e.g., q-decisional Diffie-Hellman inversion) and interactions among multiple authorities. To the best of our knowledge, it is the first privacy-preserving decentralized ABE scheme based on standard complexity assumptions.
134 citations
01 Sep 2010
TL;DR: A general impossibility result is given showing that a natural formalization of Dalenius’ goal cannot be achieved if the database is useful, and a variant of the result threatens the privacy even of someone not in the database.
Abstract: In 1977 Tore Dalenius articulated a desideratum for statistical databases: nothing about an individual should be learnable from the database that cannot be learned without access to the database. We give a general impossibility result showing that a natural formalization of Dalenius’ goal cannot be achieved if the database is useful. The key obstacle is the side information that may be available to an adversary. Our results hold under very general conditions regarding the database, the notion of privacy violation, and the notion of utility. Contrary to intuition, a variant of the result threatens the privacy even of someone not in the database. This state of affairs motivated the notion of differential privacy [15, 16], a strong ad omnia privacy which, intuitively, captures the increased risk to one’s privacy incurred by participating in a database.
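The differential-privacy notion this abstract motivates is most often realized with the Laplace mechanism: a query whose answer changes by at most 1 when any one individual is added or removed (sensitivity 1) stays epsilon-differentially private if Laplace noise of scale 1/epsilon is added. A minimal sketch of that standard mechanism (not a construction from the paper itself; the function and parameter names are illustrative):

```python
import numpy as np

def laplace_count(values, predicate, epsilon, seed=None):
    """epsilon-differentially private count query via the Laplace
    mechanism. A counting query has sensitivity 1 (one person can
    change the count by at most 1), so Laplace(1/epsilon) noise
    bounds any individual's influence on the released answer."""
    rng = np.random.default_rng(seed)
    true_count = sum(1 for v in values if predicate(v))
    return true_count + rng.laplace(scale=1.0 / epsilon)
```

Smaller epsilon means stronger privacy but noisier answers; the guarantee bounds the *extra* risk of participating, which is exactly the per-person framing the abstract contrasts with Dalenius' unachievable absolute goal.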
133 citations
04 Apr 2009
TL;DR: The timing of privacy information had a significant impact on how much of a premium users were willing to pay for privacy, and users paid more attention to privacy indicators when purchasing privacy-sensitive items than when purchasing items that raised minimal privacy concerns.
Abstract: Many commerce websites post privacy policies to address Internet shoppers' privacy concerns. However, few users read or understand them. Iconic privacy indicators may make privacy policies more accessible and easier for users to understand: in this paper, we examine whether the timing and placement of online privacy indicators impact Internet users' browsing and purchasing decisions. We conducted a laboratory study where we controlled the placement of privacy information, the timing of its appearance, the privacy level of each website, and the price and items being purchased. We found that the timing of privacy information had a significant impact on how much of a premium users were willing to pay for privacy. We also found that timing had less impact when users were willing to examine multiple websites. Finally, we found that users paid more attention to privacy indicators when purchasing privacy-sensitive items than when purchasing items that raised minimal privacy concerns.
133 citations
13 Nov 2009
TL;DR: The privacy framework in this paper can help guide the researchers and developers in this community, and the privacy properties provide a concrete foundation for privacy-sensitive systems and applications for mobile healthcare and home-care systems.
Abstract: In this paper, we consider the challenge of preserving patient privacy in the context of mobile healthcare and home-care systems, that is, the use of mobile computing and communications technologies in the delivery of healthcare or the provision of at-home medical care and assisted living. This paper makes three primary contributions. First, we compare existing privacy frameworks, identifying key differences and shortcomings. Second, we identify a privacy framework for mobile healthcare and home-care systems. Third, we extract a set of privacy properties intended for use by those who design systems and applications for mobile healthcare and home-care systems, linking them back to the privacy principles. Finally, we list several important research questions that the community should address. We hope that the privacy framework in this paper can help to guide the researchers and developers in this community, and that the privacy properties provide a concrete foundation for privacy-sensitive systems and applications for mobile healthcare and home-care systems.
133 citations