
Showing papers on "Privacy software published in 2018"


Journal ArticleDOI
TL;DR: A location privacy protection method that satisfies the differential privacy constraint is proposed to protect location data privacy while maximizing the utility of data and algorithms in the Industrial IoT.
Abstract: In research on location privacy protection, existing methods are mostly based on traditional anonymization, fuzzing, and cryptography techniques, and have had little success in the big data environment; for example, sensor networks contain sensitive information that must be appropriately protected. Current trends, such as “Industrie 4.0” and the Internet of Things (IoT), generate, process, and exchange vast amounts of security-critical and privacy-sensitive data, which makes them attractive targets of attacks. However, previous methods overlooked the privacy protection issue, leading to privacy violations. In this paper, we propose a location privacy protection method that satisfies the differential privacy constraint to protect location data privacy and maximizes the utility of data and algorithms in the Industrial IoT. In view of the high value and low density of location data, we combine utility with privacy and build a multilevel location information tree model. Furthermore, the index mechanism of differential privacy is used to select data according to tree-node access frequency. Finally, the Laplace scheme is used to add noise to the access frequency of the selected data. As shown by theoretical analysis and experimental results, the proposed strategy achieves significant improvements in security, privacy, and applicability.
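The Laplace step described above can be sketched in a few lines (a minimal illustration of the standard Laplace mechanism, not the paper's implementation; the function names and the unit-sensitivity assumption for counting queries are ours):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-transform sample from the Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def perturb_frequencies(freqs, epsilon, sensitivity=1.0):
    # Standard Laplace mechanism: noise with scale sensitivity/epsilon
    # added to each access count gives epsilon-differential privacy for
    # counting queries (one user changes any count by at most 1).
    scale = sensitivity / epsilon
    return [f + laplace_noise(scale) for f in freqs]
```

With sensitivity fixed at 1, a smaller epsilon enlarges the noise scale, trading utility for stronger privacy.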

234 citations


Journal ArticleDOI
TL;DR: This work systematizes the application areas, enabling technologies, privacy types, attackers, and data sources for the attacks, giving structure to the fuzzy term “smart city.”
Abstract: Many modern cities strive to integrate information technology into every aspect of city life to create so-called smart cities. Smart cities rely on a large number of application areas and technologies to realize complex interactions between citizens, third parties, and city departments. This overwhelming complexity is one reason why holistic privacy protection only rarely enters the picture. A lack of privacy can result in discrimination and social sorting, creating a fundamentally unequal society. To prevent this, we believe that a better understanding of smart cities and their privacy implications is needed. We therefore systematize the application areas, enabling technologies, privacy types, attackers, and data sources for the attacks, giving structure to the fuzzy term “smart city.” Based on our taxonomies, we describe existing privacy-enhancing technologies, review the state of the art in real cities around the world, and discuss promising future research directions. Our survey can serve as a reference guide, contributing to the development of privacy-friendly smart cities.

189 citations


Journal ArticleDOI
Qian Wang, Yan Zhang, Xiao Lu, Zhibo Wang, Zhan Qin, Kui Ren
TL;DR: Experimental results show that the proposed schemes outperform the existing methods and improve the utility of real-time data sharing with strong privacy guarantee.
Abstract: Nowadays, gigantic amounts of crowd-sourced data from mobile devices have become widely available in social networks, enabling many important data mining applications that improve the quality of our daily lives. While providing tremendous benefits, the release of crowd-sourced social network data to the public poses considerable threats to mobile users’ privacy. In this paper, we investigate the problem of real-time spatio-temporal data publishing in social networks with privacy preservation. Specifically, we consider continuous publication of population statistics and design RescueDP, an online aggregate monitoring framework over infinite streams with a $w$-event privacy guarantee. Its key components, including adaptive sampling, adaptive budget allocation, dynamic grouping, perturbation, and filtering, are seamlessly integrated as a whole to provide privacy-preserving statistics publishing over infinite timestamps. Moreover, we propose an enhanced RescueDP with neural networks to accurately predict the values of statistics and improve the utility of released data. Both RescueDP and the enhanced RescueDP are proved to satisfy $w$-event privacy. We evaluate the proposed schemes on real-world as well as synthetic datasets and compare them with two representative $w$-event privacy-assured methods. Experimental results show that the proposed schemes outperform the existing methods and improve the utility of real-time data sharing with a strong privacy guarantee.
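The sliding-window budget-allocation idea behind $w$-event privacy can be illustrated with a toy allocator (our own simplification; RescueDP's actual adaptive sampling, grouping, and filtering are considerably more sophisticated):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-transform sample from the Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

class WEventPublisher:
    # Toy sliding-window allocator: each timestamp spends half of the
    # epsilon still unspent within the previous w-1 releases, so the
    # budget used inside any window of w timestamps never exceeds epsilon.
    def __init__(self, epsilon: float, w: int):
        self.epsilon, self.w = epsilon, w
        self.spent = []  # per-timestamp privacy budget consumed

    def publish(self, true_count: float) -> float:
        recent = self.spent[-(self.w - 1):] if self.w > 1 else []
        eps_t = (self.epsilon - sum(recent)) / 2.0
        self.spent.append(eps_t)
        # Sensitivity 1 for a population count, so noise scale is 1/eps_t.
        return true_count + laplace_noise(1.0 / eps_t)
```

Because each spend is half of what remains in the window, the sum over any w consecutive timestamps stays below epsilon by induction.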

148 citations


Journal ArticleDOI
TL;DR: It is shown how a theoretical model of the factors that influence developers’ privacy practices can be conceptualized and used as a guide for future research toward effective implementation of PbD.
Abstract: Privacy by design (PbD) is a policy measure that guides software developers to apply inherent solutions to achieve better privacy protection. For PbD to be a viable option, it is important to understand developers' perceptions, interpretations, and practices regarding informational privacy (or data protection). To this end, we conducted in-depth interviews with 27 developers from different domains who practice software design. Grounded analysis of the data revealed an interplay between several different forces affecting the way in which developers handle privacy concerns. Borrowing the schema of Social Cognitive Theory (SCT), we classified and analyzed the cognitive, organizational, and behavioral factors that play a role in developers' privacy decision making. Our findings indicate that developers use the vocabulary of data security to approach privacy challenges, and that this vocabulary limits their perceptions of privacy mainly to third-party threats coming from outside the organization; that organizational privacy climate is a powerful means for organizations to guide developers toward particular privacy practices; and that software architectural patterns frame the privacy solutions used throughout the development process, possibly explaining developers' preference for policy-based solutions over architectural ones. Further, we show, through the use of the SCT schema for framing this study's findings, how a theoretical model of the factors that influence developers' privacy practices can be conceptualized and used as a guide for future research toward effective implementation of PbD.

139 citations


Journal ArticleDOI
TL;DR: Simulation demonstrates the superiority of the proposed P2P approach over earlier ones in terms of both privacy and performance.
Abstract: Location Based Services (LBS) expose user data to malicious attacks. The approaches evolved so far for preserving privacy and security suffer from one or more anomalies, so the problem of securing LBS data is far from resolved. In particular, accuracy of results vs. degree of privacy, privacy vs. performance, and trust between users remain open problems. In this article, we present a novel approach that integrates peer-to-peer (P2P) communication with caching and dummies drawn from real queries. Our approach increases efficiency, improves performance, and solves many problems that earlier methods left open. In addition, we offer an improved way of managing the cache. Simulation demonstrates the superiority of our approach over earlier ones in both privacy and performance.
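The combination of caching and dummy queries can be sketched as follows (a toy illustration under our own naming; the paper's peer-selection and cache-management logic is richer):

```python
import random

def build_query(real_loc, dummy_pool, k=4, cache=frozenset(), seed=None):
    # Cache hit: the answer is served locally or by peers, and no query
    # reaches the LBS at all.
    if real_loc in cache:
        return None
    # Otherwise hide the real location among k-1 dummies drawn from
    # previously observed real queries, so the LBS sees k candidates.
    rng = random.Random(seed)
    dummies = rng.sample([d for d in dummy_pool if d != real_loc], k - 1)
    query = dummies + [real_loc]
    rng.shuffle(query)
    return query
```

Drawing dummies from real past queries, rather than random coordinates, makes them harder for the LBS to filter out.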

84 citations


Journal ArticleDOI
TL;DR: This paper proposes a masking approach for spatio-temporal aggregation of time series for protecting individual privacy while still providing sufficient error-resilience and reliability.
Abstract: The deployment of future energy systems promises a number of advantages for a more stable and reliable grid as well as for sustainable usage of energy resources. The efficiency and effectiveness of such smart grids rely on customer consumption data that is collected, processed, and analyzed. This data is used for billing, monitoring, and prediction. However, this implies privacy threats. Approaches exist that aim to encrypt data in certain ways, to reduce the resolution of the data, or to mask data so that an individual's contribution is untraceable. While the latter is an effective way of protecting customer privacy when aggregating over space or time, one drawback of these approaches is their limited handling, or complete neglect, of device failures. In this paper, we therefore propose a masking approach for spatio-temporal aggregation of time series that protects individual privacy while still providing sufficient error-resilience and reliability.
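The core masking idea, masks that cancel on aggregation, can be sketched as follows (our own toy illustration; the paper's contribution is precisely the error-resilience that this naive scheme lacks when a device fails to report):

```python
import random

MOD = 2 ** 32  # readings and masks live in Z_MOD

def make_masks(n: int, seed=None):
    # n random masks that sum to zero mod MOD, one per meter.
    rng = random.Random(seed)
    masks = [rng.randrange(MOD) for _ in range(n - 1)]
    masks.append((-sum(masks)) % MOD)
    return masks

def masked_readings(readings, masks):
    # Each meter publishes reading + mask; a single masked value reveals
    # nothing, but the masks cancel when the aggregator sums everything.
    return [(r + m) % MOD for r, m in zip(readings, masks)]
```

A single non-reporting meter breaks reconstruction here, which is exactly the failure mode the paper's error-resilient masking addresses.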

61 citations


Patent
29 Jan 2018
TL;DR: The authors present a system that obtains a copy of computer code which collects and/or uses personal data and then automatically analyzes that code to identify one or more privacy-related attributes that may impact privacy assessment standards.
Abstract: Data processing systems and methods, according to various embodiments, perform privacy assessments and monitor new versions of computer code for updated features and conditions that relate to compliance with privacy standards. The systems and methods may obtain a copy of computer code (e.g., a software application or a website) that collects and/or uses personal data, and then automatically analyzes the computer code to identify one or more privacy-related attributes that may impact privacy assessment standards. In various embodiments, the system is adapted to monitor one or more locations (e.g., an online software application marketplace, and/or a specified website) to determine whether the application or website has changed. The system may, after analyzing the computer code, display the privacy-related attributes, collect information regarding the attributes, and automatically notify one or more designated individuals (e.g., privacy office representatives) regarding the attributes and information collected.

58 citations


Journal ArticleDOI
TL;DR: Analysis of discussions about privacy on two major developer forums for iOS and Android finds that the different platforms produce markedly different definitions of privacy, illustrating the role of platforms not only as intermediaries for privacy-sensitive content but also as regulators who help define what privacy is and how it works.
Abstract: Mobile application design can have a tremendous impact on consumer privacy. But how do mobile developers learn what constitutes privacy? We analyze discussions about privacy on two major developer forums: one for iOS and one for Android. We find that the different platforms produce markedly different definitions of privacy. For iOS developers, Apple is a gatekeeper, controlling market access. The meaning of “privacy” shifts as developers try to interpret Apple’s policy guidance. For Android developers, Google is one data-collecting adversary among many. Privacy becomes a set of defensive features through which developers respond to a data-driven economy’s unequal distribution of power. By focusing on the development cultures arising from each platform, we highlight the power differentials inherent in “privacy by design” approaches, illustrating the role of platforms not only as intermediaries for privacy-sensitive content but also as regulators who help define what privacy is and how it works.

53 citations


Book ChapterDOI
TL;DR: This book chapter thoroughly investigates current security and privacy preservation solutions in this area, with an eye on the Industrial Internet of Things, and proposes future directions.
Abstract: The synergy between Cloud and IoT has emerged largely due to the Cloud having attributes which directly benefit IoT and enable its continued growth. IoT adopting Cloud services has brought new security challenges. In this book chapter, we pursue two main goals: (1) to analyse the different components of Cloud computing and IoT and (2) to present security and privacy problems that these systems face. We thoroughly investigate current security and privacy preservation solutions that exist in this area, with an eye on the Industrial Internet of Things, discuss open issues and propose future directions.

52 citations



Journal ArticleDOI
TL;DR: Study findings suggest that attitudinal measures were stronger predictors of privacy behaviors than were social locators and support was found for a model positing that if an individual placed a higher premium on their personal, private information they would be less inclined to disclose such information while visiting online social networking sites.
Abstract: Online social networks are designed to encourage disclosure while also having the ability to disrupt existing privacy boundaries. This study assesses those individuals who are the most active online: “Digital Natives.” The specific focus includes participants’ privacy beliefs; how valuable they believe their personal, private information to be; and what risks they perceive in terms of disclosing this information in a fairly anonymous online setting. A model incorporating these concepts was tested in the context of communication privacy management theory. Study findings suggest that attitudinal measures were stronger predictors of privacy behaviors than were social locators. In particular, support was found for a model positing that if an individual placed a higher premium on their personal, private information, they would then be less inclined to disclose such information while visiting online social networking sites.

Journal ArticleDOI
TL;DR: This work proposes a data partition technique, named extended quasi-identifier partitioning (EQI-partitioning), which disassociates record terms that participate in identifying combinations, and proves the privacy guarantee of this mechanism.
Abstract: Storage as a service has become an important paradigm in cloud computing for its great flexibility and economic savings. However, its development is hampered by data privacy concerns: data owners no longer physically possess the storage of their data. In this work, we study the issue of privacy-preserving set-valued data publishing. Existing privacy-preserving techniques (such as encryption, suppression, and generalization) are not applicable in many real-world scenarios, since they incur large overhead for data queries or high information loss. Motivated by this observation, we present a suite of new techniques that make privacy-aware set-valued data publishing feasible on a hybrid cloud. In the data publishing phase, we propose a data partition technique, named extended quasi-identifier partitioning (EQI-partitioning), which disassociates record terms that participate in identifying combinations. This way, the cloud server cannot associate a record with rare term combinations with high probability. We prove the privacy guarantee of our mechanism. In the data querying phase, we adopt an interactive differential privacy strategy to resist privacy breaches from statistical queries. We finally evaluate its performance using real-life data sets on our cloud test-bed. Our extensive experiments demonstrate the validity and practicality of the proposed scheme.
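The disassociation idea can be illustrated with a toy partitioner (our own sketch; the actual EQI-partitioning algorithm and its privacy proof are in the paper):

```python
def disassociate(record, identifying_combos, max_terms=2):
    # Toy disassociation in the spirit of EQI-partitioning: greedily pack
    # a record's terms into chunks of at most max_terms, refusing to let
    # any known identifying combination end up inside a single chunk.
    chunks = []
    for term in record:
        for chunk in chunks:
            if len(chunk) < max_terms and not any(
                    combo <= set(chunk) | {term} for combo in identifying_combos):
                chunk.append(term)
                break
        else:
            chunks.append([term])
    return chunks
```

The stored chunks jointly preserve every term, yet no single chunk holds a full identifying combination.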

Journal ArticleDOI
TL;DR: A new location privacy technique called the enhanced semantic obfuscation technique (ESOT) is proposed to preserve the location information of a user, and experimental results show that ESOT achieves improved location privacy and service utility compared with a well-known existing approach, the semantic obfuscation technique.
Abstract: The Internet of Things (IoT) means connecting everything with every other thing through the Internet. In IoT, millions of devices communicate to exchange data and information with each other. During communication, security and privacy issues arise which need to be addressed. To protect information about users’ location, an efficient technique should be devised. Several techniques have already been proposed for preserving location privacy in IoT. However, the existing research lags in preserving location privacy in IoT and has highlighted several issues such as being specific or being restricted to a certain location. In this paper, we propose a new location privacy technique called the enhanced semantic obfuscation technique (ESOT) to preserve the location information of a user. Experimental results show that ESOT achieves improved location privacy and service utility when compared with a well-known existing approach, the semantic obfuscation technique.
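A generic flavour of location obfuscation can be sketched by spatial generalization (our own illustration; ESOT itself operates at the semantic level, e.g. on place types, and is specified in the paper):

```python
def obfuscate(lat: float, lon: float, cell_deg: float = 0.01):
    # Snap a GPS fix to the centre of a coarse grid cell (about 1.1 km
    # north-south for 0.01 degrees), so the service sees a region rather
    # than an exact point.
    snap = lambda v: round((int(v // cell_deg) + 0.5) * cell_deg, 6)
    return snap(lat), snap(lon)
```

Any two fixes inside the same cell become indistinguishable to the service, at the cost of coarser, region-level answers.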

Journal ArticleDOI
TL;DR: The work introduces assurance as evidence for satisfying security and privacy requirements, in terms of completeness and the reportability of security incidents through audit, and allows prospective cloud users to define their assurance requirements so that appropriate cloud models can be selected for a given context.
Abstract: Despite the several benefits of migrating enterprise-critical assets to the cloud, there are challenges specifically related to security and privacy. It is important that cloud users understand their security and privacy needs based on their specific context and select the cloud model best fitted to support these needs. The literature provides works that discuss security and privacy issues for cloud systems, but such works provide neither a detailed methodological approach to elicit security and privacy requirements nor methods to select cloud deployment models based on the satisfaction of these requirements by cloud service providers. This work advances the current state of the art in this direction. In particular, we use requirements engineering concepts to elicit and analyze security and privacy requirements and their associated mechanisms using a conceptual framework and a systematic process. The work introduces assurance as evidence for satisfying the security and privacy requirements, in terms of completeness and the reportability of security incidents through audit. This allows prospective cloud users to define their assurance requirements so that appropriate cloud models can be selected for a given context. To demonstrate our work, we present results from a real case study based on the Greek National Gazette.

Journal ArticleDOI
TL;DR: This paper presents a client-based framework for user privacy protection that requires no change to existing recommendation algorithms and no compromise in recommendation accuracy, and introduces a privacy protection model.
Abstract: Personalized recommendation has demonstrated its effectiveness in mitigating the problem of information overload on the Internet. However, evidence shows that, due to concerns about personal privacy, users’ reluctance to disclose their personal information has become a major barrier to the development of personalized recommendation. In this paper, we propose to generate a group of fake preference profiles to cover up a user's sensitive subjects and thus protect personal privacy in personalized recommendation. First, we present a client-based framework for user privacy protection that requires no change to existing recommendation algorithms and no compromise in recommendation accuracy. Second, based on this framework, we introduce a privacy protection model that formulates the two requirements ideal fake preference profiles should satisfy: (1) similarity of feature distribution, which measures the effectiveness of fake preference profiles in hiding a genuine user preference profile; and (2) exposure degree of sensitive subjects, which measures the effectiveness of fake preference profiles in covering up the sensitive subjects. Finally, based on a subject repository of product classification, we present an implementation algorithm that satisfies the privacy protection model. Both theoretical analysis and experimental evaluation demonstrate the effectiveness of our proposed approach.
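The fake-profile idea can be sketched as follows (our own toy; the paper's model additionally enforces similarity of feature distribution and bounds the exposure degree of sensitive subjects):

```python
import random

def fake_profiles(genuine, sensitive, catalogue, k=3, seed=None):
    # Build k fake profiles of the same length as the genuine one, drawn
    # only from non-sensitive catalogue subjects, so recommendation
    # queries still work while the sensitive subjects stay covered up.
    rng = random.Random(seed)
    pool = [s for s in catalogue if s not in sensitive]
    return [rng.sample(pool, len(genuine)) for _ in range(k)]
```

Because the fakes are submitted alongside the genuine profile, the server cannot tell which profile, and hence which subject set, is real.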

Journal ArticleDOI
TL;DR: The concept of a trusted smart meter is proposed in this paper in order to protect consumers’ private information from the terminal end and to ensure trusted detection when upgrading the software of smart meters.
Abstract: The emergence of smart grids has considerably improved the quality of life. However, the real-time electric usage information transmitted by smart meters reveals consumers’ private information. Based on the remote anonymous attestation technology in trusted computing, the concept of a trusted smart meter is proposed in this paper in order to protect consumers’ private information from the terminal end. Attribute certificates are used to hide the platform configuration of smart meters and the ring signature technology is adopted to hide users’ personal information. The proposed protocol not only prevents the leakage of private information with competitive efficiency but also ensures trusted detection for upgrading the software of smart meters.

Journal ArticleDOI
TL;DR: An unobtrusive recommendation system that crowdsources users’ privacy permission settings and generates recommendations accordingly, allowing users to provide feedback to revise the recommendations, improve performance, and adapt to different scenarios.
Abstract: People nowadays want almost everything at their fingertips, from business to entertainment, while not wanting to leak their sensitive data. Strong information protection can be a competitive advantage, but preserving privacy is a real challenge when people use mobile apps on a smartphone. If they are too lax about preserving privacy, important or sensitive information can be lost. If they are too strict, making users jump through endless hoops to access the data they need to get their work done, productivity can nosedive. Thus, striking a balance between privacy and usability in mobile applications can be difficult. Leveraging the privacy permission settings in mobile operating systems, our basic idea for addressing this issue is to provide proper recommendations about these settings so that users can protect their sensitive information while maintaining the usability of apps. In this paper, we propose an unobtrusive recommendation system that implements this idea by crowdsourcing users’ privacy permission settings and generating recommendations accordingly. Our system also allows users to provide feedback to revise the recommendations, improving performance and adapting to different scenarios. For the evaluation, we collected preferences from 382 participants on Amazon Mechanical Turk and released our system to real-world users for 10 days. According to the study, our system makes appropriate recommendations that meet participants’ privacy expectations and preserve mobile apps’ usability.
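The crowdsourcing step can be illustrated with a majority-vote sketch (our own simplification; the deployed system also incorporates per-user feedback and context):

```python
from collections import Counter

def recommend_settings(crowd_settings):
    # crowd_settings: {permission: [choice made by each contributing user]}.
    # Recommend the majority choice per permission; a real system would
    # also weight contributors by similarity to the target user.
    return {perm: Counter(votes).most_common(1)[0][0]
            for perm, votes in crowd_settings.items()}
```

The recommendation then becomes a starting point that the user can accept or override, with overrides fed back into the vote pool.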

Journal ArticleDOI
TL;DR: An infrastructure framework for privacy protection in the community medical Internet of Things is proposed, and theoretical analysis and simulation experiments prove that community medical data can be effectively protected.
Abstract: As a medical service close to people's daily lives, community health care is subject to increasingly high requirements. In the face of growing community medical needs, the construction and development of the community medical Internet of Things is imminent. This generates massive amounts of multi-type medical data, containing all kinds of user identity data, various types of vital-signs data, and other sensitive information. Data at such a scale faces the risk of leakage during transmission, storage, and access. To effectively protect the privacy information of patients, an infrastructure framework for privacy protection in the community medical Internet of Things is proposed. It includes transmission protection based on a multi-path asymmetric-encryption fragment transmission mechanism, storage protection using a distributed symmetric-encryption cloud storage scheme, and access control with identity authentication and dynamic access authorization. Theoretical analysis and simulation experiments prove that community medical data can be effectively protected.
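The multi-path fragment idea can be illustrated with XOR secret sharing (our own sketch; the paper combines fragmentation with asymmetric encryption rather than plain XOR shares):

```python
import os
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_xor(data: bytes, n_paths: int):
    # n-1 random pads plus (data XOR all pads): any single path's
    # fragment is indistinguishable from random noise.
    pads = [os.urandom(len(data)) for _ in range(n_paths - 1)]
    pads.append(reduce(_xor, pads, data))
    return pads

def join_xor(fragments):
    # XOR-ing all fragments cancels the pads and restores the record.
    return reduce(_xor, fragments)
```

An eavesdropper on one path learns nothing; only an adversary observing every path can reconstruct the record.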

Journal ArticleDOI
TL;DR: This paper proposes a client-based approach to protect personal privacy in a CloudDB, in which privacy data is encrypted with a traditional encryption algorithm before being stored on the cloud side, ensuring the security of the privacy data.
Abstract: Due to the advantages of pay-on-demand, expand-on-demand, and high availability, cloud databases (CloudDB) have been widely used in information systems. However, since a CloudDB is distributed on an untrusted cloud side, how to effectively protect the massive amounts of private information in a CloudDB is an important problem. Although traditional security strategies (such as identity authentication and access control) can prevent illegal users from accessing unauthorized data, they cannot prevent internal users at the cloud side from accessing and exposing personal privacy information. In this paper, we propose a client-based approach to protect personal privacy in a CloudDB. In this approach, privacy data is encrypted with a traditional encryption algorithm before being stored on the cloud side, ensuring its security. To execute various kinds of query operations over the encrypted data efficiently, the encrypted data is also augmented with an additional feature index, so that as much of each query operation as possible can be processed on the cloud side without decrypting the data. To this end, we explore how the feature index of privacy data is constructed, and how a query operation over privacy data is transformed into a new query operation over the index data so that it can be executed correctly on the cloud side. The effectiveness of the approach is demonstrated by theoretical analysis and experimental evaluation. The results show that the approach performs well in terms of security, usability, and efficiency, and is thus effective for protecting personal privacy in a CloudDB.
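A common way to build such a feature index is bucketization, sketched below (our own illustration; the paper defines its own index construction and query transformation):

```python
import hashlib

def bucket_tag(value: int, bucket_size: int = 10, salt: bytes = b"demo") -> str:
    # Opaque label of the bucket containing `value`; the cloud stores
    # (ciphertext, tag) pairs and never sees plaintext values.
    bucket = value // bucket_size
    return hashlib.sha256(salt + str(bucket).encode()).hexdigest()[:8]

def rewrite_range_query(lo: int, hi: int, bucket_size: int = 10):
    # Client-side rewrite: a plaintext range predicate becomes the set of
    # bucket tags that may hold matches; the client decrypts the returned
    # superset and filters out false positives.
    start = lo - lo % bucket_size
    return {bucket_tag(v, bucket_size) for v in range(start, hi + 1, bucket_size)}
```

Wider buckets leak less about the value distribution but return more false positives for the client to filter, which is the usual security/efficiency trade-off of such indexes.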

Journal ArticleDOI
TL;DR: A parametric solution to the problem of optimal exchange of privacy for money is found, and a closed-form expression is obtained and the trade-off between profile-disclosure risk and economic reward is characterized for several interesting cases.

Journal ArticleDOI
TL;DR: This paper proposes a distributed, multi-unit, privacy-guaranteeing bidding mechanism as part of a DR program that protects the participants’ bidding information without relying on any third party, trusted or not, except obviously for the winning price and the winner exposed to the utility.
Abstract: The stringent requirement of demand-supply equilibrium for delivering electricity has traditionally been dealt with from a supply-side perspective, assuming that demand is not alterable. With the promises of the smart grid, demand-side management techniques are increasingly becoming feasible. One demand-side management technique, called demand response (DR), aims at inducing changes in electricity load in response to financial incentives, some of which involve bidding as the underlying facilitator. It is well established that the effectiveness of DR is proportional to the number of participants. Yet many DR programs, including those involving bidding, may suffer due to consumer privacy concerns. Within this context, in this paper, we propose a distributed, multi-unit, privacy-guaranteeing bidding mechanism as part of a DR program that protects the participants’ bidding information without relying on any third party, trusted or not, except obviously for the winning price and the winner exposed to the utility. To the best of our knowledge, this is the first such approach for DR bidding programs. We provide a security analysis of our approach under the honest-but-curious and active adversary assumptions and prove its privacy-assuring property.

Book
26 Jul 2018
TL;DR: This thesis proposes lightweight secure and privacy-preserving V2G connection scheme, in which the power grid assures the confidentiality and integrity of exchanged information during (dis)charging electricity sessions and overcomes EVs’ authentication problem.
Abstract: The smart grid utilizes different communication technologies to enhance the reliability and efficiency of the power grid; it allows a bi-directional flow of electricity and information about grid status and customers' requirements among different parties in the grid, i.e., it connects the generation, distribution, transmission, and consumption subsystems together. Thus, the smart grid reduces power losses and increases the efficiency of electricity generation and distribution. Although the smart grid improves the quality of the grid's services, it exposes the grid to the cyber-security threats that communication networks suffer from, in addition to other novel threats arising from the power grid's nature. For instance, the electricity consumption messages sent from consumers to the utility company via a wireless network may be captured, modified, or replayed by adversaries. As a consequence, security and privacy concerns are significant challenges in the smart grid. The smart grid upgrade creates three main communication architectures: the first is the communication between electricity customers and utility companies via various networks, i.e., home area networks (HANs), building area networks (BANs), and neighbour area networks (NANs); we refer to these as customer-side networks in this thesis. The second architecture is the communication between EVs and the grid to charge/discharge their batteries via vehicle-to-grid (V2G) connections. The last is the grid's connection with the measurement units spread all over the grid to monitor its status and send periodic reports to the main control center (CC) for state estimation and bad-data detection purposes. This thesis addresses the security concerns of the three communication architectures. For customer-side networks, the privacy of consumers is the central concern; the integrity and confidentiality of transmitted messages should also be guaranteed. 
The main security concerns for V2G networks are the privacy of vehicles' owners and the authenticity of participating parties. In the grid's connection with measurement units, integrity attacks, such as false data injection (FDI) attacks, target the measurements' integrity and consequently mislead the main CC into making wrong decisions for the grid. The thesis presents two solutions for the security problems in the first architecture, i.e., the customer-side networks. The first proposed solution is a security and privacy-preserving scheme for a BAN, which is a cluster of HANs. The proposed scheme is based on forecasting the future electricity demand of the whole BAN cluster, so the BAN connects to the electricity provider only if the total demand of the cluster changes. The scheme employs the lattice-based public-key NTRU cryptosystem to guarantee the confidentiality and authenticity of the exchanged messages and to further reduce the computation and communication load. The security analysis shows that our proposed scheme achieves the privacy and security requirements while efficiently reducing communication and computation overhead. The second solution is a lightweight privacy-preserving aggregation scheme that permits smart household appliances to aggregate their readings without involving the connected smart meter. The scheme deploys a lightweight lattice-based homomorphic cryptosystem that depends on simple addition and multiplication operations. Therefore, the proposed scheme guarantees the customers' privacy and message integrity with lightweight overhead. In addition, the thesis proposes a lightweight secure and privacy-preserving V2G connection scheme, in which the power grid assures the confidentiality and integrity of exchanged information during (dis)charging sessions and overcomes the EVs' authentication problem. 
The proposed scheme guarantees the financial profits of the grid and prevents EVs from acting maliciously. Meanwhile, EVs preserve their private information by generating their own
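The aggregation idea in the second solution can be illustrated with a short sketch. The thesis uses a lattice-based homomorphic cryptosystem; the snippet below substitutes the classic Paillier cryptosystem with toy parameters (an assumption for illustration, not the thesis's actual scheme) to show the underlying principle: multiplying ciphertexts adds plaintexts, so appliance readings can be summed under encryption without any party seeing individual values.

```python
import random
from math import gcd

# Toy additive-homomorphic aggregation. The thesis uses a lattice-based
# cryptosystem; Paillier is a stand-in that illustrates the same principle:
# the product of ciphertexts decrypts to the sum of the plaintexts.

P, Q = 293, 433                    # tiny demo primes: insecure, illustration only
N, N2 = P * Q, (P * Q) ** 2
PHI = (P - 1) * (Q - 1)

def encrypt(m):
    r = random.randrange(2, N)
    while gcd(r, N) != 1:
        r = random.randrange(2, N)
    return (pow(N + 1, m, N2) * pow(r, N, N2)) % N2

def decrypt(c):
    x = pow(c, PHI, N2)
    return ((x - 1) // N * pow(PHI, -1, N)) % N

readings = [12, 7, 30, 5]          # hypothetical appliance readings
agg = 1
for m in readings:
    agg = (agg * encrypt(m)) % N2  # ciphertext product = plaintext sum
print(decrypt(agg))                # -> 54
```

Each appliance would encrypt its own reading; only the aggregate ever needs to be decrypted, which is what preserves per-appliance privacy.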

Book ChapterDOI
01 Jan 2018
TL;DR: The guide was Cooley's classic definition of personal immunity as "a right of complete immunity: to be let alone" [3], which was soon adapted as a definition of privacy.
Abstract: We decided to use simpler definitions of security and privacy, boiling them down to their most essential characteristics. Our guide was Cooley's classic definition of personal immunity as "a right of complete immunity: to be let alone" [3]. This phrase was soon adapted as a definition of privacy. Having been provided by a lawyer, it includes physical aspects of privacy, which are critical in the real world but not essential in the virtual world; as will be clear from our definitions of security and privacy in the next paragraph, we see these aspects more as security characteristics than privacy characteristics.

Journal ArticleDOI
TL;DR: The proposed framework extends the traditional frame-by-frame evaluation approach by introducing two new approaches based on aggregated and fused frames, which assesses the achieved privacy protection and utility by comparing the performance of standard computer vision tasks on protected and unprotected visual data.
Abstract: Ubiquitous and networked sensors impose a huge challenge for privacy protection, which has become an emerging problem of modern society. Protecting the privacy of visual data is particularly important due to the omnipresence of cameras, and various protection mechanisms for captured images and videos have been proposed. This paper introduces an objective evaluation framework in order to assess such protection methods. Visual privacy protection is typically realised by obfuscating sensitive image regions, which often results in some loss of utility. Our evaluation framework assesses the achieved privacy protection and utility by comparing the performance of standard computer vision tasks, such as object recognition, detection and tracking, on protected and unprotected visual data. The proposed framework extends the traditional frame-by-frame evaluation approach by introducing two new approaches based on aggregated and fused frames. We demonstrate our framework on eight differently protected video sets and measure the trade-off between the improved privacy protection due to obfuscating captured image data and the degraded utility of the visual data. Results provided by our objective evaluation method are compared with an available state-of-the-art subjective study of these eight protection techniques.
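The frame-level privacy/utility scoring the abstract describes can be sketched in miniature. Everything below is illustrative rather than the paper's framework: a 4x4 "frame", a block-pixelation obfuscator, and a brightness-threshold "detector" standing in for a real computer vision task. Privacy is scored as distortion from the original; utility as how often the detector agrees across the protected and unprotected versions.

```python
# Toy privacy/utility trade-off evaluation: obfuscate a frame, then score
# privacy as pixel distortion and utility as detector agreement.

def pixelate(frame, block=2):
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            cells = [(y, x) for y in range(by, min(by + block, h))
                            for x in range(bx, min(bx + block, w))]
            avg = sum(frame[y][x] for y, x in cells) // len(cells)
            for y, x in cells:
                out[y][x] = avg
    return out

def detector(frame, thresh=128):      # stand-in "computer vision task"
    return [[1 if px >= thresh else 0 for px in row] for row in frame]

def privacy_score(orig, prot):        # mean absolute pixel distortion
    flat = [(a, b) for ro, rp in zip(orig, prot) for a, b in zip(ro, rp)]
    return sum(abs(a - b) for a, b in flat) / len(flat)

def utility_score(orig, prot):        # detector agreement on both versions
    do, dp = detector(orig), detector(prot)
    flat = [(a, b) for ro, rp in zip(do, dp) for a, b in zip(ro, rp)]
    return sum(a == b for a, b in flat) / len(flat)

frame = [[ 10, 200,  30, 220],
         [ 15, 210,  25, 230],
         [240,  20, 250,  10],
         [235,  25, 245,  15]]
protected = pixelate(frame, block=2)
print(privacy_score(frame, protected), utility_score(frame, protected))
# -> 105.0 0.5  (high distortion = more privacy, but the task degrades)
```

Running the same scoring per frame and then over aggregated frames is the kind of extension the framework's two new evaluation approaches introduce.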

Journal ArticleDOI
TL;DR: This work proposes to use sensitivity analysis to infer whether an app requests sensitive on-device resources/data that are not required for its expected functionality, and addresses challenges in efficiently achieving test coverage and automated privacy risk assessment.
Abstract: Given the emerging concerns over app privacy-related risks, major app distribution providers (e.g., Microsoft) have been exploring approaches to help end users make an informed decision before installation. This is different from existing approaches of simply trusting users to make the right decision. We build on the direction of risk rating as the way to communicate app-specific privacy risks to end users. To this end, we propose to use sensitivity analysis to infer whether an app requests sensitive on-device resources/data that are not required for its expected functionality. Our system, Privet, addresses challenges in efficiently achieving test coverage and automated privacy risk assessment. Finally, we evaluate Privet with 1,000 Android apps released in the wild.
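The sensitivity-analysis intuition can be sketched as follows; the toy "apps" and the helper `risky_resources` are hypothetical illustrations, not Privet's actual API. An app is run once with the real sensitive resource and once with a neutral stub; if the observable output is unchanged, the requested resource is likely unnecessary for the app's expected functionality.

```python
# Toy sensitivity analysis: perturb a sensitive input and flag resources
# whose removal does not change the app's observable output.

def flashlight_app(location):
    return "torch on"                    # output ignores the location

def weather_app(location):
    return f"forecast for {location}"    # output depends on the location

def risky_resources(app, resource_name, real_value, stub_value):
    baseline = app(real_value)
    perturbed = app(stub_value)
    # Output insensitive to the resource => likely over-privileged request.
    return [resource_name] if baseline == perturbed else []

print(risky_resources(flashlight_app, "location", "51.5,-0.1", "0,0"))
# -> ['location']  (flashlight_app is flagged; weather_app would not be)
```

A real system must also drive the app through enough execution paths for this comparison to be meaningful, which is the test-coverage challenge the abstract mentions.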

Book ChapterDOI
01 Jan 2018
TL;DR: The different types of threats and the technologies used today to break into computer systems and data collections are analyzed, and an overview is given of the security measures used for risk reduction.
Abstract: With the availability of international networks and the wide distribution of personal computers, security threats and violations became a mass phenomenon. In parallel to the development of the new economies, a shadow industry of criminal organizations appeared. We will analyze the different types of threats and the technologies used today to break into computer systems and data collections, and give an overview of the security measures that are used for risk reduction.

Journal ArticleDOI
TL;DR: A comprehensive survey is provided that investigates the various location privacy risks and threats that may arise from the different components of this CRN technology, and explores the different privacy attacks and countermeasure solutions that have been proposed in the literature to cope with this location privacy issue.
Abstract: Cognitive radio networks (CRNs) have emerged as an essential technology to enable dynamic and opportunistic spectrum access which aims to exploit underutilized licensed channels to solve the spectrum scarcity problem. Despite the great benefits that CRNs offer in terms of their ability to improve spectrum utilization efficiency, they suffer from user location privacy issues. Knowing that their whereabouts may be exposed can discourage users from joining and participating in the CRNs, thereby potentially hindering the adoption and deployment of this technology in future generation networks. The location information leakage issue in the CRN context has recently started to gain attention from the research community due to its importance, and several research efforts have been made to tackle it. However, to the best of our knowledge, none of these works have tried to identify the vulnerabilities that are behind this issue or discuss the approaches that could be deployed to prevent it. In this paper, we try to fill this gap by providing a comprehensive survey that investigates the various location privacy risks and threats that may arise from the different components of this CRN technology, and explores the different privacy attacks and countermeasure solutions that have been proposed in the literature to cope with this location privacy issue. We also discuss some open research problems, related to this issue, that need to be overcome by the research community to take advantage of the benefits of this key CRN technology without having to sacrifice the users' privacy.

Journal ArticleDOI
TL;DR: A framework, DSSE with Forward Privacy (dynamic symmetric searchable encryption with forward privacy), which consists of Internet of Things devices and cloud storage and provides both searchable encryption and privacy preservation, is proposed; it outperforms other DSSE schemes with respect to both effectiveness and efficiency.
Abstract: Dynamic searchable encryption schemes generate search tokens for the encrypted data on a cloud server periodically or on demand. With such search tokens, a user can query the encrypted data while preserving the data's privacy; i.e., the cloud server can return the query results to the user but does not learn the content of the encrypted data. We propose a framework, DSSE with Forward Privacy (dynamic symmetric searchable encryption [DSSE] with forward privacy), which consists of Internet of Things devices and cloud storage and provides both searchable encryption and privacy preservation. Compared with the known DSSE schemes, our approach supports multiuser queries. Furthermore, our approach patches most of the security flaws related to the leakage of sensitive information in DSSE schemes. Both security analysis and simulations show that our approach outperforms other DSSE schemes with respect to both effectiveness and efficiency.
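A forward-private searchable index of the kind the abstract names can be sketched with a simple textbook-style construction (an illustration, not the paper's actual protocol): each update stores its entry under a fresh random state whose ciphertext embeds a pointer to the previous state, so a search token reveals only the chain of past entries and cannot be linked to future updates.

```python
import os, hashlib

# Toy forward-private DSSE index. The server stores addr -> ciphertext;
# a search token is the keyword's latest state, which lets the server walk
# the chain backwards but tells it nothing about states chosen later.

def H(tag, st):
    return hashlib.sha256(tag + st).digest()

class Client:
    def __init__(self):
        self.state = {}                    # keyword -> latest 16-byte state

    def update_token(self, keyword, doc_id):
        prev = self.state.get(keyword, b"\x00" * 16)
        st = os.urandom(16)                # fresh state: unlinkable to old tokens
        self.state[keyword] = st
        payload = doc_id.to_bytes(4, "big") + prev
        pad = H(b"enc", st)[:len(payload)]
        return H(b"addr", st), bytes(a ^ b for a, b in zip(payload, pad))

    def search_token(self, keyword):
        return self.state.get(keyword)

class Server:
    def __init__(self):
        self.store = {}

    def update(self, addr, ct):
        self.store[addr] = ct

    def search(self, st):
        ids = []
        while st and st != b"\x00" * 16:   # walk the chain back to the start
            ct = self.store[H(b"addr", st)]
            pad = H(b"enc", st)[:len(ct)]
            payload = bytes(a ^ b for a, b in zip(ct, pad))
            ids.append(int.from_bytes(payload[:4], "big"))
            st = payload[4:]
        return ids

client, server = Client(), Server()
for doc in (1, 2, 3):
    server.update(*client.update_token("meter", doc))
print(server.search(client.search_token("meter")))   # -> [3, 2, 1]
```

The scheme in the paper additionally supports multiple users; this sketch shows only the single-client forward-privacy mechanism.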

Journal ArticleDOI
TL;DR: It is argued that in addition to focusing on content, privacy regulators and technology companies must also consider the ways that privacy policy design — the artistic and structural choices that frame and present a company’s privacy terms to the public — can manipulate users into making risky privacy choices.
Abstract: Design configures our relationship with a space, whether offline or online. In particular, the design of built online environments can constrain our ability to understand and respond to websites’ data use practices, or it can enhance agency by giving us control over information. Design, therefore, poses dangers and offers opportunity to protect privacy online. This Article is a comprehensive theoretical and empirical approach to the design of privacy policies. Privacy notices today do not convey information in a way understandable to most internet users. This is because they are designed without the needs of real people in mind. They are written by lawyers and for lawyers, and they ignore the way most of us make disclosure decisions online. They also ignore design. This Article argues that in addition to focusing on content, privacy regulators and technology companies must also consider the ways that privacy policy design — the artistic and structural choices that frame and present a company’s privacy terms to the public — can manipulate or coerce users into making risky privacy choices. I present empirical evidence of the designs currently employed by privacy policies and the effect of different designs on user choices. This research shows that supposedly “user-friendly” designs are not always boons to consumers; design strategies can manipulate users into making bad choices just as easily as they can enhance transparency. This suggests that recommending “user-friendly” design is not enough. Rather, privacy regulators, including the Federal Trade Commission, state attorneys general, and legislators, must ensure that privacy policies, and the websites that display them, are designed in ways that enhance transparency. And corporations must institutionalize the importance of notice design throughout their organizations.

Journal ArticleDOI
TL;DR: It is shown that privacy-protecting implementation, while typically impossible with normal-form mechanisms, is achievable with extensive-form mechanisms.
Abstract: In most implementation frameworks agents care only about the outcome, and not at all about the way in which it was obtained. Additionally, typical mechanisms for full implementation involve the complete revelation of all private information to the planner. In this paper I consider the problem of full implementation with agents who may prefer to protect their privacy. I analyze the extent to which privacy-protecting mechanisms can be constructed under various assumptions about agents’ predilection for privacy and the permissible game forms.