
Showing papers on "Privacy software published in 2004"


Journal ArticleDOI
TL;DR: The results of this study indicate that the second-order IUIPC factor, which consists of three first-order dimensions--namely, collection, control, and awareness--exhibited desirable psychometric properties in the context of online privacy.
Abstract: The lack of consumer confidence in information privacy has been identified as a major problem hampering the growth of e-commerce. Despite the importance of understanding the nature of online consumers' concerns for information privacy, this topic has received little attention in the information systems community. To fill the gap in the literature, this article focuses on three distinct, yet closely related, issues. First, drawing on social contract theory, we offer a theoretical framework on the dimensionality of Internet users' information privacy concerns (IUIPC). Second, we attempt to operationalize the multidimensional notion of IUIPC using a second-order construct, and we develop a scale for it. Third, we propose and test a causal model on the relationship between IUIPC and behavioral intention toward releasing personal information at the request of a marketer. We conducted two separate field surveys and collected data from 742 household respondents in one-on-one, face-to-face interviews. The results of this study indicate that the second-order IUIPC factor, which consists of three first-order dimensions--namely, collection, control, and awareness--exhibited desirable psychometric properties in the context of online privacy. In addition, we found that the causal model centering on IUIPC fits the data satisfactorily and explains a large amount of variance in behavioral intention, suggesting that the proposed model will serve as a useful tool for analyzing online consumers' reactions to various privacy threats on the Internet.

2,597 citations


Journal Article
TL;DR: In this article, the authors propose a new construct, "contextual integrity," as an alternative benchmark for privacy, and argue that public surveillance violates a right to privacy because it violates contextual integrity; as such, it constitutes injustice and even tyranny.
Abstract: The practices of public surveillance, which include the monitoring of individuals in public through a variety of media (e.g., video, data, online), are among the least understood and controversial challenges to privacy in an age of information technologies. The fragmentary nature of privacy policy in the United States reflects not only the oppositional pulls of diverse vested interests, but also the ambivalence of unsettled intuitions on mundane phenomena such as shopper cards, closed-circuit television, and biometrics. This Article, which extends earlier work on the problem of privacy in public, explains why some of the prominent theoretical approaches to privacy, which were developed over time to meet traditional privacy challenges, yield unsatisfactory conclusions in the case of public surveillance. It posits a new construct, “contextual integrity,” as an alternative benchmark for privacy, to capture the nature of challenges posed by information technologies. Contextual integrity ties adequate protection for privacy to norms of specific contexts, demanding that information gathering and dissemination be appropriate to that context and obey the governing norms of distribution within it. Building on the idea of “spheres of justice,” developed by political philosopher Michael Walzer, this Article argues that public surveillance violates a right to privacy because it violates contextual integrity; as such, it constitutes injustice and even tyranny.

1,477 citations


Proceedings ArticleDOI
06 Jun 2004
TL;DR: Confab provides basic support for building ubiquitous computing applications, providing a framework as well as several customizable privacy mechanisms that allow application developers and end-users to support a spectrum of trust levels and privacy needs.
Abstract: Privacy is the most often-cited criticism of ubiquitous computing, and may be the greatest barrier to its long-term success. However, developers currently have little support in designing software architectures and in creating interactions that are effective in helping end-users manage their privacy. To address this problem, we present Confab, a toolkit for facilitating the development of privacy-sensitive ubiquitous computing applications. The requirements for Confab were gathered through an analysis of privacy needs for both end-users and application developers. Confab provides basic support for building ubiquitous computing applications, providing a framework as well as several customizable privacy mechanisms. Confab also comes with extensions for managing location privacy. Combined, these features allow application developers and end-users to support a spectrum of trust levels and privacy needs.

663 citations


Proceedings ArticleDOI
14 Mar 2004
TL;DR: A method, called the mix zone, developed to enhance user privacy in location-based services is refined, the mathematical model is improved, and a method of providing feedback to users is developed.
Abstract: Privacy of personal location information is becoming an increasingly important issue. We refine a method, called the mix zone, developed to enhance user privacy in location-based services. We improve the mathematical model, examine and minimise computational complexity and develop a method of providing feedback to users.
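The core mix-zone idea can be illustrated with a short sketch. This is a hypothetical toy model, not the paper's refined mathematical treatment: while users are inside the zone no location updates are released, and every user leaves under a fresh pseudonym, so an observer can only attribute an exiting pseudonym to one of the zone's occupants (the anonymity set).

```python
import itertools

# Toy illustration of a mix zone (assumed model, not the paper's exact
# algorithm): updates are suppressed inside the zone, and occupants exit
# under fresh, unlinkable pseudonyms.
_pseudonym_counter = itertools.count(1)

class MixZone:
    def __init__(self):
        self.occupants = set()          # pseudonyms currently inside

    def enter(self, pseudonym):
        self.occupants.add(pseudonym)   # location updates stop here

    def exit_all(self):
        """All occupants leave; each receives a fresh pseudonym.

        Returns the old-to-new mapping (known only to the system) and the
        anonymity-set size an outside observer faces.
        """
        k = len(self.occupants)
        mapping = {old: f"p{next(_pseudonym_counter)}" for old in self.occupants}
        self.occupants.clear()
        return mapping, k

zone = MixZone()
for user in ("alice", "bob", "carol"):
    zone.enter(user)
mapping, k = zone.exit_all()
print(k)  # each exiting pseudonym has k candidate prior identities
```

The larger the co-located population in the zone, the larger `k` and the stronger the unlinkability, which is why the paper's mathematical model focuses on quantifying this anonymity.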

540 citations


Journal ArticleDOI
TL;DR: It is found that reading privacy notices is related to concern for privacy, positive perceptions about notice comprehension, and higher levels of trust in the notice, suggesting that effective privacy notices serve an important function in addressing risk issues related to e-commerce.

521 citations


Book ChapterDOI
31 Aug 2004
TL;DR: This paper analyzes the data partitioning (bucketization) technique and develops it algorithmically to build privacy-preserving indices on sensitive attributes of a relational table, and it designs a novel algorithm for achieving the desired balance between privacy and utility of the index.
Abstract: Database outsourcing is an emerging data management paradigm which has the potential to transform the IT operations of corporations. In this paper we address privacy threats in database outsourcing scenarios where trust in the service provider is limited. Specifically, we analyze the data partitioning (bucketization) technique and algorithmically develop this technique to build privacy-preserving indices on sensitive attributes of a relational table. Such indices enable an untrusted server to evaluate obfuscated range queries with minimal information leakage. We analyze the worst-case scenario of inference attacks that can potentially lead to breach of privacy (e.g., estimating the value of a data element within a small error margin) and identify statistical measures of data privacy in the context of these attacks. We also investigate precise privacy guarantees of data partitioning which form the basic building blocks of our index. We then develop a model for the fundamental privacy-utility tradeoff and design a novel algorithm for achieving the desired balance between privacy and utility (accuracy of range query evaluation) of the index.
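A minimal sketch of the bucketization idea follows. The bucket boundaries, attribute, and values are all assumed for illustration; the paper's contribution is choosing partitions that balance this privacy-utility tradeoff, which the sketch does not attempt. The server stores only coarse bucket ids, a range query is translated into a set of bucket ids, and the client filters false positives from the returned superset.

```python
import bisect

# Hypothetical bucketization sketch: the domain of a sensitive attribute
# (salary, assumed) is partitioned into buckets; the untrusted server
# indexes rows only by bucket id, never by plaintext value.
boundaries = [0, 30_000, 60_000, 90_000, 120_000]   # bucket edges (assumed)

def bucket_of(value):
    """Map a plaintext value to its coarse bucket id."""
    return bisect.bisect_right(boundaries, value) - 1

def buckets_for_range(lo, hi):
    """Translate a client range query into the bucket ids the server must
    return -- a superset of the answer; the client filters locally."""
    return set(range(bucket_of(lo), bucket_of(hi) + 1))

# Server side: an index keyed only by bucket id (payloads would be
# encrypted in a real deployment).
rows = [("alice", 25_000), ("bob", 45_000), ("carol", 95_000)]
index = {}
for name, salary in rows:
    index.setdefault(bucket_of(salary), []).append(name)

# Query: salaries in [40_000, 100_000] -> fetch matching buckets.
wanted = buckets_for_range(40_000, 100_000)
candidates = [n for b in wanted for n in index.get(b, [])]
print(sorted(candidates))
```

Coarser buckets leak less to an inferring server but return more false positives, which is precisely the privacy-utility tradeoff the paper models.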

481 citations


Proceedings ArticleDOI
25 Apr 2004
TL;DR: This paper evaluates the usability of online privacy policies, as well as the practice of posting them, and determines that significant changes need to be made to current practice to meet regulatory and usability requirements.
Abstract: Studies have repeatedly shown that users are increasingly concerned about their privacy when they go online. In response to both public interest and regulatory pressures, privacy policies have become almost ubiquitous. An estimated 77% of websites now post a privacy policy. These policies differ greatly from site to site, and often address issues that are different from those that users care about. They are in most cases the users' only source of information. This paper evaluates the usability of online privacy policies, as well as the practice of posting them. We analyze 64 current privacy policies, their accessibility, writing, content and evolution over time. We examine how well these policies meet user needs and how they can be improved. We determine that significant changes need to be made to current practice to meet regulatory and usability requirements.

403 citations



Journal ArticleDOI
01 Mar 2004
TL;DR: The authors investigate disclosure-control algorithms that hide users' positions in sensitive areas and withhold path information that indicates which areas they have visited.
Abstract: Although some users might willingly subscribe to location-tracking services, few would be comfortable having their location known in all situations. The authors investigate disclosure-control algorithms that hide users' positions in sensitive areas and withhold path information that indicates which areas they have visited.
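The simplest form of such a disclosure-control rule can be sketched as follows. The sensitive areas and coordinates are assumed for illustration, and the sketch only suppresses points inside the areas; the paper's algorithms also withhold the surrounding path information that would reveal which areas were visited.

```python
# Hypothetical disclosure-control sketch: suppress location updates
# whenever the user is inside a sensitive area.
SENSITIVE_AREAS = {                       # axis-aligned boxes (assumed)
    "clinic":  ((10, 10), (20, 20)),
    "shelter": ((40, 0),  (50, 5)),
}

def in_sensitive_area(x, y):
    return any(x0 <= x <= x1 and y0 <= y <= y1
               for (x0, y0), (x1, y1) in SENSITIVE_AREAS.values())

def release(trace):
    """Return only the points of a trace that are safe to disclose."""
    return [p for p in trace if not in_sensitive_area(*p)]

trace = [(0, 0), (12, 15), (30, 30), (45, 3)]
print(release(trace))   # the clinic and shelter visits are withheld
```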

274 citations


Proceedings ArticleDOI
01 Aug 2004
TL;DR: This paper proposes privacy risk models as a general method for refining privacy from an abstract concept into concrete issues for specific applications and prioritizing those issues.
Abstract: Privacy is a difficult design issue that is becoming increasingly important as we push into ubiquitous computing environments. While there is a fair amount of theoretical work on designing for privacy, there are few practical methods for helping designers create applications that provide end-users with a reasonable level of privacy protection that is commensurate with the domain, with the community of users, and with the risks and benefits to all stakeholders in the intended system. Towards this end, we propose privacy risk models as a general method for refining privacy from an abstract concept into concrete issues for specific applications and prioritizing those issues. In this paper, we introduce a privacy risk model we have developed specifically for ubiquitous computing, and outline two case studies describing our use of this privacy risk model in the design of two ubiquitous computing applications.

274 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examine online behaviors that increase or reduce risk of online identity theft and suggest that consumers need to be vigilant of new threats, such as the placement of cookies, hacking into hard drives, intercepting transactions, and observing online behavior via spyware.
Abstract: This article examines online behaviors that increase or reduce risk of online identity theft. The authors report results from three consumer surveys that indicate the propensity to protect oneself from online identity theft varies by population. The authors then examine attitudinal, behavioral, and demographic antecedents that predict the tendency to protect one's privacy and identity online. Implications and suggestions for managers, public policy makers, and consumers related to protecting online privacy and identity theft are provided.

Identity theft, defined as the appropriation of someone else's personal or financial identity to commit fraud or theft, is one of the fastest growing crimes in the United States (Federal Trade Commission 2001) and is increasingly affecting consumers' online transactions. In the discussion of identity theft, the Internet represents an important research context. Because of its ability to accumulate and disseminate vast amounts of information electronically, the Internet may make theft of personal or financial identity easier. Indeed, online transactions pose several new threats that consumers need to be vigilant of, such as the placement of cookies, hacking into hard drives, intercepting transactions, and observing online behavior via spyware (Cohen 2001). Online identity theft through the use of computers does not necessarily have real space analogs, as exemplified by techniques of IP spoofing and page jacking (Katyal 2001). Recent instances of online identity theft appearing in the popular press include a teenager who used e-mail and a bogus Web page to gain access to individuals' credit card data and steal thousands of dollars from consumers (New York Times 2003), and cyber-thieves who were able to access tens of thousands of personal credit reports online (Salkever 2002).
The purpose of this article, as depicted in Figure 1, is to explore the extent to which consumers are controlling their information online and whether privacy attitudes, offline data behaviors, online experience and consumer background predict the level of online protection practiced. There is an explicit link being made by privacy advocates that suggests controlling one's information is a step toward protecting oneself from identity theft (Cohen 2001; Federal Trade Commission 2001). To evaluate the level of customer protection, we analyze survey results of consumer online behaviors, many of which are depicted in Figure 1, and investigate their relationship to antecedent conditions suggested in the literature.

[FIGURE 1 OMITTED]

In particular, we address the following research questions: What is the relationship between offline data protection practices and online protection behavior? What is the relationship between online shopping behaviors and online protection behavior? What is the relationship between privacy attitudes and online protection behavior? What is the relationship between demographics and online protection behavior?

The remainder of this article is organized in four sections. We begin in the first section by reviewing the risks consumers face online and the steps they can take to minimize their risk of privacy invasion and identity theft. In the second section, we describe three surveys of consumers' online behaviors related to online privacy and identity theft. We discuss the results in the third section and implications for managers, public policy makers, and consumers in the fourth and final section.

ONLINE PRIVACY AND IDENTITY THEFT

While identity theft has caught the government's, businesses', and the public's attention (Hemphill 2001; Milne 2003), the empirical scholarly literature in this area is limited to the closely related issue of online privacy.
Research has measured consumers' concern for online privacy (Sheehan and Hoy 2000), their ability to opt out of online relationships (Milne and Rohm 2000), and the extent to which businesses have implemented fair information practices through the posting of their online privacy notices (Culnan 2000; Miyazaki and Fernandez 2001; Milne and Culnan 2002). …

Journal ArticleDOI
TL;DR: This paper studies the erosion of privacy when genomic data, either pseudonymous or data believed to be anonymous, are released into a distributed healthcare environment and develops algorithms that link genomic data to named individuals in publicly available records by leveraging unique features in patient-location visit patterns.

Book ChapterDOI
31 Aug 2004
TL;DR: Through a comprehensive set of performance experiments, it is shown that the cost of privacy enforcement is small, and scalable to large databases.
Abstract: We present a practical and efficient approach to incorporating privacy policy enforcement into an existing application and database environment, and we explore some of the semantic tradeoffs introduced by enforcing these privacy policy rules at cell-level granularity. Through a comprehensive set of performance experiments, we show that the cost of privacy enforcement is small, and scalable to large databases.

01 Jan 2004
TL;DR: This book explores the social, political, and legal implications of the collection and use of personal information in computer databases from all angles and recommends how the law can be reformed to simultaneously protect the authors' privacy and allow us to enjoy the benefits of their increasingly digital world.
Abstract: THE DIGITAL PERSON: TECHNOLOGY AND PRIVACY IN THE INFORMATION AGE (ISBN: 0814798462) (NYU Press 2004) explores the social, political, and legal implications of the collection and use of personal information in computer databases. In the Information Age, our lives are documented in digital dossiers maintained by hundreds (perhaps thousands) of businesses and government agencies. These dossiers are composed of bits of our personal information, which when assembled together begin to paint a portrait of our personalities. The dossiers are increasingly used to make decisions about our lives - whether we get a loan, a mortgage, a license, or a job; whether we are investigated or arrested; and whether we are permitted to fly on an airplane. Digital dossiers impact many aspects of our lives. For example, they increase our vulnerability to identity theft, a serious crime that has been escalating at an alarming rate. Moreover, since September 11th, the government has been tapping into vast stores of information collected by businesses and using it to profile people for criminal or terrorist activity. Do these developments pose a problem? Is it possible to protect privacy in a society where information flows so freely and proliferates so rapidly? THE DIGITAL PERSON seeks to answer these questions. This book explores the problem from all angles - how businesses gather personal information in massive databases; how the government increasingly provides this data to businesses through public records; and how the government is gathering personal data from businesses for its own uses. THE DIGITAL PERSON not only explores these problems, but also provides a compelling account of how we can respond to them. Using a wide variety of sources, including history, philosophy, and literature, Solove sets forth a new understanding of privacy, one that is appropriate for the new challenges of the Information Age. 
Solove recommends how the law can be reformed to simultaneously protect our privacy and allow us to enjoy the benefits of our increasingly digital world. The table of contents and Chapter 1 are available for download.

Proceedings ArticleDOI
10 Oct 2004
TL;DR: A novel way of combining sensor technology with traditional video surveillance in building a privacy protecting framework that exploits the strengths of these modalities and complements their individual limitations is proposed.
Abstract: Around the world, as both crime and technology become more prevalent, officials find themselves relying more and more on video surveillance as a cure-all in the name of public safety. Used properly, video cameras help expose wrongdoing but typically come at the cost of privacy to those not involved in any maleficent activity. What if we could design intelligent systems that are more selective in what video they capture, and focus on anomalous events while protecting the privacy of authorized personnel? This paper proposes a novel way of combining sensor technology with traditional video surveillance in building a privacy-protecting framework that exploits the strengths of these modalities and complements their individual limitations. Our fully functional system utilizes off-the-shelf sensor hardware (e.g., RFID, motion detection) for localization, and combines this with an XML-based policy framework for access control to determine violations within the space. This information is fused with video surveillance streams in order to make decisions about how to display the individuals being surveilled. To achieve this, we have implemented several video masking techniques that correspond to varying user privacy levels. These results were achievable in real-time at acceptable frame rates, while meeting our requirements for privacy preservation.

Journal Article
TL;DR: Managerial implications include the careful selection of communication channels for maximum impact, the maintenance of discrete “permission-based” contact with consumers, and accurate recording and handling of data.
Abstract: Many organizations now emphasize the use of technology that can help them get closer to consumers and build ongoing relationships with them. The ability to compile consumer data profiles has been made even easier with Internet technology. However, it is often assumed that consumers like to believe they can trust a company with their personal details. Lack of trust may cause consumers to have privacy concerns. Addressing such privacy concerns may therefore be crucial to creating stable and ultimately profitable customer relationships. Three specific privacy concerns that have been frequently identified as being of importance to consumers include unauthorized secondary use of data, invasion of privacy, and errors. Results of a survey study indicate that both errors and invasion of privacy have a significant inverse relationship with online purchase behavior. Unauthorized use of secondary data appears to have little impact. Managerial implications include the careful selection of communication channels for maximum impact, the maintenance of discrete “permission-based” contact with consumers, and accurate recording and handling of data.

Proceedings ArticleDOI
22 Aug 2004
TL;DR: This paper explores the issue of privacy breach in data mining by developing a framework under which this question can be addressed, and proposes metrics, along with analysis that those metrics are consistent in the face of apparent problems.
Abstract: Privacy-preserving data mining has concentrated on obtaining valid results when the input data is private. An extreme example is Secure Multiparty Computation-based methods, where only the results are revealed. However, this still leaves a potential privacy breach: Do the results themselves violate privacy? This paper explores this issue, developing a framework under which this question can be addressed. Metrics are proposed, along with analysis that those metrics are consistent in the face of apparent problems.

Journal ArticleDOI
01 Nov 2004
TL;DR: Data obfuscation addresses the dilemma of data privacy and data sharing conflict by extending several existing technologies and defining obfuscation properties that quantify the technologies' usefulness and privacy preservation.
Abstract: In some domains, the need for data privacy and data sharing conflict. Data obfuscation addresses this dilemma by extending several existing technologies and defining obfuscation properties that quantify the technologies' usefulness and privacy preservation.

Journal ArticleDOI
01 Nov 2004
TL;DR: This article shows how technology from the security community can change data mining for the better, providing all its benefits while still maintaining privacy.
Abstract: Data mining is under attack from privacy advocates because of a misunderstanding about what it actually is and a valid concern about how it is generally done. This article shows how technology from the security community can change data mining for the better, providing all its benefits while still maintaining privacy.

Journal ArticleDOI
TL;DR: A taxonomy of privacy requirements for Web sites is presented, using goal-mining, the extraction of pre-requirements goals from post-requirements text artefacts, to develop the taxonomy that can be used by Web site designers to reduce Web site privacy vulnerabilities and ensure that their stated and actual policies are consistent with each other.
Abstract: The increasing use of personal information on Web-based applications can result in unexpected disclosures. Consumers often have only the stated Web site policies as a guide to how their information is used, and thus on which to base their browsing and transaction decisions. However, each policy is different, and it is difficult—if not impossible—for the average user to compare and comprehend these policies. This paper presents a taxonomy of privacy requirements for Web sites. Using goal-mining, the extraction of pre-requirements goals from post-requirements text artefacts, we analysed an initial set of Internet privacy policies to develop the taxonomy. This taxonomy was then validated during a second goal extraction exercise, involving privacy policies from a range of health care related Web sites. This validation effort enabled further refinement to the taxonomy, culminating in two classes of privacy requirements: protection goals and vulnerabilities. Protection goals express the desired protection of consumer privacy rights, whereas vulnerabilities describe requirements that potentially threaten consumer privacy. The identified taxonomy categories are useful for analysing implicit internal conflicts within privacy policies, the corresponding Web sites, and their manner of operation. These categories can be used by Web site designers to reduce Web site privacy vulnerabilities and ensure that their stated and actual policies are consistent with each other. The same categories can be used by customers to evaluate and understand policies and their limitations. Additionally, the policies have potential use by third-party evaluators of site policies and conflicts.

Proceedings ArticleDOI
05 Jan 2004
TL;DR: Thirteen specific privacy issues are enumerated and discussed as examples of the challenges the authors face as these technologies and their associated products and services are deployed.
Abstract: Location awareness, the ability to determine geographical position, is an emerging technology with both significant benefits and important privacy implications for users of mobile devices such as cell phones and PDAs. Location is determined either internally by a device or externally by systems and networks with which the device interacts, and the resultant location information may be stored, used, and disclosed under various conditions that are described. Thirteen specific privacy issues are enumerated and discussed as examples of the challenges we face as these technologies and their associated products and services are deployed. Regulation by governments, standards organizations, industry groups, public interest groups, and marketplace forces is discussed, as it may help address these privacy issues.

Patent
24 Aug 2004
TL;DR: In this paper, a system and method of providing privacy through anonymity is described, where a person registers at a privacy server and is given a pseudo identity that can be used to browse, register, purchase, pay for, and take delivery of products and services.
Abstract: A system and method of providing privacy through anonymity is described. As one aspect of the invention, a person registers at a privacy server and is given a pseudo identity that can be used to browse, register, purchase, pay for, and take delivery of products and services. Transactions are completed with the privacy server on a need-to-know basis. A seller communicates with the privacy server but only sees a demand, not the identity of the buyer. The financial institution communicates with the privacy server and sees the payment, not the merchandise. The freight company communicates with the privacy server and sees the package, not its contents. The privacy server operates in a manner that assures privacy and anonymity for the buyer and, if necessary, for the seller as well.
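The need-to-know principle behind the patent can be sketched as a set of per-party projections of one transaction record. All field names, parties, and values here are hypothetical illustrations, not the patent's actual data model.

```python
# Hypothetical sketch of need-to-know disclosure: the privacy server
# holds the full transaction but reveals a different projection to each
# party, so no single counterparty sees identity plus activity.
ORDER = {
    "buyer_id":  "alice",           # real identity, known only to the server
    "pseudonym": "user-7731",
    "item":      "book",
    "price":     12.50,
    "address":   "1 Main St",
}

VIEWS = {                            # fields each party may see (assumed)
    "seller":  ("pseudonym", "item"),      # sees the demand, not the buyer
    "bank":    ("pseudonym", "price"),     # sees the payment, not the goods
    "shipper": ("pseudonym", "address"),   # sees the package, not its contents
}

def view_for(party):
    """Project the transaction down to the fields this party needs."""
    return {field: ORDER[field] for field in VIEWS[party]}

print(view_for("bank"))
```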

Journal ArticleDOI
01 Nov 2004
TL;DR: This paper argues for what it calls labeling protocols, technical mechanisms through which users can be informed of data requests and their consequences, and examines the P3P lessons and open issues with an eye to pervasive requirements.
Abstract: In pervasive environments, privacy is likely to be a major issue for users, and users will want to be notified of potential data capture. To provide notice to users, this paper argues for what it calls labeling protocols, technical mechanisms through which users can be informed of data requests and their consequences. Recent experiences with the Platform for Privacy Preferences Project (P3P), an attempt to provide privacy mechanisms for the Web, suggest important lessons for the design of a next generation labeling protocol that will be usable and useful in pervasive environments. This paper examines the P3P lessons and open issues with an eye to pervasive requirements.

Book ChapterDOI
29 Mar 2004
TL;DR: It is demonstrated how transactions made under different pseudonyms can be linked, and how careful disclosure of such links achieves the right trade-off between trust and privacy by ensuring minimal trade of privacy for the required trust.
Abstract: Both privacy and trust relate to knowledge about an entity. However, there is an inherent conflict between trust and privacy: the more knowledge a first entity knows about a second entity, the more accurate should be the trustworthiness assessment; the more knowledge is known about this second entity, the less privacy is left to this entity. This conflict needs to be addressed because both trust and privacy are essential elements for a smart working world. The solution should allow the benefit of adjunct trust when entities interact without too much privacy loss. We propose to achieve the right trade-off between trust and privacy by ensuring minimal trade of privacy for the required trust. We demonstrate how transactions made under different pseudonyms can be linked and careful disclosure of such links fulfils this right trade-off.

Proceedings Article
01 Jan 2004
TL;DR: The results indicated that the technological assurance mechanism played the most important role in assuring consumers’ perceived control over personal information and benefits the privacy and human-computer interaction research in the Information Systems discipline.
Abstract: Location-based services (LBS), enabled by advances in mobile and positioning technologies, have afforded users pervasive flexibility to be uniquely addressable and to access networks and services on the move. However, because LBS could also associate lifestyle habits, behaviors, and movements with a consumer's personal identity, privacy concerns are particularly salient for LBS. Drawing on psychological control and privacy literature, we designed an experimental study to test the basic proposition that the assurance of consumers' perceived control over their personal information has a considerable influence on alleviating their privacy concerns. Three different mechanisms of assurance of control—technology, industry self-regulation, and legislation—were manipulated in the experiment, and their effects on consumers' privacy concerns were examined. The results indicated that the technological assurance mechanism (i.e., the mobile device in this study) played the most important role in assuring consumers' perceived control over personal information. The marriage of the privacy and psychological control literature streams provides a rich understanding of consumers' privacy reactions to LBS usage and, therefore, benefits the privacy and human-computer interaction (HCI) research in the Information Systems discipline.

Journal ArticleDOI
TL;DR: This work proposes an authorized-anonymous-ID-based scheme, which effectively eliminates the need for a trusted server or administration, and designs an architecture capable of achieving complete personal control over location privacy while maintaining the authentication function required by the administration.
Abstract: How to protect location privacy of mobile users is an important issue in ubiquitous computing. However, location privacy protection is particularly challenging: on one hand, the administration requires all legitimate users to provide identity information in order to grant them permission to use its wireless service; on the other hand, mobile users would prefer not to expose any information that could enable anyone, including the administration, to get some clue regarding their whereabouts; mobile users would like to have complete personal control of their location privacy. To address this issue, we propose an authorized-anonymous-ID-based scheme; this scheme effectively eliminates the need for a trusted server or administration, which is assumed in the previous work. Our key weapon is a cryptographic technique called blind signature, which is used to generate an authorized anonymous ID that replaces the real ID of an authorized mobile device. With authorized anonymous IDs, we design an architecture capable of achieving complete personal control over location privacy while maintaining the authentication function required by the administration.
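The blind-signature mechanism at the heart of this scheme can be demonstrated with textbook RSA and deliberately tiny parameters (an illustration only, not the paper's exact protocol or secure key sizes): the device blinds its chosen anonymous ID, the server signs without seeing it, and the device unblinds to obtain a valid, unlinkable authorization.

```python
from math import gcd

# Textbook RSA blind signature with toy parameters -- for illustration
# only.  The server authorizes an anonymous ID it never sees.
p, q = 61, 53
n, e = p * q, 17                      # server's public key (n, e)
d = pow(e, -1, (p - 1) * (q - 1))     # server's private exponent

anon_id = 65                          # device's anonymous ID, as an integer
r = 19                                # blinding factor; must satisfy gcd(r, n) == 1
assert gcd(r, n) == 1

blinded = (anon_id * pow(r, e, n)) % n        # device -> server (ID hidden)
blind_sig = pow(blinded, d, n)                # server signs blindly
sig = (blind_sig * pow(r, -1, n)) % n         # device removes the blinding

# Anyone can verify the signature against the public key, yet the server
# cannot link `sig` back to the blinded message it actually signed.
print(pow(sig, e, n) == anon_id)
```

Unblinding works because `blind_sig = anon_id^d * r mod n`, so multiplying by `r^{-1}` leaves exactly the ordinary RSA signature on `anon_id`.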

Proceedings ArticleDOI
14 Mar 2004
TL;DR: A practical algorithm is presented, concentrating on those aspects that make refinement of privacy policies more difficult than, for example refinement for access control policies, such as a more sophisticated treatment of deny rules and a suitable way for dealing with obligations and conditions on context information.
Abstract: Enterprise privacy policies often reflect different legal regulations, promises made to customers, and more restrictive enterprise-internal practices. The notion of policy refinement is fundamental for privacy policies: it allows one to check whether a company's policy fulfills regulations or adheres to standards set by customer organizations, to realize the "sticky policy paradigm" for transferring data from one realm to another in a privacy-preserving way, and much more. Although well established in theory, the problem of how to efficiently check whether one policy refines another has been left open in the privacy policy literature. We present a practical algorithm for this task, concentrating on the aspects that make refinement of privacy policies more difficult than, for example, refinement of access control policies, such as a more sophisticated treatment of deny rules and a suitable way of dealing with obligations and conditions on context information.
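The core idea of a refinement check can be sketched in a deliberately simplified model: each policy maps (data category, purpose) pairs to "allow" or "deny" with an explicit default, and deny rules take precedence. The rule shapes and the exhaustive enumeration below are assumptions for illustration, not the paper's algorithm, which additionally handles obligations and context conditions.

```python
# Minimal refinement check: p1 refines p2 iff p1 is at least as restrictive,
# i.e., wherever p1 allows a request, p2 must allow it too.
from itertools import product

def decision(policy, data, purpose):
    """Explicit rules override the policy's default decision."""
    return policy["rules"].get((data, purpose), policy["default"])

def refines(p1, p2, data_cats, purposes):
    """Check every possible request against both policies."""
    for data, purpose in product(data_cats, purposes):
        if decision(p1, data, purpose) == "allow" and \
           decision(p2, data, purpose) == "deny":
            return False
    return True

coarse = {"rules": {("email", "marketing"): "deny"}, "default": "allow"}
strict = {"rules": {("email", "marketing"): "deny",
                    ("location", "profiling"): "deny"}, "default": "allow"}

data_cats, purposes = ["email", "location"], ["marketing", "profiling"]
print(refines(strict, coarse, data_cats, purposes))  # True: strict only denies more
print(refines(coarse, strict, data_cats, purposes))  # False: coarse allows profiling
```

Enumerating the full request space is exponential in general, which hints at why an efficient algorithm, rather than brute force, is the paper's contribution.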

Journal ArticleDOI
TL;DR: A functional comparison between current privacy law in the European Union and in the United States is presented, leading to the conclusion that the right to privacy is more strictly protected in the EU than in the U.S.

Proceedings ArticleDOI
15 Oct 2004
TL;DR: The architecture of a test-bed under development for secure sharing, capture, distributed processing, and archiving of surveillance data, called the Networked Sensor Tapestry (NeST), is detailed; it consists of core software modules including a centralized server, a client interface library, and a layered XML messaging scheme.
Abstract: This paper details the architecture of a test-bed under development for secure sharing, capture, distributed processing, and archiving of surveillance data, called the Networked Sensor Tapestry (NeST). The test-bed consists of core software modules including a centralized server, a client interface library, and a layered XML messaging scheme. Mobile hardware clients are interfaced to the NeST using a TinyOS-based microcontroller, with sensor data collected over a 1-Wire data bus. Maintaining subject privacy in video and other sensor-monitoring scenarios can be imperative for the successful deployment of surveillance networks. Subject privacy is integrated into the architecture and can, if desired, operate as a buffer to the server core, denying any or all modules or operators access to identity-specific information. We introduce three fundamental privacy concepts, chief among them the privacy buffer: a core component of the NeST server that applies programmable plug-in privacy filters to incoming sensor data, blocking access to the data or transforming it to remove personally identifiable information. These privacy filters are developed and specified using a privacy grammar that can connect multiple low-level data filters and features to create arbitrary data-dependent privacy definitions. The utility of the architecture is demonstrated with connections to a variety of hardware and software clients, including PDA-based client hardware, remote sensor interface devices, and software modules for sensor-data inferencing, data visualization, sensor control, and data archiving.
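The privacy-buffer idea of chaining plug-in filters over incoming sensor records can be sketched as below. The record fields and the two filters are hypothetical examples, not NeST's actual filter set or grammar.

```python
# Sketch of a privacy buffer that runs an ordered chain of plug-in filters
# over every sensor record before the server core sees it. A filter may
# redact fields, transform values, or drop the record entirely (return None).
class PrivacyBuffer:
    def __init__(self, filters):
        self.filters = filters

    def process(self, record):
        for f in self.filters:
            record = f(record)
            if record is None:        # a filter denied the record outright
                return None
        return record

def strip_identity(record):
    # Remove personally identifiable fields before any module can see them.
    return {k: v for k, v in record.items() if k not in ("face_id", "name")}

def coarsen_location(record):
    # Reduce location precision so individual movements cannot be traced.
    if "location" in record:
        lat, lon = record["location"]
        record["location"] = (round(lat, 1), round(lon, 1))
    return record

buf = PrivacyBuffer([strip_identity, coarsen_location])
out = buf.process({"name": "alice", "face_id": 7,
                   "location": (40.4433, -79.9436), "motion": True})
print(out)  # {'location': (40.4, -79.9), 'motion': True}
```

Because each filter is an independent callable, arbitrary data-dependent policies can be composed by reordering or swapping filters, mirroring the role the paper assigns to its privacy grammar.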

Book ChapterDOI
TL;DR: A methodology for identifying, assessing, and comparing location privacy risks in mobile computing technologies is presented and it is argued that these are best addressed through novel anonymity-based mechanisms.
Abstract: Mobile computing enables users to compute and communicate almost regardless of their current location. However, as a side effect, this technology has considerably increased the surveillance potential over user movements. Current research addresses location privacy in a piecemeal fashion rather than comprehensively. This paper therefore presents a methodology for identifying, assessing, and comparing location privacy risks in mobile computing technologies. In a case study, we apply the approach to IEEE 802.11b wireless LAN networks and location-based services, where it reveals significant location privacy concerns arising from link- and application-layer information. From a technological perspective, we argue that these are best addressed through novel anonymity-based mechanisms.