Author

Jörg Schreck

Bio: Jörg Schreck is an academic researcher from the Center for Information Technology. The author has contributed to research in topics: Personally identifiable information & Personalization. The author has an h-index of 2 and has co-authored 2 publications receiving 204 citations.

Papers
Journal ArticleDOI
TL;DR: Security requirements to guarantee privacy in user-adaptive systems are discussed and ways to keep users anonymous while fully preserving personalized interaction with them are explored.
Abstract: User-adaptive applications cater to the needs of each individual computer user, taking for example users' interests, level of expertise, preferences, perceptual and motoric abilities, and the usage environment into account. Central user modeling servers collect and process the information about users that different user-adaptive systems require to personalize their user interaction. Adaptive systems are generally better able to cater to users the more data their user modeling systems collect and process about them. They therefore gather as much data as possible and "lay them in stock" for possible future usage. Moreover, data collection usually takes place without users' initiative and sometimes even without their awareness, in order not to cause distraction. Both are in conflict with users' privacy concerns, which became manifest in numerous recent consumer polls, and with data protection laws and guidelines that call for parsimony, purpose-orientation, and user notification or user consent when personal data are collected and processed. This article discusses security requirements to guarantee privacy in user-adaptive systems and explores ways to keep users anonymous while fully preserving personalized interaction with them. User anonymization in personalized systems goes beyond current models in that not only users must remain anonymous, but also the user modeling system that maintains their personal data. Moreover, users' trust in anonymity can be expected to lead to more extensive and frank interaction, hence to more and better data about the user, and thus to better personalization. A reference model for pseudonymous and secure user modeling is presented that meets many of the proposed requirements.
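The pseudonymity idea in this abstract can be pictured with a minimal sketch; the class and method names below are hypothetical and do not reproduce the paper's reference model. The client keeps the only link between the real identity and a random pseudonym, so the user modeling server stores profile data without ever learning who it belongs to.

```python
# Minimal sketch, not the paper's reference model: the server only ever sees
# pseudonymous profile data; the identity-to-pseudonym link stays on the client.
import secrets


class PseudonymousClient:
    """Runs on the user's side and holds the identity-to-pseudonym mapping."""

    def __init__(self):
        self.pseudonym = secrets.token_hex(16)  # random, unlinkable identifier

    def report(self, server, attribute, value):
        # Only the pseudonym and the modeled attribute leave the client.
        server.update_profile(self.pseudonym, attribute, value)


class UserModelingServer:
    """Stores profiles keyed by pseudonym only; never sees real identities."""

    def __init__(self):
        self.profiles = {}

    def update_profile(self, pseudonym, attribute, value):
        self.profiles.setdefault(pseudonym, {})[attribute] = value

    def get_profile(self, pseudonym):
        return self.profiles.get(pseudonym, {})


server = UserModelingServer()
client = PseudonymousClient()
client.report(server, "expertise", "novice")
print(server.get_profile(client.pseudonym))  # {'expertise': 'novice'}
```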

168 citations

Book ChapterDOI
27 May 1997
TL;DR: Describes the differing needs detected among AVANTI users, the kinds of adaptations currently implemented to cater to these needs, and the system architecture that enables AVANTI to generate user-adapted web pages from distributed multimedia databases.
Abstract: Users of publicly accessible information systems are generally heterogeneous and have different needs. The aim of the AVANTI project is to cater to these individual needs by adapting the user interface and the content and presentation of WWW pages to each individual user. The special needs of elderly and handicapped users are also partly considered. A model of the characteristics of user groups and individual users, a model of the usage characteristics of the system, and a domain model are exploited in the adaptation process. This paper describes the detected differing needs of AVANTI users, the kind of adaptations that are currently implemented to cater to these needs, and the system architecture that enables AVANTI to generate user-adapted web pages from distributed multimedia databases. Special attention is given to privacy and security issues which are crucial when personal information about users is at stake.
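The adaptation step described above can be illustrated with a small sketch; the trait names and presentation settings are assumptions for illustration, not AVANTI's actual models. A simple user model drives the settings chosen for a generated page.

```python
# Illustrative only, not the AVANTI implementation: a user model selects
# presentation settings for a generated page.
def adapt_page(user_model: dict) -> dict:
    settings = {"font_size": "medium", "detail_level": "standard", "describe_images": False}

    if user_model.get("vision_impaired"):
        settings["font_size"] = "large"
        settings["describe_images"] = True   # textual descriptions of images
    if user_model.get("expertise") == "novice":
        settings["detail_level"] = "introductory"
    elif user_model.get("expertise") == "expert":
        settings["detail_level"] = "technical"
    return settings


print(adapt_page({"vision_impaired": True, "expertise": "novice"}))
# {'font_size': 'large', 'detail_level': 'introductory', 'describe_images': True}
```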

41 citations


Cited by
Journal ArticleDOI
TL;DR: Provides an interdisciplinary review of privacy-related research to enable a more cohesive treatment, and recommends that researchers be alert to an overarching macro model referred to as APCO (Antecedents → Privacy Concerns → Outcomes).
Abstract: To date, many important threads of information privacy research have developed, but these threads have not been woven together into a cohesive fabric. This paper provides an interdisciplinary review of privacy-related research in order to enable a more cohesive treatment. With a sample of 320 privacy articles and 128 books and book sections, we classify previous literature in two ways: (1) using an ethics-based nomenclature of normative, purely descriptive, and empirically descriptive, and (2) based on their level of analysis: individual, group, organizational, and societal. Based upon our analyses via these two classification approaches, we identify three major areas in which previous research contributions reside: the conceptualization of information privacy, the relationship between information privacy and other constructs, and the contextual nature of these relationships. As we consider these major areas, we draw three overarching conclusions. First, there are many theoretical developments in the body of normative and purely descriptive studies that have not been addressed in empirical research on privacy. Rigorous studies that either trace processes associated with, or test implied assertions from, these value-laden arguments could add great value. Second, some of the levels of analysis have received less attention in certain contexts than have others in the research to date. Future empirical studies, both positivist and interpretive, could profitably be targeted to these under-researched levels of analysis. Third, positivist empirical studies will add the greatest value if they focus on antecedents to privacy concerns and on actual outcomes. In that light, we recommend that researchers be alert to an overarching macro model that we term APCO (Antecedents → Privacy Concerns → Outcomes).

1,595 citations

Patent
14 Sep 2010
TL;DR: Proposes an improved human-computer interface system in which a user characteristic or set of characteristics, such as demographic profile or societal role, is employed to define a scope or domain of operation; user privacy and anonymity are maintained by physical and algorithmic controls over access to the personal profiles and by releasing only aggregate data without personally identifying information or data on small groups.
Abstract: An improved human user computer interface system, wherein a user characteristic or set of characteristics, such as demographic profile or societal “role”, is employed to define a scope or domain of operation. The operation itself may be a database search, an interactive definition of a taxonomic context for the operation, a business negotiation, or another activity. After retrieval of results, a scoring or ranking may be applied according to user-defined criteria, which are, for example, commensurate with relevance to the context, but may also be by date, source, or other secondary criteria. A user profile is preferably stored in a computer-accessible form and may be used to provide a history of use, persistent customization, collaborative filtering, and demographic information for the user. Advantageously, user privacy and anonymity are maintained by physical and algorithmic controls over access to the personal profiles, and by releasing only aggregate data without personally identifying information or data on small groups.
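One concrete way to read the "aggregate data only, no small groups" control mentioned in the abstract is a minimum-group-size suppression rule; the threshold and field name below are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch: release only per-group counts, and drop groups that are too
# small to publish without risking re-identification.
from collections import Counter

MIN_GROUP_SIZE = 5  # assumed suppression threshold


def aggregate_release(profiles, field):
    """Count profiles per value of `field` and keep only groups that meet
    the minimum size required for release."""
    counts = Counter(p[field] for p in profiles if field in p)
    return {value: n for value, n in counts.items() if n >= MIN_GROUP_SIZE}


profiles = [{"age_band": "30-39"}] * 7 + [{"age_band": "80-89"}] * 2
print(aggregate_release(profiles, "age_band"))  # the small '80-89' group is suppressed
```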

1,465 citations

01 Sep 1996
TL;DR: The objectives of the European Community, as laid down in the Treaty, as amended by the Treaty on European Union, include creating an ever closer union among the peoples of Europe, fostering closer relations between the States belonging to the Community, ensuring economic and social progress by common action to eliminate the barriers which divide Europe, encouraging the constant improvement of the living conditions of its peoples, preserving and strengthening peace and liberty and promoting democracy on the basis of the fundamental rights recognized in the constitution and laws of the Member States and in the European Convention for the Protection of Human Rights and Fundamental Freedoms
Abstract: (1) Whereas the objectives of the Community, as laid down in the Treaty, as amended by the Treaty on European Union, include creating an ever closer union among the peoples of Europe, fostering closer relations between the States belonging to the Community, ensuring economic and social progress by common action to eliminate the barriers which divide Europe, encouraging the constant improvement of the living conditions of its peoples, preserving and strengthening peace and liberty and promoting democracy on the basis of the fundamental rights recognized in the constitution and laws of the Member States and in the European Convention for the Protection of Human Rights and Fundamental Freedoms;

792 citations

Journal ArticleDOI
TL;DR: A review of the development of generic user modeling systems over the past twenty years is given in this article, which describes their purposes, their services within user-adaptive systems, and the different design requirements for research prototypes and commercially deployed servers.
Abstract: The paper reviews the development of generic user modeling systems over the past twenty years. It describes their purposes, their services within user-adaptive systems, and the different design requirements for research prototypes and commercially deployed servers. It discusses the architectures that have been explored so far, namely shell systems that form part of the application, central server systems that communicate with several applications, and possible future user modeling agents that physically follow the user. Several implemented research prototypes and commercial systems are briefly described.
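The central-server architecture contrasted in this review can be sketched minimally as one shared user model that several applications write to and read from; the tell/ask interface below is an illustrative assumption, not the API of any particular system described in the paper.

```python
# Minimal sketch of a central user modeling server shared by several applications.
class CentralUserModel:
    def __init__(self):
        self.store = {}  # user_id -> {attribute: value}

    def tell(self, user_id, attribute, value):
        self.store.setdefault(user_id, {})[attribute] = value

    def ask(self, user_id, attribute, default=None):
        return self.store.get(user_id, {}).get(attribute, default)


shared_model = CentralUserModel()

# A news application records an observed interest...
shared_model.tell("u42", "interest", "astronomy")

# ...and a separate shopping application reuses it for its own personalization.
print(shared_model.ask("u42", "interest"))  # 'astronomy'
```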

711 citations

Journal ArticleDOI
TL;DR: The paper uses a three-layer model of user privacy concerns to relate them to system operations and examine their effects on user behavior, and develops guidelines for building privacy-friendly systems.
Abstract: In this paper we integrate insights from diverse islands of research on electronic privacy to offer a holistic view of privacy engineering and a systematic structure for the discipline's topics. First we discuss privacy requirements grounded in both historic and contemporary perspectives on privacy. We use a three-layer model of user privacy concerns to relate them to system operations (data transfer, storage and processing) and examine their effects on user behavior. In the second part of the paper we develop guidelines for building privacy-friendly systems. We distinguish two approaches: "privacy-by-policy" and "privacy-by-architecture." The privacy-by-policy approach focuses on the implementation of the notice and choice principles of fair information practices (FIPs), while the privacy-by-architecture approach minimizes the collection of identifiable personal data and emphasizes anonymization and client-side data storage and processing. We discuss both approaches with a view to their technical overlaps and boundaries as well as to economic feasibility. The paper aims to introduce engineers and computer scientists to the privacy research domain and provide concrete guidance on how to design privacy-friendly systems.
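A minimal, hedged contrast of the two approaches named in the abstract (the function names and data fields are assumptions, not the paper's code): privacy-by-policy lets identifiable data leave the client only under notice and consent, while privacy-by-architecture processes data on the client and releases only coarse, non-identifying results.

```python
# Illustrative sketch of the privacy-by-policy vs. privacy-by-architecture contrast.
def privacy_by_policy(profile, user_consented):
    """Notice-and-choice style: identifiable data may leave the client,
    but only under a stated policy and with the user's consent."""
    return profile if user_consented else None


def privacy_by_architecture(profile):
    """Data-minimization style: process on the client and release only
    coarse, non-identifying attributes."""
    decade = (profile["age"] // 10) * 10
    return {"age_band": f"{decade}-{decade + 9}",
            "top_interest": profile["interests"][0]}


profile = {"name": "Alice", "age": 34, "interests": ["privacy", "hiking"]}
print(privacy_by_policy(profile, user_consented=True))   # full profile leaves the client
print(privacy_by_architecture(profile))                  # only coarse data leaves the client
```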

420 citations