Author
Thomas Marquenie
Bio: Thomas Marquenie is an academic researcher at Katholieke Universiteit Leuven. The author has contributed to research in the topics of the Data Protection Act 1998 and the Directive on Privacy and Electronic Communications. The author has an h-index of 2 and has co-authored 2 publications receiving 14 citations.
Papers
[...]
TL;DR: While a considerable improvement and major step forward for the protection of personal data in its field, the Directive is unlikely to mend the fragmented legal framework and achieve the intended high level of data protection standards consistent across European Union member states.
Abstract: This article presents a two-sided analysis of the recently adopted Police and Criminal Justice Authorities Directive. First, it examines the impact of the Directive on the current legal framework and considers to what extent it is capable of overcoming existing obstacles to a consistent and comprehensive data protection scheme in the area of police and criminal justice. Second, it delivers a brief outline and review of the provisions of the Directive itself and explores whether the instrument improves upon the current legislation and sets out adequate data protection rules and standards. Analyzing the Directive from these angles, this article finds that while a considerable improvement and major step forward for the protection of personal data in its field, the Directive is unlikely to mend the fragmented legal framework and achieve the intended high level of data protection standards consistent across European Union member states.
8 citations
[...]
TL;DR: It is argued that while the law regulates both algorithms and their discriminatory effects, the framework is insufficient in addressing the complex interactions that must take place between system developers, users, oversight and profiled individuals to fully guarantee algorithmic transparency and accountability.
Abstract: In the hopes of making law enforcement more effective and efficient, police and intelligence analysts are increasingly relying on algorithms underpinning technology-based and data-driven policing. To achieve these objectives, algorithms must also be accurate, unbiased and just. In this paper, we examine how European data protection law regulates automated profiling and how this regulation impacts police and intelligence algorithms and algorithmic discrimination. In particular, we assess to what extent the regulatory frameworks address the challenges of algorithmic transparency and accountability. We argue that while the law regulates both algorithms and their discriminatory effects, the framework is insufficient in addressing the complex interactions that must take place between system developers, users, oversight and profiled individuals to fully guarantee algorithmic transparency and accountability.
6 citations
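To make the notion of measurable "discriminatory effects" concrete, below is a minimal sketch, not drawn from the paper itself, of the kind of disparate-impact check a system developer or oversight body could run on a profiling algorithm's outputs. The group labels, the toy data and the 80% rule of thumb are assumptions for illustration only.

```python
# Illustrative sketch: a simple disparate-impact check on profiling outputs.
# Group labels, toy data and the 0.8 threshold are assumed, not from the paper.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, flagged) pairs; returns per-group rate."""
    totals, flagged = defaultdict(int), defaultdict(int)
    for group, hit in decisions:
        totals[group] += 1
        flagged[group] += int(hit)
    return {g: flagged[g] / totals[g] for g in totals}

def disparate_impact_ratio(decisions, protected, reference):
    """Selection rate of the protected group over that of the reference
    group; values below roughly 0.8 are a common warning sign."""
    rates = selection_rates(decisions)
    return rates[protected] / rates[reference]

# Toy example: (group, was this person flagged by the profiling system?)
sample = [("A", True), ("A", False), ("A", False), ("A", False),
          ("B", True), ("B", True), ("B", False), ("B", False)]
print(disparate_impact_ratio(sample, protected="A", reference="B"))  # 0.5
```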
Cited by
Proceedings Article
[...]
TL;DR: It is argued that providing information regarding how AGSs work can enhance users’ trust only when users have enough time and ability to process and understand the information, and that providing excessively detailed information may even reduce users’ perceived understanding of AGSs and thus hurt users’ trust.
Abstract: Users’ adoptions of online-shopping advice-giving systems (AGSs) are crucial for e-commerce websites to attract users and increase profits. Users’ trust in AGSs influences them to adopt AGSs. While previous studies have demonstrated that AGS transparency increases users’ trust by enhancing users’ understanding of AGSs’ reasoning, hardly any attention has been paid to the possible inconsistency between the level of AGS transparency and the extent to which users feel they understand the logic of AGSs’ inner workings. We argue that the relationship between them may not always be positive. Specifically, we posit that providing information regarding how AGSs work can enhance users’ trust only when users have enough time and ability to process and understand the information. Moreover, providing excessively detailed information may even reduce users’ perceived understanding of AGSs, and thus hurt users’ trust. In this research, we will use a lab experiment to explore how providing information with different levels of detail will influence users’ perceived understanding of and trust in AGSs. Our study would contribute to the literature by exploring the potential inverted U-shaped relationship among AGS transparency, users’ perceived understanding of AGSs, and users’ trust in AGSs, and contribute to practice by offering suggestions for designing trustworthy AGSs.
7 citations
[...]
TL;DR: It is discovered that GA has unusual permission requirements and sensitive Application Programming Interface (API) usage, and that its privacy requirements are not transparent to smartphone users, which makes the risk assessment and accountability of GA difficult, posing risks to establishing private and secure personal spaces in a smart city.
Abstract: Smart Assistants have rapidly emerged in smartphones, vehicles, and many smart home devices. Establishing comfortable personal spaces in smart cities requires that these smart assistants are transparent in design and implementation, a fundamental trait required for their validation and accountability. In this article, we take the case of Google Assistant (GA), a state-of-the-art smart assistant, and perform a diagnostic analysis of it from the transparency and accountability perspectives. We compare our discoveries from the analysis of GA with those of four leading smart assistants. We use two online user studies (N = 100 and N = 210) conducted with students from four universities in three countries (China, Italy, and Pakistan) to learn whether risk communication in GA is transparent to its potential users and how it affects them. Our research discovered that GA has unusual permission requirements and sensitive Application Programming Interface (API) usage, and that its privacy requirements are not transparent to smartphone users. The findings suggest that this lack of transparency makes the risk assessment and accountability of GA difficult, posing risks to establishing private and secure personal spaces in a smart city. Following the separation of concerns principle, we suggest that autonomous bodies should develop standards for the design and development of smart city products and services.
6 citations
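As a rough illustration of the kind of permission diagnosis the article describes, the sketch below enumerates the permissions declared in a decoded Android manifest. The file path and the choice of "sensitive" permissions are hypothetical; this is not the authors' actual methodology.

```python
# Illustrative sketch: list the permissions an Android app declares, a first
# step in a transparency diagnosis. Paths and "sensitive" picks are assumed.
import xml.etree.ElementTree as ET

# Attribute namespace used by Android manifests.
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

def declared_permissions(manifest_path):
    """Return the android:name of every <uses-permission> element."""
    root = ET.parse(manifest_path).getroot()
    return [el.get(ANDROID_NS + "name") for el in root.iter("uses-permission")]

# Hypothetical usage on a manifest extracted from a decoded APK:
# perms = declared_permissions("decoded_apk/AndroidManifest.xml")
# flagged = [p for p in perms if p.endswith(("RECORD_AUDIO", "READ_CONTACTS"))]
```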
[...]
01 Jan 2014
TL;DR: In this article, the authors identify data protection shortcomings in the inter-agency cooperation regime in the EU criminal justice and law enforcement area and outline, under six possible scenarios, the interplay among the data protection legal instruments in the law-making process today in the field, as well as the response each could provide to such shortcomings.
Abstract: This study aims, first, at identifying data protection shortcomings in the inter-agency cooperation regime in the EU criminal justice and law enforcement area and, second, at outlining, under six possible scenarios, the interplay among the data protection legal instruments in the law-making process today in the field, as well as the response each could provide to such shortcomings.
4 citations
[...]
01 Jan 2019
TL;DR: It is argued that instead of setting a uniform rule of providing AGS transparency, optimal transparency provision strategies for different types of AGSs and users based on their unique features should be developed.
Abstract: Advice-giving systems (AGSs) provide recommendations based on users’ unique preferences or needs. Maximizing users’ adoptions of AGSs is an effective way for e-commerce websites to attract users and increase profits. AGS transparency, defined as the extent to which information about a system’s reasoning is provided and made available to users, has been proven effective in increasing users’ adoptions of AGSs. While previous studies have identified providing explanations as an effective way of enhancing AGS transparency, most of them failed to further explore the optimal transparency provision strategy of AGSs. We argue that instead of setting a uniform rule for providing AGS transparency, we should develop optimal transparency provision strategies for different types of AGSs and users based on their unique features. In this paper, we first developed a framework of AGS transparency provision and identified six components of AGS transparency provision strategies. We then developed a research model of AGS transparency provision strategy with a set of propositions. We hope that, based on this model, researchers could evaluate how to provide transparency for AGSs and users with different characteristics. Our work would contribute to existing knowledge by exploring how AGS and user characteristics will influence the optimal strategy of providing AGS transparency. Our work would also contribute to practice by offering design suggestions for AGS explanation interfaces.
4 citations
[...]
TL;DR: This study answers the question of how a PIA should be carried out for large-scale digital forensic operations, describes the resulting privacy risks and threats, and articulates concrete privacy measures to demonstrate compliance with the Police Directive.
Abstract: The large increase in the collection of location, communication, health and other data from seized digital devices such as mobile phones, tablets, IoT devices and laptops often poses serious privacy risks. To measure privacy risks, privacy impact assessments (PIA) are substantially useful tools, and the Directive EU 2016/680 (Police Directive) requires their use. While much has been said about PIA methods pursuant to the Regulation EU 2016/679 (GDPR), less has been said about PIA methods pursuant to the Police Directive. Moreover, little research has been done to explore and measure privacy risks that are specific to law enforcement activities, which necessitate the processing of large amounts of data. This study tries to fill this gap by conducting a PIA on a big data forensic platform as a case study. It also answers the question of how a PIA should be carried out for large-scale digital forensic operations and describes the privacy risks and threats we learned about from conducting it. Finally, it articulates concrete privacy measures to demonstrate compliance with the Police Directive.
3 citations
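To give a flavour of what the output of such an assessment can look like, here is a minimal likelihood-times-impact risk register in Python. The threats and the 1-5 scales are invented for illustration and are not findings from the study.

```python
# Illustrative sketch: a toy PIA risk register scored as likelihood x impact.
# Threats and scales are assumed examples, not results from the case study.
RISKS = [
    # (threat, likelihood 1-5, impact 1-5)
    ("Re-identification from linked location data", 4, 5),
    ("Excessive retention of seized-device images", 3, 4),
    ("Analyst access beyond need-to-know", 2, 5),
]

def risk_register(risks, high=15):
    """Print threats sorted by score, flagging those at or above `high`."""
    for threat, likelihood, impact in sorted(
            risks, key=lambda r: r[1] * r[2], reverse=True):
        score = likelihood * impact
        flag = "HIGH" if score >= high else "ok"
        print(f"{score:>2}  [{flag:>4}]  {threat}")

risk_register(RISKS)  # highest-scoring threats first
```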