Bio: Juraj Sajfert is an academic researcher whose work focuses on criminal justice and crime prevention. The author has an h-index of 1, having co-authored 1 publication that has received 5 citations.
TL;DR: The authors present a general overview of the LED's structure and scope, briefly introduce its ten chapters, and focus on automated individual decision-making (Article 11), the indirect exercise of data subject rights (Article 17), the obligation for competent authorities to keep logs (Article 25), and the provisions on international transfers (Articles 35-39).
Abstract: In May 2018, the EU Data Protection Reform became applicable. The General Data Protection Regulation (EU) 2016/679 (GDPR), which repealed Directive 95/46/EC on 25 May 2018, attracted particular attention, despite closely resembling its predecessor. Alongside the GDPR, the Reform encompassed Directive (EU) 2016/680, establishing rules for the protection of individuals with regard to the processing of personal data by competent authorities for law enforcement purposes (LED). The LED made significant advances in areas that had previously not been regulated by EU law. It therefore constitutes a major step towards a comprehensive EU data protection regime, as the first horizontal and legally binding instrument laying down rules for national and cross-border processing of personal data in the area of law enforcement. Although the LED received far less attention than the GDPR, two of the Directive's main objectives are too important to be neglected: the increased level of fundamental rights protection in the area of police and criminal justice, and the improved sharing of personal data between Member States, which will be able to rely on uniform data protection rules. The LED is a modern instrument designed for the processing of personal data by Law Enforcement Authorities (LEAs) in the digital age, and instruments of this kind are rapidly gaining in importance and visibility. Firstly, an increasing number of criminal acts are committed online or with the help of online tools, and perpetrators leave digital traces that may support LEAs in their tasks of crime prevention, detection, investigation and prosecution. Secondly, as perpetrators become more tech-savvy, LEAs have turned to new investigative techniques, including big data analytics.
The term big data police technologies may cover predictive systems that identify people or places suspected of crime, surveillance systems that monitor at-risk areas, and search systems that mine data for investigative clues or develop intelligence nets of helpful data for groups or across communities. Thirdly, EU rules on the processing of personal data by LEAs are undergoing consolidation, with the LED acting as a locomotive. The rules of the LED, which had to be transposed into the national laws of all 28 Member States and the four Schengen-associated States (Norway, Iceland, Switzerland and Liechtenstein) by 6 May 2018, benefited from the attention given to the GDPR, as some of the Regulation’s solutions could simply be taken over. However, a number of provisions were developed specifically for the LED. This Article gives a general overview of the LED's structure and scope and briefly presents its ten chapters. The second part of the Article addresses four distinct features enshrined in the LED, focusing on automated individual decision-making (Article 11), the indirect exercise of data subject rights (Article 17), the obligation for competent authorities to keep logs (Article 25) and the provisions on international transfers (Articles 35-39). It thereby critically assesses these articles and points to both similarities and differences with regard to the GDPR.
01 Feb 2021-Technology in Society
TL;DR: A comparison of travelers' and border authorities' views on the deployment of biometric technologies in border management, aimed at understanding the concerns of travelers and border guards in order to facilitate the acceptance of biometric technologies for a secure and more convenient border crossing.
Abstract: Advances in technology have a substantial impact on every aspect of our lives, ranging from the way we communicate to the way we travel. The Smart mobility at the European land borders (SMILE) project is geared towards the deployment of biometric technologies to optimize and monitor the flow of people at land borders. However, despite the anticipated benefits of deploying biometric technologies in border control, there are still divergent views on the use of such technologies by the two primary stakeholders: travelers and border authorities. In this paper, we provide a comparison of travelers' and border authorities' views on the deployment of biometric technologies in border management. The overall goal of this study is to understand the concerns of travelers and border guards in order to facilitate the acceptance of biometric technologies for a secure and more convenient border crossing. Our method of inquiry consisted of in-person interviews with border guards (the SMILE project's end users), observation and field visits (to the Hungarian-Romanian and Bulgarian-Romanian borders), and questionnaires for both travelers and border guards. As a result of our investigation, two conflicting trends emerged. On one hand, border guards argued that biometric technologies had the potential to be a very effective tool that would enhance security levels and make traveler identification and authentication procedures easy, fast and convenient. On the other hand, travelers were more concerned about the technologies representing a threat to fundamental rights, personal privacy and data protection.
02 Jan 2021-Ai & Society
TL;DR: An analysis of strategic documents at the level of the European Union (EU) concerning the regulation of artificial intelligence, one of the so-called disruptive technologies, with a focus on the issues of personal data protection and cyber security addressed in these documents.
Abstract: The presented paper focuses on the analysis of strategic documents at the level of the European Union (EU) concerning the regulation of artificial intelligence as one of the so-called disruptive technologies. In the first part of the article, we outline the basic terminology. Subsequently, we focus on the summarizing and systemizing of the key documents adopted at the EU level in terms of artificial intelligence regulation. The focus of the paper is devoted to issues of personal data protection and cyber security included in these strategic documents. The final part contains recommendations for future research and evaluation of its key features.
TL;DR: The authors show that, in the absence of LED adequacy decisions, personal data transfers to law enforcement authorities in third countries often occur without appropriate scrutiny and safeguards under the system the LED establishes.
Abstract: In May 2018, EU data protection rules were reformed not only by the General Data Protection Regulation (GDPR) but also by the Law Enforcement Directive (LED). While the LED is often overshadowed by the GDPR, it nevertheless introduced a number of crucial reforms to data protection in a law enforcement context in the EU, including harmonized rules on how personal data in a law enforcement context can be transferred to law enforcement authorities in third countries. Formally, the LED rules on international transfers of personal data to third countries aim at guaranteeing that the level of protection for personal data in a law enforcement context within the EU is not undermined as soon as personal data leaves EU territory. Taking a closer look, however, reveals major issues with the rules foreseen for transfers in the LED, as they often come down to law enforcement authorities self-assessing whether a third country would offer adequate protection within the meaning of the standard of essential equivalence established by the Court of Justice of the European Union (CJEU) in Schrems. In this paper, I show, by relying on EU fundamental rights law and the case law of the CJEU, how, due to the absence of LED adequacy decisions, personal data transfers to law enforcement authorities in third countries often occur without appropriate scrutiny and safeguards under the system the LED establishes. Using the recent reference to the CJEU by a German court regarding information exchanges with Interpol, I demonstrate how the resulting legal uncertainty can affect both the work of law enforcement authorities and the fundamental rights of individuals. I conclude that the current system for international personal data transfers within the LED is deeply flawed and potentially undermines EU personal data protection in a law enforcement context.
28 Sep 2020
TL;DR: The authors present an example of privacy and data protection best practices to give data controllers and developers more guidance on how to comply with legal data protection obligations.
Abstract: Biometric recognition is a widely adopted technology that supports different kinds of applications, ranging from security and access control to law enforcement. However, such systems raise serious privacy and data protection concerns. Misuse of data, compromising the privacy of individuals, and/or unauthorized processing of data may be irreversible and could have severe consequences for an individual’s rights to privacy and data protection. This is partly due to the lack of methods and guidance for integrating data protection and privacy by design into the system development process. In this paper, we present an example of privacy and data protection best practices to provide more guidance for data controllers and developers on how to comply with the legal obligation of data protection. These privacy and data protection best practices and considerations are based on the lessons learned from the SMart mobILity at the European land borders (SMILE) project.
TL;DR: Light is shed on technical challenges and misconceptions, as well as legal shortcomings, to foster a common understanding of the challenges and how they might be addressed; a change to the LED is proposed.
Abstract: Law enforcement increasingly relies on complex machine learning approaches to support investigations. With limited knowledge and funding, LEAs often depend on opaque private-public collaborations. The failure to provide legal bases at the national level, paired with shortcomings in both the GDPR and Directive (EU) 2016/680 (LED), results in severe risks to the fundamental rights of EU citizens. Overcoming these risks requires an interdisciplinary discussion. This paper therefore sheds light on technical challenges and misconceptions as well as legal shortcomings, in order to foster a common understanding of the challenges and how they might be addressed. To do so, the author searches for common ground on ‘public availability’ and reviews currently used technical approaches and common processing constellations. Based on the outcomes, the author proposes a change to the LED and discusses a centralised institution to govern access to novel data-driven technology.