Journal ArticleDOI

The Long Arm of EU Data Protection Law: Does the Data Protection Directive Apply to Processing of Personal Data of EU citizens by Websites Worldwide?

01 Feb 2011-International Data Privacy Law (Oxford University Press)-Vol. 1, Iss: 1, pp 28-46
TL;DR: In this article, the key concepts of the provision on applicability of EU data protection law to non-EU websites are discussed, and a uniform interpretation is proposed based on the legislative history of the Directive.
Abstract: Discusses the key concepts of the provision for applicability of EU data protection laws to non-EU websites and provides a uniform interpretation thereof based on the legislative history of the Directive. Discusses the differences in the manner in which the applicability rule is implemented in the Member States and the resulting divergent interpretations by the national Data Protection Authorities.


Citations
Journal ArticleDOI
Maja Brkan1
TL;DR: The article argues that the GDPR obliges the controller to inform the data subject of the reasons why an automated decision was taken and argues that such a right would in principle fit well within the broader framework of theGDPR's quest for a high level of transparency.
Abstract: The purpose of this article is to analyse the rules of the General Data Protection Regulation (GDPR) and the Directive on Data Protection in Criminal Matters on automated decision-making and to explore how to ensure transparency of such decisions, in particular those taken with the help of algorithms. Both legal acts impose limitations on automated individual decision-making, including profiling. While these limitations of automated decisions might come across as a forceful fortress strongly protecting individuals and potentially even hampering the future development of Artificial Intelligence in decision-making, the relevant provisions nevertheless contain numerous exceptions allowing for such decisions. While the Directive on Data Protection in Criminal Matters worryingly does not seem to give the data subject the possibility to familiarize herself with the reasons for such a decision, the GDPR obliges the controller to provide the data subject with 'meaningful information about the logic involved' (Articles 13(2)(f), 14(2)(g) and 15(1)(h)), thus raising the much-debated question whether the data subject should be granted a 'right to explanation' of the automated decision. This article seeks to go beyond the semantic question of whether this right should be designated as the 'right to explanation' and argues that the GDPR obliges the controller to inform the data subject of the reasons why an automated decision was taken. While such a right would in principle fit well within the broader framework of the GDPR's quest for a high level of transparency, it also raises several queries: What exactly needs to be revealed to the data subject? How can an algorithm-based decision be explained? The article aims to explore these questions and to identify challenges for further research regarding explainability of automated decisions.

47 citations

Journal ArticleDOI
TL;DR: This dissertation addresses some of the data protection challenges that have arisen from globalization, technological progress, terrorism and seamless cross-border flows of personal data.
Abstract: This dissertation addresses some of the data protection challenges that have arisen from globalization, technological progress, terrorism and seamless cross-border flows of personal data. The focus ...

42 citations

18 May 2020
TL;DR: Information by design helps curb the information asymmetry inherent to the deployment of smart city technologies and can be implemented through the tactics of supplying information about the processing to data subjects, explaining its logic, and notifying individual data subjects about processing events that specifically concern them.
Abstract: Abstraction is a third strategy through which the values underlying privacy and data protection can be safeguarded, by designing technologies so as to limit, as much as possible in light of the processing purposes, the detail in which personal data is processed. As opposed to minimisation, abstraction does not limit the quantity of the data processed, but its granularity. Abstraction can be implemented by summarising certain attributes into less granular versions (e.g. by processing whether a data subject is a minor or not rather than the exact age), by grouping individual data into an aggregate group profile, or by perturbing the personal data processed by approximation or through the addition of random noise. A relevant pattern would be, for instance, location data fuzzing [314], through which the accuracy of location information is decreased in a way that aims at preserving its general utility. Let us say that a smart city project requires the monitoring of individuals' location data to identify aggregated movement patterns to be used as a basis for data-driven spatial planning: to what extent can the information capture technologies deployed abstract the data gathered while still being able to fulfil their goals? Even when personal data must be collected, and regardless of the level of detail necessary to achieve the processing's purposes, data subjects' information can be concealed. Hiding personal data, a last data-oriented strategy, can be achieved through several tactics: access restrictions, obfuscation (e.g. through encryption), disassociation to prevent linkability, and mixing it to hide its origin or its relationship with other data.
The use of pseudonymous identities [315] is a design pattern that well exemplifies how hiding personal information can be a valid design strategy: if a smart city service provider does not need to know a user's real identity, it may very well be required to deal with a pseudonym, so that identifying the data subject would need an additional step (and further transaction costs) to be performed. Likewise, when users' inputs can lead to privacy breaches, a system that foresees the possibility to generate fake inputs that cannot be distinguished from real inputs, so that the system operator cannot identify an unidentified data subject or infer the attributes of an identified data subject [316], is a viable privacy pattern to implement the 'hide' strategy. Aside from the data-oriented privacy by design strategies and tactics above, there are also a number of process-oriented options through which the values underlying privacy and data protection can be transposed into the technologies that make up the technological layer of the smart city construct. The provision of information by design helps curb the information asymmetry inherent to the deployment of smart city technologies. Informing data subjects, in a timely and adequate manner, about the processing of their personal data is a first process-oriented strategy, and can be implemented through the tactics of supplying information about the processing to data subjects, explaining its logic, and notifying individual data subjects about processing events that specifically concern them. Examples of design patterns that implement ...
[313] See https://privacypatterns.org/patterns/Personal-data-store.
[314] See https://privacypatterns.org/patterns/Location-granularity.
[315] See https://privacypatterns.org/patterns/Pseudonymous-identity.
[316] See https://privacypatterns.org/patterns/Use-of-dummies.
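The abstraction and hiding strategies described in this abstract can be illustrated with a minimal sketch. The function and field names below are hypothetical, not taken from any cited pattern catalogue: coordinate rounding stands in for the 'location granularity' pattern, and a keyed hash stands in for the 'pseudonymous identity' pattern (re-identification then requires the key, the 'additional step' the abstract mentions).

```python
import hashlib
import hmac

def fuzz_location(lat: float, lon: float, decimals: int = 2) -> tuple:
    """Abstraction: reduce coordinate precision.
    Two decimal places is roughly 1 km granularity, enough for
    aggregate movement patterns but not for tracking an individual."""
    return (round(lat, decimals), round(lon, decimals))

def pseudonymize(user_id: str, secret_key: bytes) -> str:
    """Hide: replace a real identifier with a keyed-hash pseudonym.
    The same input and key always map to the same pseudonym, so
    aggregation still works; reversing it requires the key."""
    return hmac.new(secret_key, user_id.encode(), hashlib.sha256).hexdigest()[:16]

# Hypothetical smart-city sensor reading, stripped of fine-grained detail
reading = {"user": "alice@example.com", "lat": 52.370216, "lon": 4.895168}
safe = {
    "user": pseudonymize(reading["user"], b"rotate-this-key"),
    "loc": fuzz_location(reading["lat"], reading["lon"]),
}
```

Note that keyed hashing is pseudonymisation, not anonymisation: whoever holds the key can re-identify the data subject, so the output remains personal data under the GDPR.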

31 citations

Journal ArticleDOI
TL;DR: In this article, the authors trace the formulation of the General Data Protection Regulation to show that the regulation occupied the legislative agenda when a policy window was exploited through policy entrepreneurship to frame technological change as a problem for data privacy and legislative harmonization within the European Union.
Abstract: Why and how the regulation of emerging technologies occurs is not clear in the literature. In this study, we adapt the multiple streams framework – often used for explaining agenda-setting and policy adoption – to examine the phenomenon. We hypothesize how technological change affects policy-making and identify conditions under which the streams can be (de-)coupled. We trace the formulation of the General Data Protection Regulation to show that the regulation occupied the legislative agenda when a policy window was exploited through policy entrepreneurship to frame technological change as a problem for data privacy and legislative harmonization within the European Union. Although constituencies interested in promoting internet technologies made every effort to stall the regulation, various actors, activities, and events helped the streams remain coupled, eventually leading to its adoption. We conclude that the alignment of problem, policy, politics, and technology – through policy entrepreneurship – influences the timing and design of technology regulation.

29 citations

Journal ArticleDOI
TL;DR: The content analysis revealed that the available health data protection laws are limited in scope, while the survey results showed that respondents felt they could trust the e-health services offered in the UAE, believing that the data collected is protected and their rights are not violated.
Abstract: The move toward e-health care in various countries is envisaged to reduce the cost of provision of health care, improve the quality of care and reduce medical errors. The most significant problem is the protection of patients' data privacy. If patients are reluctant or refuse to participate in the health care system due to a lack of privacy laws and regulations, the benefits of a full-fledged e-health care system cannot be materialized. The purpose of this paper is to investigate the available e-health data privacy protection laws and the perception of the people using the e-health care facilities.
The researchers used content analysis to analyze the availability and comprehensiveness of the laws and regulations, together with a survey. Participants in the study comprised health care professionals (n=46) and health care users (n=187) based in Dubai, United Arab Emirates. The researchers applied descriptive statistics and correlational analysis to the survey data.
The content analysis revealed that the available health data protection laws are limited in scope. The survey results, however, showed that respondents felt they could trust the e-health services offered in the UAE, believing that the data collected is protected and their rights are not violated. The research also revealed no significant difference between nationality and responses to the privacy statements: respondents of all nationalities agreed that protection is in place for e-health data. Nor was there a significant difference between the demographic data sets and the various data protection principles.
The findings on users' perceptions could help evaluate the success of current strategies, and a benchmarking action plan could be introduced.

11 citations