
Showing papers in "International Data Privacy Law in 2015"





Journal ArticleDOI
TL;DR: Extraterritoriality in EU regulation of international data transfers is intrinsically neither good nor bad; rather, its appropriateness depends on how it is used and implemented as discussed by the authors.
Abstract: Use of the term “extraterritorial” to describe the regulation of international transfers of personal data in EU data protection law has led to confusion about the scope of such regulation. Any distinction between extraterritoriality “in scope” and “in effect” has become meaningless. Extraterritoriality in EU regulation of international data transfers is intrinsically neither good nor bad; rather, its appropriateness depends on how it is used and implemented. Regulation of international data transfers in EU data protection law tends to apply in a “black or white” fashion, without the safety valves necessary to prevent jurisdictional overreaching. This leads to increasing conflicts between EU law and the law of third countries. Attention should turn from deciding whether a particular exercise of jurisdiction is extraterritorial, to determining the conditions under which it can be appropriate. The controversy surrounding extraterritoriality illustrates the need to set boundaries to the application of EU data protection law.

21 citations


Journal ArticleDOI
TL;DR: In this paper, the authors argue that, in most circumstances, the only available legal basis for the processing of personal data for behavioural targeting is the data subject's unambiguous consent and that the cookie consent requirement from the e-Privacy Directive does not provide a legal basis.
Abstract: Key Points •The European Union Charter of Fundamental Rights only allows personal data processing if a data controller has a legal basis for the processing. •This paper argues that, in most circumstances, the only available legal basis for the processing of personal data for behavioural targeting is the data subject's unambiguous consent. •Furthermore, the paper argues that the cookie consent requirement from the e-Privacy Directive does not provide a legal basis for the processing of personal data. •Therefore, even if companies could use an opt-out system to comply with the e-Privacy Directive's consent requirement for using a tracking cookie, they would generally have to obtain the data subject's unambiguous consent if they process personal data for behavioural targeting.

20 citations


Journal ArticleDOI
TL;DR: In this article, the authors examine the legal and regulatory framework for self-quantified health information and wearable devices in Australia and determine the extent to which this framework addresses privacy and other concerns that these techniques engender.
Abstract: This exploratory article examines the phenomenon of the ‘Quantified Self’—until recently, a subculture of enthusiasts who aim to discover knowledge about themselves and their bodies through self-tracking, usually using wearable devices to do so—and its implications for laws concerned with regulating and protecting health information. Quantified Self techniques and the ‘wearable devices’ and software that facilitate them—in which large transnational technology corporations are now involved—often involve the gathering of what would be considered ‘health information’ according to legal definitions, yet may occur outside the provision of traditional health services (including ‘e-health’) and the regulatory frameworks that govern them. This article explores the legal and regulatory framework for self-quantified health information and wearable devices in Australia and determines the extent to which this framework addresses privacy and other concerns that these techniques engender, along with suggestions for reform.

16 citations


Journal ArticleDOI
TL;DR: The consortium adapted a privacy impact assessment framework to address the particularities of smart surveillance systems, technologies, projects, and policies and extracted the best elements of existing PIA methodologies in order to construct a surveillance-suitable PIA framework.
Abstract: More and more businesses—not only Facebook and Google—are surveilling users, while security agencies argue that more data means more effective law enforcement. With the increasing pervasiveness of mass surveillance, there is a clear need for a surveillance impact assessment (SIA), a method that addresses not only issues of privacy and data protection, but also ethical, social, economic, and political issues. This paper describes the development and testing of an SIA, which was developed in the SAPIENT project, which was funded by the European Commission, and undertaken by a consortium of European partners. The specific aim of the SAPIENT project was to provide policy-makers, developers of surveillance technology, and other stakeholders with strategic knowledge on the state of the art of surveillance studies, emerging smart surveillance technologies, and the adequacy of the existing legal framework. The consortium developed scenarios around future smart surveillance systems for discussion with focus groups of stakeholders, aimed at providing a consolidated analysis of stakeholder views on the use of surveillance and at informing the development of an SIA. The consortium adapted a privacy impact assessment (PIA) framework to address the particularities of smart surveillance systems, technologies, projects, and policies. To that end, it extracted the best elements of existing PIA methodologies in order to construct a surveillance-suitable PIA framework (ie an SIA methodology), which it tested on four different surveillance projects. It then derived lessons learned to refine its proposed methodology and to present its results at a final conference and in a final report together with its recommendations. The first step towards development of the SIA was a state-of-the-art review of surveillance technologies.

14 citations






Journal ArticleDOI
TL;DR: The proposed EU Data Protection Regulation tackles this problem by introducing a requirement for data controllers providing information society services directed at children: the controllers should obtain parental consent in cases where the personal data of a child under the age of 13 are being processed in the online environment.
Abstract: Empirical data show that increasingly younger children are engaging in online activities. Many children lack experience and knowledge of the implications and practices related to personal data management. However, under the current European data protection regime, children become data subjects on the same legal basis as adults, with consent being the most popular method of obtaining personal data. The proposed EU Data Protection Regulation tackles this problem by introducing a requirement for data controllers providing information society services directed at children: the controllers should obtain parental consent in cases where the personal data of a child under the age of 13 are being processed in the online environment. In the view of the European Commission, this requirement will reduce online risks for children and prevent them from making ‘youthful’ indiscretions.


Journal ArticleDOI
TL;DR: In this paper, the authors describe a ten-country European study investigating the practical aspects of exercising access rights from the perspective of data subjects, and make key recommendations to assist data subjects in their attempts to exercise access rights.
Abstract: • This article describes a ten-country European study investigating the practical aspects of exercising access rights from the perspective of data subjects. • It uses a mixture of quantitative and qualitative methodology to illustrate the restrictions faced by data subjects in exercising their access rights. • It concludes by making key recommendations to assist data subjects in their attempts to exercise access rights.




Journal ArticleDOI
TL;DR: This research looks at the EU’s obligations to protect the fundamental right to data protection extraterritorially under international human rights law (IHRL) to determine the obligatory, as an extension of the permissive, application of law in public international law (PIL) terms, that is, the exercise of prescriptive jurisdiction.
Abstract: With the ubiquity of the internet and the rise of digitised personal data, data controllers and processors are processing ever more personal data, foregrounding the need to ensure that these data are protected. The EU, compared with most states, strongly advocates the importance of protecting personal data. Indeed, the EU has the world’s longest-standing and is also often considered to have the strictest, and certainly the most influential, data protection law. The EU pushes its approach to data protection aggressively and has therefore gained dominance as a legal actor in this field. A form of territorial extension is evident in data protection law. The law of one jurisdiction, namely the EU, has become and is becoming the rule in other places for several reasons, including economic ease, accession goals, convenience, regulatory arbitrage, and potentially the protection of human rights. EU representatives often use fundamental rights rhetoric to promote the EU’s data protection law. This legal diffusion even suggests an overriding data protection norm; however, there is no clear evidence of the existence of such an all-encompassing, widely accepted norm outside the EU. This research looks at the EU’s obligations to protect the fundamental right to data protection extraterritorially under international human rights law (IHRL). It conceives of the EU as a duty bearer: the Union exercises jurisdiction, will become a party to the European Convention on Human Rights (ECHR) and is arguably becoming a human rights actor in its own right. Whilst data protection in the EU was initially conceived of in market terms, it is increasingly connected with fundamental rights. The growing weight of the fundamental right to data protection in the EU is arguably linked to the increased territorial extension of EU data protection law. This raises questions of how to apply a fundamental right in the EU to a virtual, borderless space and ultimately third states.
This research focusses mostly on IHRL to determine the obligatory, as an extension of the permissive, application of law in public international law (PIL) terms, that is, the exercise of prescriptive jurisdiction. It asks to what extent the EU is obliged to exercise territorial extension of its laws to protect the fundamental right to data protection for its citizens. The research begins by looking briefly at the extraterritorial

Journal ArticleDOI
TL;DR: Extraterritoriality and targeting in EU data privacy law: the weak spot undermining the regulation.
Abstract: Extraterritoriality and targeting in EU data privacy law: the weak spot undermining the regulation

Journal ArticleDOI
TL;DR: The case study shows the importance of the adoption of a multi-stakeholder approach to define privacy-oriented mobility systems in line with the idea of a participatory and inclusive smart community.
Abstract: Smart mobility systems make it possible to collect a huge amount of information about passengers’ movements that can reveal users’ behaviour and social relations. In the light of the above, these systems should adopt adequate privacy-oriented solutions in order to avoid the risk of transforming mobility architecture into generalized territorial surveillance systems. Adopting a case study approach, the article analyses the role played by data management, data anonymization, and pseudonymization in reducing the potential negative impact of e-ticketing technologies on individual and social privacy. In a context characterized by public subsidies provided by regional governments to transport companies, the author shows how privacy-oriented strategies adopted by regional governments are an important factor to induce mobility policies that focus on service quality and citizen privacy protection. Finally, the case study shows the importance of the adoption of a multi-stakeholder approach to define privacy-oriented mobility systems in line with the idea of a participatory and inclusive smart community.

Journal ArticleDOI
TL;DR: In this paper, Gramsci once wrote that "the old is dying and the new cannot be born; in this interregnum, a great variety of morbid symptoms appear" and this statement could apply to the current state of data protection regulation around the world.
Abstract: The Italian Marxist theorist Antonio Gramsci once wrote (in translation) that ‘the old is dying and the new cannot be born; in this interregnum, a great variety of morbid symptoms appear’. Although Gramsci was not speaking about data privacy, it seems to us that this statement could apply to the current state of data protection regulation around the world, which is marked by a realization that existing regulatory models are not working effectively, the lack of political will to explore alternatives, and general frustration about how to improve the situation. This has led to a credibility gap between the objectives of data protection law and how personal data are protected in practice. It should not be this way. The importance of data privacy has never been greater, and countries and regional organizations around the world are enacting legislation in an attempt to protect it. Much of this legislation has been based on the EU Data Protection Directive 95/46, which will be replaced by the proposed EU General Data Protection Regulation if the EU can ever finalize its interminable legislative process. Even the White House, which for years had seemed to oppose any large-scale federal legislation to deal with data processing in the private sector, has called for enactment of a Consumer Privacy Bill of Rights Act to grant increased protection to the online processing of personal data. Regional organizations such as Asia-Pacific Economic Cooperation, the Council of Europe, the Organization of American States, the Economic Community of West African States, the Organisation for Economic Co-operation and Development, and others have also done extensive work to enact new privacy instruments or amend their existing ones. All this activity has also had an effect at the global level, with the UN General Assembly passing a resolution that affirms the ‘right to privacy in the digital age’. 
But the increasing amount of new data protection regulation raises an important point: is all of this making any difference in increasing the protection of data privacy in practice? There are three aspects to this question that we would like to discuss briefly here. First of all, there is general confusion about the correct approach to regulating the collection, processing, and use of personal data. Among the issues about which there is no global consensus are how effective the law can be in regulating online data processing; the correct balance between legal regulation and private sector self-regulation; and how best to enforce the law. To a large extent, these questions are not unique to data protection and tend to arise in any area that involves the regulation of technology. But coming to a consensus about them has proved intractable, as they often reflect differences in national and regional laws and cultures. Secondly, the globalization of data processing creates major problems for applying and enforcing the law. The fact that it is increasingly difficult to determine the location where data are collected and processed gives rise to confusion on the part of individuals about what their rights are and how they can exercise control over their data in a meaningful way. Data controllers are similarly frustrated by the application of multiple laws to a particular database or online service, and regulators and governments are often unable to apply and enforce their laws across national borders, which can lead to international tensions. Thirdly, questions arise about how data protection regulation is enforced. Recent years have witnessed what could be called the ‘FTC-ization’ of data privacy enforcement, which reflects the strategy of the US Federal Trade Commission to concentrate on enforcement in high-profile cases, in order to make examples of the corporations involved and frighten others into compliance.
Other regulators, such as European data protection authorities, have adopted a similar approach, at least in part because they lack the resources to enforce the law on a more widespread scale. However, while this may generate enforcement efficiencies, it raises questions

Journal ArticleDOI
TL;DR: The article seeks to determine jurisdictional issues regarding data protection litigation within the European Union, elaborating concretely on potential competent courts in the case a data subject wants to file a private enforcement claim against a controller processing his personal data.
Abstract: Key Points •The objective of the present article is to point out current and potential obstacles to effective protection of the fundamental right to data protection, created by rules on jurisdiction, and to put forward solutions for removing those obstacles with regard to data protection. •The article elaborates on categories of litigation in the field of data protection in order to identify potential claimants, defendants, and competent administrative and judicial authorities that may decide on those remedies. •Building upon these categories of litigation, the article seeks to determine jurisdictional issues regarding data protection litigation within the European Union, elaborating concretely on potential competent courts in the case a data subject wants to file a private enforcement claim against a controller processing his personal data. •The article proposes to amend Regulation 1215/2012 so as to include a special section determining jurisdiction in data protection disputes.



Journal ArticleDOI
TL;DR: There is a need to clarify what ‘biometrics’ means for the biometric community, to determine whether and how the legal community should use the term in a data protection and privacy context, and to investigate the current legal definition of ‘biometric data’.
Abstract: This article has been motivated by an observation: the lack of rigor by European bodies when they use scientific terms to address data protection and privacy issues raised by biometric technologies and biometric data. In particular, they improperly use the term ‘biometrics’ to mean at the same time ‘biometric data’, ‘identification method’, or ‘biometric technologies’. Based on this observation, there is a need to clarify what ‘biometrics’ means for the biometric community and whether and how the legal community should use the term in a data protection and privacy context. In parallel to that exercise of clarification, there is also a need to investigate the current legal definition of ‘biometric data’ as framed by European bodies at the level of the European Union and the Council of Europe. The comparison of the regulatory and scientific definitions of the term ‘biometric data’ reveals that the term is used in two different contexts. However, it is legitimate to question the role that the scientific definition could exercise on the regulatory definition. More precisely, the question is whether the technical process through which biometric information is extracted and transformed into a biometric template should be reflected in the regulatory definition of the term.

Journal ArticleDOI
TL;DR: Given that modern video surveillance technology works at the level of abstracted objects rather than raw video streams, it is argued that the computer vision capabilities of such systems can also be exploited for improving the selectiveness of surveillance measures.
Abstract: Intelligent video surveillance is an active and lively field of research, predominantly in the domains of image exploitation and situation assessment. The availability of privacy-invasive system functionality such as real-time object tracking and automatic extraction of biometric features is becoming reality. Not surprisingly, video surveillance generates an increasing interest among information security and privacy researchers. A categorical argument against video surveillance targets the chilling effect of such systems, which arguably is in conflict with the fundamental right to free development of the individual. When faced with surveillance cameras, we cannot know whether we are currently observed or not. However, the mere possibility of being observed tends to change the way we behave, which usually is considered an undesired phenomenon in free societies and therefore addressed by legislation. The principle of proportionality, as laid down in Articles 8(2) and 52(1) of the Charter of Fundamental Rights of the European Union, demands a careful weighing of the purpose of a surveillance measure, ie, the legally protected interest to be defended, against the legitimate interests of people affected by the surveillance measure. However, we do observe that video surveillance is spreading rapidly, even though the proportionality of privacy invasion and utility may not always be justified. In addition, even if we consider video surveillance to be necessary in particular cases, the question of how and to which extent privacy of the people concerned can be preserved must be evaluated. Given that modern video surveillance technology works at the level of abstracted objects rather than raw video streams, we argue that the computer vision capabilities of such systems can also be exploited for improving the selectiveness of surveillance measures. 
Intelligent video surveillance systems are capable of fusing information extracted from video streams into abstracted objects, including attributes such as IDs by face recognition, location, or certain activities. Hence, we can analogously incorporate an authentication mechanism as an information source, which enables the system to determine (group) identities of people who are a priori known to be concerned by the surveillance measure, eg, employees of an airport as an environment,




Journal ArticleDOI
TL;DR: The article examines the present legal safeguards to data privacy in Ethiopia along with major practical challenges and proffers suggestions towards a robust data privacy regime.
Abstract: Ethiopia has recognized the right to privacy throughout its brief constitutional history, albeit to different degrees. All Ethiopian Constitutions, including the first written Constitution of 1931, have had provisions dedicated to the right to privacy. Comprehensive privacy safeguards were, however, introduced by the Constitution of 1995, which protects privacy of persons, their home, and correspondences in a detailed manner. The constitutional guarantees to privacy are furthered in subordinate instruments that refine details of the protection. These fairly robust constitutional guarantees to privacy are, nevertheless, recently being undermined and eroded by the myriad of ill-conceived privacy-unfriendly laws the country has recently introduced and surveillance practices conducted in the absence of any oversight mechanisms. Recent reports have implicated the government for allegedly undertaking massive surveillances, interception of electronic communications, and even cyberattacks on members of opposition groups and journalists. This article examines the present legal safeguards to data privacy along with major practical challenges and proffers suggestions towards a robust data privacy regime.