Journal ISSN: 2044-3994

International Data Privacy Law 

About: International Data Privacy Law is an academic journal that publishes mainly in the areas of Data Protection Act 1998 & Information privacy. Its ISSN is 2044-3994. Over its lifetime, the journal has published 244 papers, which have received 3,140 citations.

Papers published on a yearly basis

Papers
Journal Article (DOI)

[...]

TL;DR: In this paper, the authors propose a number of legislative and policy steps that, if taken, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.
Abstract: Since approval of the EU General Data Protection Regulation (GDPR) in 2016, it has been widely and repeatedly claimed that the GDPR will legally mandate a ‘right to explanation’ of all decisions made by automated or artificially intelligent algorithmic systems. This right to explanation is viewed as an ideal mechanism to enhance the accountability and transparency of automated decision-making. However, there are several reasons to doubt both the legal existence and the feasibility of such a right. In contrast to the right to explanation of specific automated decisions claimed elsewhere, the GDPR only mandates that data subjects receive meaningful, but properly limited, information (Articles 13-15) about the logic involved, as well as the significance and the envisaged consequences of automated decision-making systems, what we term a ‘right to be informed’. Further, the ambiguity and limited scope of the ‘right not to be subject to automated decision-making’ contained in Article 22 (from which the alleged ‘right to explanation’ stems) raises questions over the protection actually afforded to data subjects. These problems show that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards against automated decision-making, and therefore runs the risk of being toothless. We propose a number of legislative and policy steps that, if taken, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.

332 citations

Journal Article (DOI)

[...]

TL;DR: The right to explanation should be interpreted functionally, flexibly, and should, at a minimum, enable a data subject to exercise his or her rights under the GDPR and human rights law as mentioned in this paper.
Abstract: There is no single, neat statutory provision labeled the “right to explanation” in Europe’s new General Data Protection Regulation (GDPR). But nor is such a right illusory. Responding to two prominent papers that, in turn, conjure and critique the right to explanation in the context of automated decision-making, we advocate a return to the text of the GDPR. Articles 13-15 provide rights to “meaningful information about the logic involved” in automated decisions. This is a right to explanation, whether one uses the phrase or not. The right to explanation should be interpreted functionally, flexibly, and should, at a minimum, enable a data subject to exercise his or her rights under the GDPR and human rights law.

142 citations

Journal Article (DOI)

[...]

TL;DR: In this paper, the authors argue that this Regulation, in seeking to remedy some longstanding deficiencies with the DPD as well as more recent issues associated with targeting, profiling, and consumer mistrust, relies too heavily on the discredited informed choice model, and therefore fails to fully engage with the impending Big Data tsunami.
Abstract: ‘Big Data’ refers to novel ways in which organizations, including government and businesses, combine diverse digital datasets and then use statistics and other data mining techniques to extract from them both hidden information and surprising correlations. While Big Data promises significant economic and social benefits, it also raises serious privacy concerns. In particular, Big Data challenges the Fair Information Practices (FIPs), which form the basis of all modern privacy law. Probably the most influential privacy law in the world today is the European Union Data Protection Directive 95/46 EC (DPD). In January 2012, the European Commission (EC) released a proposal to reform and replace the DPD by adopting a new Regulation. In what follows, I argue that this Regulation, in seeking to remedy some longstanding deficiencies with the DPD as well as more recent issues associated with targeting, profiling, and consumer mistrust, relies too heavily on the discredited informed choice model, and therefore fails to fully engage with the impending Big Data tsunami. My contention is that when this advancing wave arrives, it will so overwhelm the core privacy principles of informed choice and data minimization on which the DPD rests that reform efforts will not be enough. Rather, an adequate response must combine legal reform with the encouragement of new business models premised on consumer empowerment and supported by a personal data ecosystem. This new business model is important for two reasons: First, existing business models have proven time and again that privacy regulation is no match for them. Businesses inevitably collect and use more and more personal data, and while consumers realize many benefits in exchange, there is little doubt that businesses, not consumers, control the market in personal data with their own interests in mind. Second, a new business model, which I describe below, promises to stand processing of personal data on its head by shifting control over both the collection and use of data from firms to individuals. This new business model arguably stands a chance of making the FIPs efficacious by giving individuals the capacity to benefit from Big Data and hence the motivation to learn about and control how their data are collected and used. It could also enable businesses to profit from a new breed of services

118 citations

Journal Article (DOI)

[...]

TL;DR: The over-use of notice and consent presents increasing challenges in an age of ‘Big Data’, and these phenomena are receiving attention particularly in the context of the current review of the OECD Privacy Guidelines.
Abstract: Nowadays individuals are often presented with long and complex privacy notices routinely written by lawyers for lawyers, and are then requested to either ‘consent’ or abandon the use of the desired service. The over-use of notice and consent presents increasing challenges in an age of ‘Big Data’. These phenomena are receiving attention particularly in the context of the current review of the OECD Privacy Guidelines. In 2012 Microsoft sponsored an initiative designed to engage leading regulators, industry executives, public interest advocates, and academic experts in frank discussions about the role of individual control and notice and consent in data protection today, and alternative models for providing better protection for both information privacy and valuable data flows in the emerging world of Big Data and cloud computing.

101 citations

Journal Article (DOI)

[...]

TL;DR: In this paper, the authors argue that the current legal reform will fail to revive it, since its three main objectives are based on fallacies, namely the delusion that data protection law can give individuals control over their data, which it cannot, and the misconception that the reform simplifies the law, while in fact it makes compliance even more complex.
Abstract: • The trouble with European data protection law, as with Alfred Hitchcock's Harry, is that it is dead. The current legal reform will fail to revive it, since its three main objectives are based on fallacies. • The first fallacy is the delusion that data protection law can give individuals control over their data, which it cannot. The second is the misconception that the reform simplifies the law, while in fact it makes compliance even more complex. The third is the assumption that data protection law should be comprehensive, which stretches data protection to the point of breaking and makes it meaningless law in the books. • Unless data protection reform starts looking in other directions—going back to basics, playing other regulatory tunes on different instruments in other legal areas, and revitalising the spirit of data protection by stimulating best practices—data protection will remain dead. Or, worse perhaps, a zombie.

84 citations

Network Information
Related Journals (5)
Computer Law & Security Review: 1.2K papers, 14.6K citations, 78% related
Artificial Intelligence and Law: 459 papers, 11.8K citations, 76% related
Social Science Research Network: 551.8K papers, 10.8M citations, 73% related
Ethics and Information Technology: 692 papers, 21.5K citations, 73% related
First Monday: 1.9K papers, 56.7K citations, 72% related
Performance Metrics
No. of papers from the journal in previous years

Year	Papers
2021	10
2020	9
2019	17
2018	24
2017	25
2016	20