Journal ISSN: 2044-3994

International Data Privacy Law 

Oxford University Press
About: International Data Privacy Law is an academic journal published by Oxford University Press. The journal publishes mainly in the areas of the Data Protection Act 1998 and information privacy. It has the ISSN identifier 2044-3994. Over its lifetime, the journal has published 269 papers, which have received 3,955 citations. The journal is also known as: IDPL.


Papers
Journal Article
TL;DR: In this paper, the authors propose a number of legislative and policy steps that, if taken, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.
Abstract: Since approval of the EU General Data Protection Regulation (GDPR) in 2016, it has been widely and repeatedly claimed that the GDPR will legally mandate a ‘right to explanation’ of all decisions made by automated or artificially intelligent algorithmic systems. This right to explanation is viewed as an ideal mechanism to enhance the accountability and transparency of automated decision-making. However, there are several reasons to doubt both the legal existence and the feasibility of such a right. In contrast to the right to explanation of specific automated decisions claimed elsewhere, the GDPR only mandates that data subjects receive meaningful, but properly limited, information (Articles 13-15) about the logic involved, as well as the significance and the envisaged consequences of automated decision-making systems, which we term a ‘right to be informed’. Further, the ambiguity and limited scope of the ‘right not to be subject to automated decision-making’ contained in Article 22 (from which the alleged ‘right to explanation’ stems) raise questions over the protection actually afforded to data subjects. These problems show that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards against automated decision-making, and therefore runs the risk of being toothless. We propose a number of legislative and policy steps that, if taken, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.

442 citations

Journal Article
TL;DR: The right to explanation should be interpreted functionally and flexibly and should, at a minimum, enable a data subject to exercise his or her rights under the GDPR and human rights law.
Abstract: There is no single, neat statutory provision labeled the “right to explanation” in Europe’s new General Data Protection Regulation (GDPR). But nor is such a right illusory. Responding to two prominent papers that, in turn, conjure and critique the right to explanation in the context of automated decision-making, we advocate a return to the text of the GDPR. Articles 13-15 provide rights to “meaningful information about the logic involved” in automated decisions. This is a right to explanation, whether one uses the phrase or not. The right to explanation should be interpreted functionally, flexibly, and should, at a minimum, enable a data subject to exercise his or her rights under the GDPR and human rights law.

206 citations

Journal Article
TL;DR: In this paper, the author argues that the proposed Regulation, in seeking to remedy some longstanding deficiencies with the Data Protection Directive (DPD) as well as more recent issues associated with targeting, profiling, and consumer mistrust, relies too heavily on the discredited informed choice model, and therefore fails to fully engage with the impending Big Data tsunami.
Abstract: ‘Big Data’ refers to novel ways in which organizations, including government and businesses, combine diverse digital datasets and then use statistics and other data mining techniques to extract from them both hidden information and surprising correlations. While Big Data promises significant economic and social benefits, it also raises serious privacy concerns. In particular, Big Data challenges the Fair Information Practices (FIPs), which form the basis of all modern privacy law. Probably the most influential privacy law in the world today is the European Union Data Protection Directive 95/46 EC (DPD). In January 2012, the European Commission (EC) released a proposal to reform and replace the DPD by adopting a new Regulation. In what follows, I argue that this Regulation, in seeking to remedy some longstanding deficiencies with the DPD as well as more recent issues associated with targeting, profiling, and consumer mistrust, relies too heavily on the discredited informed choice model, and therefore fails to fully engage with the impending Big Data tsunami. My contention is that when this advancing wave arrives, it will so overwhelm the core privacy principles of informed choice and data minimization on which the DPD rests that reform efforts will not be enough. Rather, an adequate response must combine legal reform with the encouragement of new business models premised on consumer empowerment and supported by a personal data ecosystem. This new business model is important for two reasons: First, existing business models have proven time and again that privacy regulation is no match for them. Businesses inevitably collect and use more and more personal data, and while consumers realize many benefits in exchange, there is little doubt that businesses, not consumers, control the market in personal data with their own interests in mind. Second, a new business model, which I describe below, promises to stand processing of personal data on its head by shifting control over both the collection and use of data from firms to individuals. This new business model arguably stands a chance of making the FIPs efficacious by giving individuals the capacity to benefit from Big Data and hence the motivation to learn about and control how their data are collected and used. It could also enable businesses to profit from a new breed of services

128 citations

Journal Article
TL;DR: The over-use of notice and consent presents increasing challenges in an age of ‘Big Data’, and these phenomena are receiving attention particularly in the context of the current review of the OECD Privacy Guidelines.
Abstract: Nowadays individuals are often presented with long and complex privacy notices routinely written by lawyers for lawyers, and are then requested to either ‘consent’ or abandon the use of the desired service. The over-use of notice and consent presents increasing challenges in an age of ‘Big Data’. These phenomena are receiving attention particularly in the context of the current review of the OECD Privacy Guidelines. In 2012 Microsoft sponsored an initiative designed to engage leading regulators, industry executives, public interest advocates, and academic experts in frank discussions about the role of individual control and notice and consent in data protection today, and alternative models for providing better protection for both information privacy and valuable data flows in the emerging world of Big Data and cloud computing.

114 citations

Journal Article
TL;DR: The real borderlines of the 'right to explanation' in the GDPR are analyzed, and a 'legibility test' is recommended that data controllers should perform in order to comply with the duty to provide meaningful information about the logic involved in automated decision-making.
Abstract:
• The aim of this contribution is to analyse the real borderlines of the 'right to explanation' in the GDPR and to discretely distinguish between different levels of information and of consumers' awareness in the 'black box society'. In order to combine transparency and comprehensibility, we propose the new concept of algorithm 'legibility'.
• We argue that a systemic interpretation is needed in this field, since it can be beneficial not only for individuals but also for businesses. This may be an opportunity for auditing algorithms and correcting unknown machine biases, thus similarly enhancing the quality of decision-making outputs.
• Accordingly, we show how a systemic interpretation of Articles 13-15 and 22 GDPR is necessary, considering in particular that: the threshold of minimum human intervention required so that the decision-making is 'solely' automated (Art. 22(1)) can also include nominal human intervention; the envisaged 'significant effects' on individuals (Art. 22(1)) can also encompass marketing manipulation, price discrimination, etc.; the 'meaningful information' that should be provided to data subjects about the logic, significance, and consequences of decision-making (Art. 15(1)(h)) should be read as 'legibility' of the "architecture" and "implementation" of algorithmic processing; and trade secret protection might limit the right of access of data subjects, but there is a general legal favour for data protection rights that should reduce the impact of trade secret protection.
• In addition, we recommend a 'legibility test' that data controllers should perform in order to comply with the duty to provide meaningful information about the logic involved in automated decision-making.

105 citations

Performance Metrics

No. of papers from the journal in previous years:

Year    Papers
2023    13
2022    22
2021    10
2020    9
2019    17
2018    24