
Showing papers in "International Data Privacy Law in 2020"


Journal ArticleDOI
TL;DR: The authors conclude that a residual risk of identification always remains when anonymisation is used, and the concluding section links this finding more generally to the notion of risk in the GDPR.
Abstract: In this article, we examine the concept of non-personal data from a law and computer science perspective. The delineation between personal data and non-personal data is of paramount importance in determining the GDPR’s scope of application. This exercise is, however, fraught with difficulty, also when it comes to de-personalised data – that is to say, data that once was personal data but has been manipulated with the goal of turning it into anonymous data. This article shows that the legal definition of anonymous data is subject to uncertainty. Indeed, the definitions adopted in the GDPR, by the Article 29 Working Party, and by national supervisory authorities diverge significantly. Whereas the GDPR admits that there can be a remaining risk of identification even in relation to anonymous data, others have insisted that no such risk is acceptable. Through a review of the technical underpinnings of anonymisation, subsequently applied to two concrete case studies involving personal data used on blockchains, we conclude that there always remains a residual risk when anonymisation is used. The concluding section links this conclusion more generally to the notion of risk in the GDPR.
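
To make the residual-risk argument concrete, here is a minimal sketch of a toy linkage attack on k-anonymised records. The dataset, the k=2 generalisation, and the adversary's auxiliary knowledge are all hypothetical and are not taken from the article's blockchain case studies.

```python
# Illustrative sketch (not from the article): a toy linkage attack showing why
# anonymisation leaves a residual re-identification risk. All data is invented.

# "Anonymised" records: quasi-identifiers generalised so that every
# (age_band, postcode_prefix) combination is shared by at least 2 rows (k=2).
released = [
    {"age_band": "30-39", "postcode": "SW1*", "diagnosis": "diabetes"},
    {"age_band": "30-39", "postcode": "SW1*", "diagnosis": "asthma"},
    {"age_band": "40-49", "postcode": "EC2*", "diagnosis": "hypertension"},
    {"age_band": "40-49", "postcode": "EC2*", "diagnosis": "diabetes"},
]

# Auxiliary knowledge an adversary may hold about a target (e.g. from a
# public register): age band, postcode area, and one extra fact.
target = {"age_band": "40-49", "postcode": "EC2*", "known_fact": "hypertension"}

# Linkage: quasi-identifiers narrow the candidates; the extra fact can then
# single out one record, re-identifying the target's full row.
candidates = [r for r in released
              if r["age_band"] == target["age_band"]
              and r["postcode"] == target["postcode"]]
matches = [r for r in candidates if r["diagnosis"] == target["known_fact"]]

print(f"{len(candidates)} candidates after quasi-identifier match")
print(f"{len(matches)} record(s) after auxiliary linkage -> residual risk")
```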

84 citations



Journal ArticleDOI
TL;DR: Article 25 of the General Data Protection Regulation (GDPR) turns data protection by design and default into a binding legal obligation, which the authors argue should be interpreted as requiring the implementation of privacy engineering and hard privacy-enhancing technologies (PETs) for data minimisation.
Abstract: In its simplest formulation, data protection by design and default uses technical and organizational measures to achieve data protection goals. Although privacy regulators have endorsed privacy-enhancing technologies or PETs for well over thirty years, Article 25 of the General Data Protection Regulation (GDPR) breaks new ground by transforming this idea into a binding legal obligation. But Article 25 as presently conceived is poorly aligned with privacy engineering methods and related privacy-enhancing technologies (PETs). This is especially true of “hard” PETs that place limited trust in third parties (including data controllers) and instead rely on cryptographic techniques to achieve data minimisation. In order to advance data protection in its own right rather than merely reinforce the general principles of the GDPR, Article 25 must be interpreted as requiring the implementation of privacy engineering and hard PETs. A bold way to achieve this is by mandating that data controllers use available hard PETs for data minimisation. More gradual steps include data protection regulators insisting on a central role for privacy engineering and PETs in public sector projects; issuing guidance on Article 25 in very forceful terms that clearly require the implementation of “state of the art” privacy technology; and using their enforcement powers to reward good examples of privacy engineering rather than to penalize failures. NB: This is a pre-copyedited, preprint version of an article accepted for publication in International Data Privacy Law following peer review. The final and updated version was published in International Data Privacy Law, Volume 10, Issue 1, February 2020, Pages 37–56, https://doi.org/10.1093/idpl/ipz019.
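
As one illustration of the data-minimisation idea behind "hard" PETs, the sketch below uses a salted-hash commitment so that a controller stores only a digest rather than the attribute itself. The attribute, the salt handling, and the verification flow are assumptions for illustration, not a technique prescribed by the article.

```python
# A minimal sketch of cryptographic data minimisation: the controller stores a
# commitment to an attribute rather than the attribute, so it need not be
# trusted with the raw value. All names and the flow here are illustrative.
import hashlib
import secrets

def commit(attribute: str) -> tuple[str, bytes]:
    """User-side: commit to an attribute; keep the salt, send only the digest."""
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + attribute.encode()).hexdigest()
    return digest, salt

def verify(digest: str, attribute: str, salt: bytes) -> bool:
    """Controller-side: check a later disclosure against the stored commitment."""
    return hashlib.sha256(salt + attribute.encode()).hexdigest() == digest

# The controller stores only the digest (no personal attribute at rest);
# the user discloses (attribute, salt) only if and when verification is needed.
stored_digest, user_salt = commit("date_of_birth=1990-01-01")
assert verify(stored_digest, "date_of_birth=1990-01-01", user_salt)
```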

10 citations


Journal ArticleDOI
TL;DR: The Protection of Personal Information Act (POPIA) is the first comprehensive data protection regulation passed in South Africa, giving effect to the right to informational privacy derived from the constitutional right to privacy.
Abstract:
• The Protection of Personal Information Act (POPIA) [No. 4 of 2013] is the first comprehensive data protection regulation to be passed in South Africa, and it gives effect to the right to informational privacy derived from the constitutional right to privacy.
• It is due to come into force in 2020, and seeks to regulate the processing of personal information in South Africa, regulate the flow of personal information across South Africa’s borders, and ensure that any limitations on the right to privacy are justified and aimed at protecting other important rights and interests.
• Although it was not drafted with health research in mind, POPIA will have an impact on the sharing of health data for research, in particular biorepositories.
• It is now timely to consider the impact of POPIA on biorepositories, and the necessary changes to their access and sharing arrangements prior to POPIA coming into force.

9 citations


Journal ArticleDOI
TL;DR: The authors present a practical approach for a DPIA of automated decision-making under the General Data Protection Regulation (GDPR) that detects potential impacts on fundamental rights by identifying the fundamental rights potentially at risk, identifying risks in ADM systems at the design stage and during operation, balancing fundamental rights risks against controller interests, and establishing to what extent data subjects exercise control over the data processing.
Abstract: Companies expect significant benefits from automated decision-making (ADM) with personal data; however, research indicates that legal uncertainty exists among private controllers about the interpretation of the provisions relevant to automated decision-making under the General Data Protection Regulation (GDPR). Article 35 GDPR obliges private controllers to carry out a Data Protection Impact Assessment (DPIA) prior to deploying automated decisions on humans, and assessing potential fundamental rights impacts is part of that DPIA. The objective of this article is to provide private controllers with a practical approach for a DPIA of automated decision-making that detects potential impacts on fundamental rights. The approach indicates levels of impact and the types of measures a controller should consider to achieve appropriate risk management. The impact assessment is based on four benchmarks: (i) identifying fundamental rights potentially at risk; (ii) identifying risks occurring in the ADM system at the design stage and during operation; (iii) balancing fundamental rights risks against the controller interests involved; and (iv) establishing to what extent data subjects exercise control over the data processing. By responding to the benchmarks, controllers identify risk levels that indicate the types of measures that should be considered to achieve fundamental-rights-compliant ADM. This approach enables controllers to account to data subjects and supervisory authorities for the risk management envisaged for potential impacts on fundamental rights, and seeks to foster compliant, fair, and transparent automated decision-making.
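
Purely as an illustration of how the four benchmarks might drive risk management, the sketch below scores each benchmark and maps the worst score to a type of measure. The scoring scale and the measure mapping are invented here and do not reproduce the authors' method.

```python
# Hypothetical encoding of the article's four DPIA benchmarks for an ADM
# system. Benchmark names follow the abstract; scores and measures are invented.
BENCHMARKS = [
    "fundamental rights potentially at risk",
    "risks in the ADM system at design stage and during operation",
    "balance of fundamental rights risks and controller interests",
    "extent of data subject control over the processing",
]

# Assessor assigns each benchmark a level: 0 = low, 1 = medium, 2 = high.
assessment = {BENCHMARKS[0]: 2, BENCHMARKS[1]: 1, BENCHMARKS[2]: 1, BENCHMARKS[3]: 2}

def overall_level(scores: dict[str, int]) -> str:
    """Aggregate conservatively: the highest single benchmark drives the outcome."""
    worst = max(scores.values())
    return {0: "low", 1: "medium", 2: "high"}[worst]

MEASURES = {
    "low": "document the assessment; monitor in operation",
    "medium": "add safeguards (human review, transparency notices); re-assess",
    "high": "consult the supervisory authority (Art. 36 GDPR) before deployment",
}

level = overall_level(assessment)
print(f"Overall risk level: {level} -> {MEASURES[level]}")
```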

8 citations


Journal ArticleDOI
TL;DR: The authors juxtapose the EU's governance of the fundamental rights to privacy and data protection with its external trade policy on cross-border data flows, and conclude that the process of aligning the EU's normative approach to personal data protection with its trade policy has been, until very recently, riddled with contradictions.
Abstract: Key Points:
* Global data flows underpinning cross-border digital trade have moved centre stage in international trade negotiations. New trade law disciplines on the free flow of data are included in a number of international trade deals.
* The European Union (EU) has a key role to play in the global governance of the protection of personal data. The EU’s strict data protection regime has sometimes been framed as a digital trade barrier.
* This article juxtaposes the EU’s governance of the fundamental rights to privacy and data protection with its external trade policy on cross-border data flows.
* The process of aligning the EU’s normative approach to personal data protection with its external trade policy has been, until very recently, riddled with contradictions.
* The article concludes with an assessment of the EU’s recent horizontal strategy on cross-border data flows and personal data protection in trade and investment agreements, which aims to align EU external policy.

7 citations


Journal ArticleDOI
TL;DR: The adoption of the General Data Protection Regulation (GDPR) gave birth to a new EU body, the European Data Protection Board (EDPB), which superseded the former advisory Article 29 Working Party (WP29).
Abstract: The adoption of the General Data Protection Regulation (GDPR) gave birth to a new EU body, the European Data Protection Board (EDPB), superseding the former advisory Article 29 Working Party (WP29). The institutional reform was one of the key pillars of the EU data protection overhaul carried out in 2012-2016, but also one of the aspects that proved most difficult to agree upon in the interinstitutional negotiations on the GDPR, in particular in the internal talks of the Council of Ministers. Unlike the original proposals, the final text entrusted the EDPB with certain legally binding powers, aligning it with some facets of the EU agencies, but the delegation of these powers was cautious and intended to be triggered only as a last-resort avenue ‘in individual cases’. In light of these dynamics, can the EDPB be viewed as a step towards building an EU-level data protection agency, especially from a political point of view? And how distant is the prospect of a fully-fledged EU-level data protection regulator, considering that the current ethos of EU data protection enforcement emphasises a decentralised mode of governance, with a significant amount of decision-making competence remaining at the national level?

1 citation