Showing papers in "International Data Privacy Law in 2018"


Journal ArticleDOI
TL;DR: This article examines a range of long-term data collections conducted by researchers in social science in order to identify the characteristics of these programs that drive their unique sets of risks and benefits, and examines the practices that have been established by social scientists to protect the privacy of data subjects.

54 citations


Journal ArticleDOI
TL;DR: There is an important role for data subject empowerment tools in a hyper-complex, automated and ubiquitous data-processing ecosystem; even if only used marginally, they provide a checks-and-balances infrastructure overseeing controllers' processing operations, both on an individual basis as well as collectively.
Abstract: The right of access occupies a central role in EU data protection law's arsenal of data subject empowerment measures. It can be seen as a necessary enabler for most other data subject rights, as well as playing an important role in monitoring operations and (en)forcing compliance. Despite some high-profile revelations regarding unsavoury data processing practices over the past few years, access rights still appear to be underused and not properly accommodated. It is especially this last hypothesis we tried to investigate and substantiate through a legal empirical study. During the first half of 2017, around sixty information society service providers were contacted with data subject access requests. Eventually, the study confirmed the general suspicion that access rights are by and large not adequately accommodated. The systematic approach did allow for a more granular identification of key issues and broader problematic trends. Notably, it uncovered an often-flagrant lack of awareness, organisation, motivation, and harmonisation. Despite the poor results of the empirical study, we still believe there to be an important role for data subject empowerment tools in a hyper-complex, automated and ubiquitous data-processing ecosystem. Even if only used marginally, they provide a checks-and-balances infrastructure overseeing controllers' processing operations, both on an individual basis as well as collectively. The empirical findings also allow us to identify concrete suggestions aimed at controllers, such as relatively easy fixes in privacy policies and access rights templates.

45 citations


Journal ArticleDOI
TL;DR: It is shown that some DPbD strategies deployed by large data controllers result in personal data which, despite remaining clearly reidentifiable by a capable adversary, make it difficult for the controller to grant data subjects rights over that data for the purposes of managing this risk.
Abstract:
• Data Protection by Design (DPbD), a holistic approach to embedding principles in technical and organisational measures undertaken by data controllers, building on the notion of Privacy by Design, is now a qualified duty in the GDPR.
• Practitioners have seen DPbD less holistically, instead framing it through the confidentiality-focussed lens of Privacy Enhancing Technologies (PETs).
• While focussing primarily on confidentiality risk, we show that some DPbD strategies deployed by large data controllers result in personal data which, despite remaining clearly reidentifiable by a capable adversary, make it difficult for the controller to grant data subjects rights (eg access, erasure, objection) over it for the purposes of managing this risk.
• Informed by case studies of Apple’s Siri voice assistant and Transport for London’s Wi-Fi analytics, we suggest three main ways to make deployed DPbD more accountable and data subject–centric: building parallel systems to fulfil rights, including dealing with volunteered data; making inevitable trade-offs more explicit and transparent through Data Protection Impact Assessments; and through ex ante and ex post information rights (arts 13–15), which we argue may require the provision of information concerning DPbD trade-offs.
• Despite steep technical hurdles, we call both for researchers in PETs to develop rigorous techniques to balance privacy-as-control with privacy-as-confidentiality, and for DPAs to consider tailoring guidance and future frameworks to better oversee the trade-offs being made by primarily well-intentioned data controllers employing DPbD.

41 citations
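The tension this article describes, between confidentiality-oriented DPbD and the ability to service data subject rights, is easiest to see with a toy pseudonymisation pipeline. The sketch below is not drawn from the paper or from any real deployment: it assumes a salted hash of device MAC addresses, loosely in the spirit of Wi-Fi analytics systems, and the salt, MAC addresses, and station names are all hypothetical.

```python
# Illustrative sketch only: salted hashing of device identifiers can leave data
# re-identifiable to a capable adversary while making it awkward for the
# controller to locate one subject's records for an access or erasure request.
# The salt, MAC addresses, and station names below are hypothetical.

import hashlib

SALT = b"2017-campaign-salt"  # hypothetical per-campaign salt

def pseudonymise(mac: str) -> str:
    """Controller-side pseudonymisation of a device MAC address."""
    return hashlib.sha256(SALT + mac.encode()).hexdigest()[:16]

# Analytics records are stored keyed only by the pseudonym.
records = {pseudonymise("AA:BB:CC:DD:EE:01"): ["station_A", "station_B"]}

# To answer an access request, the controller needs the subject's original MAC
# and must still hold the salt before any lookup is possible.
subject_mac = "AA:BB:CC:DD:EE:01"
print(records.get(pseudonymise(subject_mac)))  # ['station_A', 'station_B']

# Yet an adversary who knows the salt and can enumerate candidate MACs can
# recompute the same pseudonyms, so the data remains re-identifiable.
```

The trade-off the sketch exposes is the one the article flags: dropping the raw identifier only partially protects confidentiality, while simultaneously making rights such as access and erasure harder to fulfil, which is why the authors argue for parallel rights-fulfilment systems and more transparent documentation of the trade-offs.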


Journal ArticleDOI
TL;DR: In this paper, the authors identify five fears that the rise of mass predictive personalisation may portend for collective values and commitments, which can ultimately be attributed to the systematic use of digital profiling techniques that apply machine learning algorithms to merged sets of data collected from the digital traces generated from continuously tracking users' on-line behaviour to make calculated predictions about individuals across a population.
Abstract: The starting point for this article is the observation that data-driven service delivery is catalysing a change in modes of production and consumption, marked by a move away from ‘mass production’ in favour of ‘mass predictive personalisation’. Despite the portrayal of personalisation as ‘empowering’ consumers, I identify five fears that the rise of mass predictive personalisation may portend for collective values and commitments. The first three fears are largely concerned with the values of fairness and justice, and can ultimately be attributed to the systematic use of digital profiling techniques that apply machine learning algorithms to merged sets of data, collected from the digital traces generated by continuously tracking users’ on-line behaviour, to make calculated predictions about individuals across a population. The remaining two fears coalesce around concerns for social solidarity and the loss of community that may be associated with the increasing personalisation of services and offerings, which is both fuelling, and being fuelled by, the increasingly narcissistic mindset that mass personalisation makes possible. In so doing, my aim is to provoke critical discussion and reflection that will motivate more penetrating research and place questions of this kind more firmly onto the academic, policy, public and political agenda.

28 citations
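The profiling mechanism this article critiques can be made concrete with a minimal sketch, not taken from the paper: behavioural traces from several trackers are merged into per-user profiles, and each individual is scored by a simple weighted model standing in for a learned prediction algorithm. All feature names, weights, and counts below are invented.

```python
# Minimal, hypothetical sketch of predictive personalisation: merge tracking
# data from separate sources per user, then score each individual.
from collections import defaultdict

web_clicks   = {"user_1": 42, "user_2": 3}   # tracked browsing events
app_sessions = {"user_1": 10, "user_2": 1}   # tracked app usage
purchases    = {"user_1": 2,  "user_2": 0}   # loyalty-card history

def merge_profiles(*sources):
    """Merge per-user counts from independent trackers into one profile."""
    profiles = defaultdict(dict)
    for name, source in sources:
        for user, value in source.items():
            profiles[user][name] = value
    return profiles

def propensity(profile, weights):
    """Score an individual; a stand-in for a learned prediction model."""
    return sum(weights[k] * profile.get(k, 0) for k in weights)

profiles = merge_profiles(("clicks", web_clicks),
                          ("sessions", app_sessions),
                          ("purchases", purchases))
weights = {"clicks": 0.02, "sessions": 0.05, "purchases": 0.4}

for user, profile in profiles.items():
    # The resulting score drives which "personalised" offer the user sees.
    print(user, round(propensity(profile, weights), 2))
```

Even this toy version shows why the merging step matters: each source on its own is unremarkable, but combined they support calculated predictions about every individual in a population, which is the basis of the fairness and solidarity concerns the article raises.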


Journal ArticleDOI
TL;DR: Tensions occur where competition enforcement favours sharing or merging of datasets for economic efficiency reasons against the spirit of the data protection rules, but there is also room for synergies by involving data protection or consumer authorities in merger investigations and by considering data Protection or consumer interests more proactively in the design of merger remedies.
Abstract:
• Since the notion of fairness underpins the regimes of competition, data protection and consumer law, it can act as a connecting factor to align substantive protections and enforcement mechanisms in the three fields.
• While most attention has so far been devoted to how vigorous competition enforcement can render data protection rules more effective, the complementarity between the regimes also works the other way around.
• In particular, substantive data protection or consumer law principles can be integrated into competition analysis so as to strengthen the ability of competition authorities to tackle new forms of commercial conduct.
• At the same time, the competition concepts of market definition and market power can help to interpret the scale of obligations that data controllers and processors have to comply with under data protection law in line with the risk-based approach of the General Data Protection Regulation.
• Tensions occur where competition enforcement favours sharing or merging of datasets for economic efficiency reasons against the spirit of the data protection rules, but there is also room for synergies by involving data protection or consumer authorities in merger investigations and by considering data protection or consumer interests more proactively in the design of merger remedies.

15 citations


Journal ArticleDOI
TL;DR: In this article, the authors argue that uncertainty over which regime of the dual EU data protection framework applies when private entities and law enforcement act as joint controllers may seriously undermine the legitimacy of public-private partnerships, unless private parties are given the status of competent authorities or controllership within PPPs is assigned in a special legal act, and they recommend that legislative measures creating exemptions subject private-public data transfers to the same conditions of legality of processing as the processing by competent authorities.
Abstract: The legitimacy of public-private partnerships for combatting cybercrime partially depends on whether law enforcement data processing activities are subject to the same data protection-related restrictions regardless of whether they involve the cooperation of private parties. Information sharing within PPPs is a complex phenomenon with various configurations and power structures. This complexity needs to be accounted for in the analysis of the applicability of the two data protection regimes. The GDPR as a general data protection instrument and the Police Directive as a lex specialis are meant to leave no space for private-public data transfers to fall through the cracks. However, which legal regime applies when private entities and law enforcement act as joint controllers is a grey area of the dual EU data protection regime and may seriously undermine the legitimacy of PPPs, unless private parties are given the status of competent authorities or controllership within PPPs is assigned in a special legal act. Private parties may be subject to fewer data protection restrictions, e.g. being exempted from the purpose limitation principle, when collaborating with law enforcement. This may motivate public law enforcement to actively seek such collaboration to avoid constraints imposed on them by law. It is recommended that the legislative measures creating such exemptions subject private-public data transfers to the same conditions of legality of processing as the processing by competent authorities.

Journal ArticleDOI
TL;DR: These arguments show that antitrust has an important, but narrow, role in privacy protection, and that antitrust can be the right tool to fight anti-competitive effects.
Abstract: In the 21st century, it has become virtually impossible to meaningfully participate in society without revealing our personal data. Many of the most necessary, entertaining, and useful internet services demand personal data that are then used for targeted advertisements as a condition of use. Service providers follow us around the Internet and across devices to show us ads and to collect more data. Credit rating agencies and financial institutions determine our access to mortgages and car loans based on the data they relentlessly collect from as many sources as possible. And even supermarkets indefinitely collect and store our payment and delivery information and our shopping history. Many consumers are unsatisfied with this state of affairs. Some find it abusive that their privacy is the price to pay for access to socially or economically unavoidable internet platforms. Others hate to be paying twice for their internet service, both with their money and with their personal information. And all are outraged by data breaches, hacks, revelations of corporate and state surveillance, and other social and political scandals. Consumers in the USA, the European Union (EU), and elsewhere want more control over their personal data, and they demand privacy protection. Many solutions have been put forward to defend consumers’ privacy. The proponents of antitrust as a privacy remedy provide a variety of rationales, often interrelated, which this paper will briefly explore. One is that dominant platforms can impose abusive terms on their users, who then have no way of leaving the service because network effects leave them effectively locked in. Another rationale is that antitrust promotes consumer welfare by ensuring consumer choice and that antitrust enforcement should guarantee that there is non-price competition, including in different levels of privacy protection. A third one suggests that companies should be held accountable under antitrust law when they mislead or deceive consumers about the personal data collection practices that helped them achieve monopoly power. And finally, there are those who believe that the possession of personal data should be viewed as a potential barrier to future competition and considered during merger review, even when a merger wouldn’t otherwise have significant vertical or horizontal competitive effects. The above arguments show that antitrust has an important, but narrow, role in privacy protection. We agree that antitrust should encourage non-price competition, including different levels of privacy protection, and that antitrust can be the right tool to fight anti-competitive effects.

Journal ArticleDOI
TL;DR: In this article, the authors examine how market power in the underlying services that generate data impacts competition in data privacy and whether the proxies for assessing market power, in these underlying services, cater to data privacy interests.
Abstract: Firms compete by offering consumers lower prices but also high-quality products and a wide range of choices. With the increasing commercialization of personal data, there is a growing consensus that the level of privacy protection and deployment of Privacy Enhancing Technologies (PETs) could be subject to competition, as an element of quality, choice or innovation. A case in point is the recognition by the European Commission that data privacy constitutes a key parameter of non-price (quality) competition in markets for consumer communications and professional social networks. This development signifies that market power may be exerted by reducing the level of data privacy and foreclosing competition on PETs deployment. Despite this, how market power affects competition on privacy and PETs remains unclear. This is partially because microeconomic theory offers little help in predicting how market power, or the lack thereof, affects quality (including choice and innovation). The aim of this article is to examine how market power in the underlying services that generate data impacts competition in data privacy and whether the proxies for assessing market power in these underlying services cater to data privacy interests. To this end, the article first highlights some emerging but inconclusive literature shedding light on the link between market structure and competition in data privacy. Secondly, it identifies and discusses the structural and behavioural considerations that might hinder effective competition through data privacy and PETs. Finally, it examines the role that competition law can play in promoting and maintaining such competition.



Journal ArticleDOI
TL;DR: In this paper, the authors propose a strategy combining precautionary measures, public discourse, and enforcement until the risks are more completely understood, using insights from complex systems science to better understand these risks.
Abstract: The GDPR poses special requirements for the processing of sensitive data, but it is not clear whether these requirements are sufficient to prevent the risk associated with this processing because this risk is not clearly defined. Furthermore, the GDPR’s clauses on the processing of—and profiling based on—sensitive data do not sufficiently account for the fact that individual data subjects are parts of complex systems, whose emergent properties betray sensitive traits from non-sensitive data. The algorithms used to process big data are largely opaque to both controllers and data subjects: if the output of an algorithm has discriminatory effects coinciding with sensitive traits because the algorithm accidentally discerns an emergent property, this may remain unnoticed. At the moment, there are no remedies that can prevent the discovery of sensitive traits from non-sensitive data. Managing the risks resulting from processing data that can reveal sensitive traits requires a strategy combining precautionary measures, public discourse, and enforcement until the risks are more completely understood. Insights from complex systems science are likely to be useful in better understanding these risks.
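The "emergent property" risk described here can be illustrated with a small synthetic example, which is not from the article: a decision rule that only ever sees a non-sensitive feature still produces outcomes that track a correlated sensitive trait, and the effect only becomes visible if outcomes are audited against that trait. The postcode feature, correlation strength, and population below are all invented.

```python
# Synthetic, hypothetical illustration: a rule using only "non-sensitive" data
# can reproduce a sensitive trait when that data acts as an emergent proxy.
import random
random.seed(0)

# Build a toy population where postcode correlates with a sensitive trait.
people = []
for _ in range(1000):
    trait = random.random() < 0.5
    in_a = random.random() < (0.8 if trait else 0.2)
    people.append({"postcode": "A" if in_a else "B", "sensitive_trait": trait})

def decide(person):
    """Decision rule that never sees the sensitive trait."""
    return person["postcode"] == "A"  # e.g. show the premium offer

# Audit outcomes against the sensitive trait the rule never used.
rates = {}
for group in (True, False):
    members = [p for p in people if p["sensitive_trait"] == group]
    rates[group] = sum(decide(p) for p in members) / len(members)

print(rates)  # markedly different outcome rates for the two groups
```

Because neither the controller nor the data subject sees the correlation unless someone runs this kind of audit, the discriminatory effect can remain unnoticed, which is the gap the article's precautionary strategy is meant to address.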


Journal ArticleDOI
TL;DR: An argument is made that it would be best for the competition authorities and information regulator to enter into a formal cooperation agreement in order to ensure that the potential anti-competitive uses of data are best regulated while avoiding an unnecessary layer of additional regulation for data processors.
Abstract: A brief overview of the respective frameworks for competition and data protection law in South Africa is provided, followed by examples of where convergence between the two occurs. An argument is made that it would be best for the competition authorities and the information regulator to enter into a formal cooperation agreement in order to manage this convergence, ensuring that potential anti-competitive uses of data are properly regulated while avoiding an unnecessary layer of additional regulation for data processors.