
Showing papers in "European Data Protection Law Review in 2022"


Journal ArticleDOI
TL;DR: In this article, the authors analyze the scope of data covered by Art. 20 GDPR, the conditions of its execution, and its practicality with respect to the transaction costs involved.
Abstract: Restrictions of data access for complementary services in digital (IoT) ecosystems are an increasing concern in the legal and economic discussion around the interface between competition law and data protection. The connected car and its ecosystem of innovative complementary products and services exemplify this problem. Car manufacturers (OEMs) enjoy exclusive control over most in-vehicle data and thus a gatekeeper position that allows them to control complementary markets. One of a number of potential solutions to this problem is the application of the right to data portability of the General Data Protection Regulation (GDPR). This paper shows the difficulties of solving this data access problem through Art. 20 GDPR. In particular, we analyze the scope of data covered by Art. 20 GDPR, the conditions of its execution, and its practicality with respect to the transaction costs involved. Our findings suggest that Art. 20 GDPR is insufficient to solve the data access problem in the ecosystem of connected cars. Keywords: Data Portability | Data Access | Data Protection Law | Competition Policy | Connected Cars | PSD2 | Consumer Data Rights

2 citations
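
Article 20(1) GDPR requires that portable data be provided 'in a structured, commonly used and machine-readable format'. As a purely illustrative sketch (not taken from the paper), the following Python snippet shows what such an export could look like for hypothetical in-vehicle trip data; all field and function names are invented for the example.

import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class TripRecord:
    # Hypothetical subset of in-vehicle data provided by the data subject.
    started_at: str
    ended_at: str
    distance_km: float
    average_speed_kmh: float

def export_portable_data(subject_id: str, trips: list) -> str:
    # Serialise the data subject's records into a structured,
    # commonly used and machine-readable format (here: JSON).
    payload = {
        "data_subject": subject_id,
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "trips": [asdict(t) for t in trips],
    }
    return json.dumps(payload, indent=2)

if __name__ == "__main__":
    sample = [TripRecord("2022-03-01T08:00:00Z", "2022-03-01T08:40:00Z", 32.5, 48.7)]
    print(export_portable_data("subject-123", sample))

The paper's point, however, is that the obstacle is not the serialisation format but the limited scope of data 'provided by' the data subject and the transaction costs of exercising the right.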


Journal ArticleDOI
TL;DR: In this paper, the authors propose a reflective exercise in which they look at the concept of accountability and how it was introduced in the GDPR, and find a systematic and process-oriented accountability present in the Regulation.
Abstract: Transparency has been at the centre of the debate on algorithmic governance. However, when the GDPR was adopted in 2016, the legislator preferred to establish accountability as the core of the Regulation's principles, rather than transparency. Unfortunately, accountability does not yet seem to be playing the role it was assigned in the data protection ecosystem, at least when it comes to algorithmic decision-making. To turn this scenario around, we propose a reflective exercise in which we look at the concept of accountability and how it was introduced in the GDPR. By emphasising the human element in algorithmic decision-making, we find a systematic and process-oriented accountability present in the GDPR. Following arguments already made in the literature, we hold that this kind of accountability is well suited for algorithmic governance. Moreover, we argue that it could be strengthened by the Commission's proposal for a Regulation on Artificial Intelligence. Keywords: Accountability | Transparency | GDPR | Algorithmic Decision-Making | Artificial Intelligence

2 citations







Journal ArticleDOI
TL;DR: In this paper, the authors investigate whether apps popular among children have implemented adequate methods to meet age verification obligations by analysing the registration process of these apps, and find that the level of assurance of age was too low or the method was not privacy-friendly or inclusive.
Abstract: The GDPR provides a high level of protection for children's personal data. This protection is reflected in various recitals and provisions in the GDPR and results in two practical challenges. First, the (implicit) need for age verification, or at least for determining whether someone is not a child. This obligation applies as soon as personal data are processed and it cannot be ruled out that the personal data of children are among them. Secondly, in the more specific case that consent is one of the lawful grounds, there is a need to verify that consent has been given by a parent (or guardian) when a data subject has not reached the age of digital consent. In our research, we studied whether apps popular among children have implemented adequate methods to meet both verification obligations by analysing the registration process of these apps. None of the apps we investigated met the requirements of the GDPR at the time of our research. The level of assurance of age was too low or the method was not privacy-friendly or inclusive. Parental consent mechanisms were generally lacking, even though consent was a lawful ground in all apps we investigated. If a parent was involved in the registration process, the actual verification of that status was mostly missing. Keywords: Data Protection | Age Verification | Parental Consent
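
As a rough illustration of the two checks the study looked for in registration flows (an age check against the applicable age of digital consent and, below that age, verified parental consent), consider the following hedged Python sketch; the threshold values reflect Article 8(1) GDPR, but the function names and the verification stub are hypothetical.

from dataclasses import dataclass
from typing import Optional

# Art. 8(1) GDPR sets 16 as the default age of digital consent;
# Member States may lower it to no less than 13.
AGE_OF_DIGITAL_CONSENT = {"NL": 16, "DE": 16, "IE": 16, "ES": 14, "DK": 13}

@dataclass
class Registration:
    country: str
    declared_age: int
    parental_consent_token: Optional[str] = None

def parental_consent_is_verified(token: Optional[str]) -> bool:
    # Placeholder for a real verification method; mere self-declaration
    # gives the low level of assurance the study criticises.
    return token is not None and token.startswith("verified:")

def consent_may_be_relied_on(reg: Registration) -> bool:
    # Consent is only a valid lawful ground if the user has reached the
    # age of digital consent or a parent/guardian has verifiably consented.
    threshold = AGE_OF_DIGITAL_CONSENT.get(reg.country, 16)
    if reg.declared_age >= threshold:
        return True
    return parental_consent_is_verified(reg.parental_consent_token)

Even such a check presupposes that the declared age is reliable, which is the level-of-assurance problem the authors report finding in practice.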


Journal ArticleDOI
TL;DR: In this article, a risk-based approach is suggested in which different degrees of re-identifiability are recognized, as opposed to an overly simplistic binary approach to the applicability of the Regulation.
Abstract: In the EU, the processing of personal data is subject to strict rules laid out in the GDPR. The GDPR defines personal data as 'any information relating to an identified or identifiable natural person'. Hence, if the definition applies, the data falls within the scope of the GDPR. If data is anonymised, however, and as such does not constitute 'personal data', the GDPR is not applicable. This makes the definition of personal data very important, or rather the determination of when, according to the GDPR, data should be seen as personal data. This approach of the GDPR, where data is either in scope of the Regulation or not, can be considered a binary or 'black and white' approach to its applicability. This article describes the advances in technology of the big data era and how these advances make it impossible to adequately draw the distinction between personal and non-personal data. It focusses on the definition of personal data and discusses the shortcomings of the binary approach. Specific attention is paid to recital 26 GDPR, which provides some guidance for determining whether a natural person is identifiable. The difficulty of the binary approach to the definition of personal data is presented against the background of the technological developments of the last decades and the years to come (the big data era), as these developments challenge the adequate demarcation of personal data even further and demand a better approach. The article concludes with reflections on what will be a sustainable approach to the demarcation of the scope of data protection and, as such, a maintainable and preferable basis for privacy protection in the future. A risk-based approach is suggested in which different degrees of re-identifiability are recognized, as opposed to an overly simplistic binary approach. Keywords: Anonymisation | Big Data | Data Elimination | Definition | GDPR | Personal Data
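
One common way to express a degree of re-identifiability, rather than a yes/no answer, is k-anonymity over quasi-identifiers. The article does not prescribe this particular metric; the sketch below, with invented column names, is only meant to illustrate what a graded, risk-based measure could look like in practice.

from collections import Counter

def k_anonymity(records, quasi_identifiers):
    # Size of the smallest group of records sharing the same combination
    # of quasi-identifier values; a low k means individuals are easier to
    # single out, i.e. a higher degree of re-identifiability.
    groups = Counter(
        tuple(rec[qi] for qi in quasi_identifiers) for rec in records
    )
    return min(groups.values()) if groups else 0

records = [
    {"postcode": "1011", "age_band": "30-39", "diagnosis": "A"},
    {"postcode": "1011", "age_band": "30-39", "diagnosis": "B"},
    {"postcode": "1012", "age_band": "40-49", "diagnosis": "A"},
]
print(k_anonymity(records, ["postcode", "age_band"]))  # prints 1: one record is unique

A threshold on such a metric could be one way to give effect to recital 26's 'means reasonably likely to be used' test without forcing a binary answer.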


Journal ArticleDOI
TL;DR: Case note on Joined Cases C-511/18, C-512/18 and C-520/18 La Quadrature du Net and Others [2020] ECLI:EU:C:2020:791 and Case C-623/17 Privacy International [2020] ECLI:EU:C:2020:790.
Abstract: Joined Cases C-511/18, C-512/18 and C-520/18 La Quadrature du Net and others [2020] ECLI:EU:C:2020:791; Case C-623/17 Privacy International [2020] ECLI:EU:C:2020:790.

Journal ArticleDOI
TL;DR: Intra-EU, this means that codes can not only specify the abstract principles of the GDPR for specific sectors, but that they can ‘help to bridge the harmonisation gaps that may exist between Member States in their application of data protection law’.
Abstract: Codes of conduct remain an underlit aspect of the EU’s General Data Protection Regulation (GDPR), which can be found in articles 40 and 41. Member States, supervisory authorities, the Board and the Commission shall encourage the drawing up of such codes ‘intended to contribute to the proper application of this Regulation’ (article 40(1) GDPR). In essence they shall ‘give operational meaning to the principles of data protection’, i.e. help to translate the abstract provisions of the GDPR into concrete obligations and procedures. The GDPR designs a system consisting of three types of codes, each with a different territorial reach. Article 40(5) and (6) describe ‘national codes of conduct’, which the European Data Protection Board (EDPB) has more clearly defined as ‘a code which covers processing activities contained in one Member State’. Article 40(7) sets out the possibility of a transnational code, which is ‘a code which covers processing activities in more than one Member State’. Finally, article 40(9) speaks of ‘codes having general validity’. Although this type of code is not further defined in the GDPR or in guidance issued by the EDPB, it is implied that a code of this type can cover processing activities in all EU Member States – going one step further than transnational codes, which may apply in several but not necessarily all Member States. Intra-EU, this means that codes can not only specify the abstract principles of the GDPR for specific sectors, but that they can ‘help to bridge the harmonisation gaps that may exist between Member States in their application of data protection law’. However, codes also have an important extra-EU function. The GDPR’s Chapter V sets out the rules for international data transfers, i.e. the transfer of personal data from the Union to recipients in third countries or to international organisations. These data flows have a dedicated chapter in the GDPR since the European legislator wants to safeguard that the level of protection of natural persons ensured in the EU is not undermined when the data leaves the EU’s territory. Chapter V of the GDPR is particularly notable because it constructs what Kuner describes as a three-tiered structure. Adequacy decisions are placed at the top of the hierarchy, appropriate safeguards – including the use of codes of conduct – in the middle, and derogations for specific situations at the bottom.

Journal ArticleDOI
Abstract: Nomadic Pastoralism among the Mongol Herders: Multispecies and Spatial Ethnography in Mongolia and Transbaikalia Charlotte Marchina Amsterdam: Amsterdam University Press, 2021, 178pp., 34 figures (incl. maps), 14 photographs (B+W). ($101.03 / €89). ISBN: 978-94-6372-142-4.



Journal ArticleDOI
TL;DR: In this paper, a systematic perspective that connects Article 22 GDPR to the future European Artificial Intelligence Act and a thorough teleological interpretation of Article 22 GDPR are presented; with a few notable exceptions, comprehensive analyses of all relevant arguments are still missing.
Abstract: Article 22 of the GDPR can either be understood as a prohibition or as a data subject right. How one decides has far-reaching consequences both for the persons concerned and for the companies or public authorities that use such decision-making systems. Scholars have presented different arguments in favour of one or the other interpretation. However, with a few notable exceptions, comprehensive analyses of all relevant arguments are still missing. This paper contributes to filling this gap and attempts to complete the picture by adding two novel and important aspects: first, a systematic perspective that connects Article 22 GDPR to the future European Artificial Intelligence Act; and second, a thorough teleological interpretation of Article 22 GDPR. Both these aspects tip the balance in favour of a data subject right.






Journal ArticleDOI
TL;DR: In this paper, the authors highlight how the Data Governance Act states that its measures are designed to fully respect the GDPR as a starting point, but argue that, when examining the notion of consent, true GDPR compliance may be an unobtainable goal, or at least an unscalable one, in some contexts of data exchange relationships.
Abstract: The European Union strives to keep its data economy competitive and fit for the future. The proclamation of data as the ‘new oil’ requires envisioning new ways to make this ‘oil’ available to data-driven industries. The recently adopted Data Governance Act (DGA) is a tool that increases the possibility of data flows towards data-driven industries, while simultaneously promising to maintain uncompromised data protection standards for individuals. The DGA sets the legislative framework for Data Sharing Services or Data Intermediaries. These services stand in between data subjects and data users, and serve as actors that bring the demand and supply sides of data together. When handling personal data, the Data Governance Act pivots on several notions from the GDPR, for instance that of consent. This raises the question of whether the notion of consent functions in the DGA in the manner in which it was envisioned to function in the GDPR. A strict reading of the notion of consent makes its application in the structure of the Data Governance Act difficult to imagine, for reasons explored in this paper. Most pressing are the elements that make up the notion of consent: that consent should be specific, freely given and informed. These three elements are put under strain in the DGA’s multi-party, data-pool, or data exchange relationships. This paper highlights how the Data Governance Act states that its measures are designed to ‘fully’ respect the GDPR as a starting point. However, when examining the notion of consent, true GDPR compliance may be an unobtainable goal, or at least an unscalable one, in some contexts of data intermediary services. Keywords: Consent | Data Governance Act | Data Intermediaries | Data Pooling | Data Holders | Data Users | European Strategy for Data
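
The strain on 'specific' and 'informed' consent in a data pool can be made concrete with a small, purely hypothetical data model: if consent must name a specific data user and a specific purpose, the number of consent records grows with every new pool member and purpose, which is the scalability problem the paper points to. All names below are invented for illustration.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    # One record per (data user, purpose) pair, reflecting the GDPR
    # requirement that consent be specific, informed and freely given.
    data_subject: str
    data_user: str        # a named recipient, not "any future pool member"
    purpose: str          # a specific purpose, not "data sharing in general"
    notice_version: str   # the information the consent was based on
    given_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def consent_records_needed(data_users, purposes):
    # In a multi-party data pool, specific consent scales with the number
    # of data users times the number of purposes.
    return len(data_users) * len(purposes)

print(consent_records_needed(["user-A", "user-B", "user-C"], ["research", "product-improvement"]))  # 6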

Journal ArticleDOI
TL;DR: In this article, the authors discuss privacy laws in online contracts from the Shari’ah perspective and explore the extent of the sanctity given to the right to privacy in Islamic law.
Abstract: The aim of the article is to discuss privacy laws in online contracts from the Shari’ah perspective. While much has been written on this subject from the modern perspective, few scholars have applied the Shari’ah lens. Privacy is the very soul of being human. It is not only about keeping personal information confidential but also about creating a trusted framework for collecting, exchanging, and using personal data in online contracts. Islamic law, on the basis of the holy Qur’an and Sunnah, gives great significance to the right of privacy. This fundamental human right is also recognized by modern Western jurisdictions. The right to privacy and data protection for e-consumers in online contracts is crucial for the functioning of online commerce. This paper attempts to explore the extent of the sanctity given to the right to privacy in Islamic law. Although the paper is essentially about privacy law in Islam, reference is also made to European law on the same subject. The paper will show that, in this regard, Islamic principles are not at odds with modern principles of privacy. Keywords: Data Protection | E-consumer Rights | Privacy | Islamic and European Law | Online Contracts