Data Subject Rights under the GDPR
With a Commentary through the Lens of the Data-driven Economy
HELENA U. VRABEC
Great Clarendon Street, Oxford, OX2 6DP, United Kingdom
Oxford University Press is a department of the University of Oxford. It furthers the University’s objective of excellence in research, scholarship, and education by publishing worldwide. Oxford is a registered trade mark of Oxford University Press in the UK and in certain other countries
© Helena U. Vrabec 2021
The moral rights of the author have been asserted
First Edition published in 2021
Impression: 1
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission in writing of Oxford University Press, or as expressly permitted by law, by licence or under terms agreed with the appropriate reprographics rights organization. Enquiries concerning reproduction outside the scope of the above should be sent to the Rights Department, Oxford University Press, at the address above
You must not circulate this work in any other form and you must impose this same condition on any acquirer
Crown copyright material is reproduced under Class Licence Number C01P0000148 with the permission of OPSI and the Queen’s Printer for Scotland
Published in the United States of America by Oxford University Press 198 Madison Avenue, New York, NY 10016, United States of America
British Library Cataloguing in Publication Data
Data available
Library of Congress Control Number: 2020951503
ISBN 978–0–19–886842–2
DOI: 10.1093/oso/9780198868422.001.0001
Printed and bound by CPI Group (UK) Ltd, Croydon, CR0 4YY
Links to third party websites are provided by Oxford in good faith and for information only. Oxford disclaims any responsibility for the materials contained in any third party website referenced in this work.
To my son Albert
Table of Cases

Court of Justice of the EU

Judgments

Case C-468/10 ASNEF [2011] ECLI:EU:C:2011:777 22n.44, 23
Case C-201/14 Bara and Others [2015] ECLI:EU:C:2015:638 64, 67n.18, 95
Case C-364/01 Barbier [2003] ECLI:EU:C:2003:665 22n.46
Case C-28/08 Bavarian Lager [2010] ECLI:EU:C:2010:378 53, 104n.1
Case C-582/14 Breyer [2016] ECLI:EU:C:2016:779 25n.60, 31, 34n.114
Case C-398/15 Camera di Commercio, Industria, Artigianato e Agricoltura di Lecce v Salvatore Manni [2017] ECLI:EU:C:2017:197 139–41
Joined Cases C-293/12 and C-594/12 Digital Rights Ireland and Others [2014] ECLI:EU:C:2014:238 21n.33, 119–20
Case C-40/17 Fashion ID [2019] ECLI:EU:C:2019:629 32–33
Case C-6/64 Flaminio Costa v E.N.E.L. [1964] ECLI:EU:C:1964:66 12n.69
Case C-136/17 GC and Others v Commission nationale de l’informatique et des libertés [2019] ECLI:EU:C:2019:773 139, 154n.141
Case C-507/17 Google v Commission nationale de l’informatique et des libertés [2019] ECLI:EU:C:2019:772 138
Case C-131/12 Google Spain [2014] ECLI:EU:C:2014:317 18–19, 25n.61, 116–17, 125n.109, 132–33, 134–39, 142, 146, 149, 156–57
Case C-473/12 IPI v Geoffrey Engelbert [2013] ECLI:EU:C:2013:715 92n.154, 95–96
Case C-555/07 Kücükdeveci [2010] ECLI:EU:C:2010:21
Case C-101/01 Lindqvist [2003] ECLI:EU:C:2003:596 116
Joined Cases C-465/00, C-138/01 and C-139/01 Österreichischer Rundfunk and Others [2003] ECLI:EU:C:2003:294 53
Case C-673/17 Planet49 GmbH [2019] ECLI:EU:C:2019:801 100
Case C-275/06 Promusicae [2008] ECLI:EU:C:2008:54
Case C-13/16 Rigas [2017] ECLI:EU:C:2017:336 25n.59
Case C-553/07 Rijkeboer [2009] ECLI:EU:C:2009:293 104, 117
Case C-36/75 Roland Rutili v Minister for the Interior [1975] ECLI:EU:C:1975:137
Case C-73/07 Satakunnan Markkinapörssi and Satamedia [2008] ECLI:EU:C:2008:727 53
Case C-70/10 Scarlet v Sabam [2011] ECLI:EU:C:2011:771 31n.94
Case C-362/14 Schrems [2015] ECLI:EU:C:2015:650 119n.77
Joined Cases C-203/15 and C-698/15 Tele2 Sverige and Watson and Others [2016] ECLI:EU:C:2016:970 119n.77
Joined Cases C-92/09 and C-93/09 Volker und Markus Schecke GbR, Hartmut Eifert v Land Hessen [2010] ECLI:EU:C:2010:662 22n.44, 23n.51, 94n.162
Case C-230/14 Weltimmo [2015] ECLI:EU:C:2015:639 58n.68
Case C-210/16 Wirtschaftsakademie Schleswig-Holstein [2018] ECLI:EU:C:2018:388 58n.68
Opinion 1/15 of the Court regarding Draft agreement between Canada and the European Union on the transfer and processing of Passenger Name Record data [2017] ECLI:EU:C:2017:592 72n.42
Case T-194/04 Bavarian Lager v Commission [2007] ECLI:EU:T:2007:334 (General Court) 26n.67
Commission Decisions
Google/DoubleClick (Case COMP/M.4731) Commission Decision 927 [2008] OJ C184/10
TomTom/TeleAtlas (Case COMP/M.4854) Commission Decision 1859 [2008] OJ C237/8 43–44
European Court of Human Rights
Bărbulescu v Romania App no 61496/08 (ECtHR, 12 January 2016) 20n.31
Benedik v Slovenia App no 62357/14 (ECtHR, 24 April 2018)
Koch v Germany App no 497/09 (ECtHR, 19 July 2012)
Magyar Helsinki Bizottság v Hungary App no 18030/11 (ECtHR, 8 November 2016) 21–22, 26–27
Mitev v Bulgaria App no 42758/07 (ECtHR, 29 June 2010) 184n.156
MP and Others v Bulgaria App no 22457/08 (ECtHR, 26 July 2011) 184n.156
P and S v Poland App no 57375/08 (ECtHR, 30 October 2012)
Pretty v United Kingdom App no 2346/02 (ECtHR, 29 April 2002)
S and Marper v United Kingdom App nos 30562/04 and 30566/04 (ECtHR, 4 December 2008) 18n.12
Sanles v Spain App no 48335/99 (ECtHR, 26 October 2000) 184n.156
Surikov v Ukraine App no 42788/06 (ECtHR, 26 January 2017) 18n.16
Thévenon v France App no 2476/02 (ECtHR, 28 February 2006)
National courts
United States
Roe v Wade 410 US 113 (1973)
United States v Jones 564 US 400 (2012)
The Netherlands
Judgment of the Dutch Supreme Court, 8 September 2017, ECLI:NL:PHR:2017:933
Judgment of the Amsterdam District Court, 19 July 2018, ECLI:NL:RBAMS:2018:8606
Judgment of the Hague District Court, 11 February 2020, ECLI:NL:RBDHA:2020:1013 206–7
The Decision of the Dutch Supreme Court, 24 February 2017, ECLI:NL:HR:2017:288 195n.29
The Decision of the Amsterdam District Court, 25 September 2019, ECLI:NL:RBAMS:2019:8329 203–4
Belgium
Willem Debeuckelaere v Facebook Ireland Ltd., Facebook Inc. and Facebook Belgium Bvba., Dutch-language Brussels Court of First Instance, judgment of 16 February 2018 223n.53
The United Kingdom
Dawson-Damer & Ors v Taylor Wessing LLP [2017] EWCA Civ 74 114
Dawson-Damer & Ors v Taylor Wessing LLP [2019] EWHC 1258 (Ch) 114, 115
Open Rights Group & Anor, R (On the Application of) v Secretary of State for the Home Department & Anor [2019] EWHC 2562 (Admin) 119n.76
Germany
Bundesverfassungsgericht, judgment of 15 December 1983, BVerfGE 65, 1 25–26, 50–51
Cologne Higher State Court, judgment of 14 November 2019, ECLI:DE:OLGK:2019:1114.15U126.19.00
Table of Legislation
LAWS
European instruments
Charter of Fundamental Rights of the European Union [2010] OJ C83/389 16–17, 18–19, 21, 22, 23–27, 28, 50, 51–52, 53, 96, 113, 116–17, 136–37, 140, 170–71
Commission Regulation (EU) No 330/2010 of 20 April 2010 on the application of Article 101(3) of the Treaty on the Functioning of the European Union to categories of vertical agreements and concerted practices [2010] OJ L102 42–43n.159
Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention 108) (open for signature on 28 January 1981, entered into force on 1 October 1985) 21
Convention for the Protection of Human Rights and Fundamental Freedoms 11n.67
Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts [1993] OJ L95 232
Council Regulation No 139/2004 of 20 January 2004 on the control of concentrations between undertakings [2004] OJ L24 42–43n.159
Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L281/31 9–10, 61, 64, 68, 69, 71n.39, 73, 76, 80, 83–84, 84n.111, 95–96, 100, 105, 107, 113, 123–24, 132–34, 136, 148, 149, 205–6, 208, 213–14
Directive 96/9/EC of the European Parliament and of the Council of 11 March 1996 on the legal protection of databases [1996] OJ L77/20 55n.52
Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector [2002] OJ L201 38–40, 96–98, 100–1, 204
Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council [2005] OJ L149/22 45, 232–33
Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on Consumer Rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and Repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council [2011] OJ L304/64 45, 46, 233
Directive 2013/37/EU of the European Parliament and of the Council of 26 June 2013 amending Directive 2003/98/EC on the re-use of public sector information (PSI Directive) [2013] OJ L175 164
Directive 2015/2366/EU of the European Parliament and of the Council of 25 November 2015 on payment services in the internal market, amending Directives 2002/65/EC, 2009/110/EC and 2013/36/EU and Regulation (EU) No 1093/2010, and repealing Directive 2007/64/EC [2015] OJ L337/35 41–42
Directive 2016/1148 of the European Parliament and of the Council of 6 July 2016 concerning measures for a high common level of security of network and information systems across the Union [2016] OJ L194/1 178
Directive (EU) 2019/770 of the European Parliament and of the Council of 20 May 2019 on certain aspects concerning contracts for the supply of digital content and digital services [2019] OJ L136/1 46, 233
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1 9–10, 29–38, 42, 48, 59–62, 64, 66–96, 105, 106–8, 109, 111–12, 113–15, 121–22, 130, 133, 141–50, 159–60, 163–71, 174, 176, 179–80, 197–212, 213–14, 220, 225, 230–35
Regulation (EU) 2018/1807 of the European Parliament and of the Council of 14 November 2018 on a framework for the free flow of non-personal data in the European Union [2018] OJ L303 177–78
Treaty of Lisbon amending the Treaty on European Union and the Treaty establishing the European Community [2007] OJ C306/1 22
Treaty on European Union [1992] OJ C191 29n.83
Germany
Federal Data Protection Act of 30 June 2017 (Federal Law Gazette I, 2097), as last amended by Article 12 of the Act of 20 November 2019 (Federal Law Gazette I, 1626) 120n.80
Slovenia
Criminal Code, Official gazette of the Republic of Slovenia No 91/20 132n.19
Data Protection Act, Official gazette of the Republic of Slovenia No 86/04 with further amendments 132, 136n.43
United Kingdom
Data Protection Act 2018 115, 119
United States
California Assembly Bill No 375 187–88
International instruments
Vienna Declaration and Programme of Action Adopted by the World Conference on Human Rights in Vienna on 25 June 1993 17–18n.11
Council of Europe, Protocol 1 to the Convention for the Protection of Human Rights and Fundamental Freedoms 54
List of Abbreviations
AI artificial intelligence
API application programming interface
B2C business to consumer
CA Cambridge Analytica
CJEU Court of Justice of the European Union
CMA Competition and Markets Authority (UK)
CNIL Commission nationale de l’informatique et des libertés
DCD Directive on Digital Content
DPA Data Protection Authority
DPaaS data portability as a service
DPD Data Protection Directive
DPO data protection officer
EBF European Banking Federation
EC European Commission
ECHR European Convention on Human Rights
ECtHR European Court of Human Rights
EDPB European Data Protection Board
EDPS European Data Protection Supervisor
EP European Parliament
EU European Union
FRA Agency for Fundamental Rights (EU)
FTC Federal Trade Commission
GDPR General Data Protection Regulation
ICO Information Commissioner’s Office
ICT information and communication technology
IoT Internet of Things
IP intellectual property
ISO International Organization for Standardization
IT information technology
LIBE Committee on Civil Liberties, Justice and Home Affairs
ML machine learning
MP Member of Parliament
NCC Norwegian Consumer Council
NHS National Health Service
NIS network and information security
NSA National Security Agency (US)
OECD Organisation for Economic Cooperation and Development
PbD privacy by design
PIA privacy impact assessment
RFID radio-frequency ID
RTBF right to be forgotten
RWD real-world data
SME small and medium-sized enterprise
TFEU Treaty on the Functioning of the European Union
UK United Kingdom
URL uniform resource locator
US United States of America
1 Introduction
1.1 The lack of individual control in the data-driven economy
Since the late 20th century, the tremendous growth in the amount of information and the means by which it can be disseminated has been driving a transition from industry-based to information-based economies.1 The transformation of data into a highly valuable asset2 has had profound consequences.3 This transformation has affected how businesses value the data they hold and whom they allow to access it.4 It has enabled—and even forced—companies to change their business models.5 It has altered how organisations think about data and how they use it.6 All of the largest internet companies—Google, Facebook, Amazon, eBay, Microsoft—treat data as a major asset and source of value creation. In addition to tech giants, the big-data revolution also offers room for the expansion of start-ups, small and medium-sized enterprises (SMEs), and large, traditional corporations, especially those that deploy highly specialised analytic software able to scrutinise data in real time.7 Big-data sharing, selling, and licensing are often considered the great business opportunity of this era.8
The clearest evidence of this data explosion can be seen in daily life. Instant messaging on mobile phones, easy access to documents through cloud services, and personalised advertisements are all developments built on widespread data availability and reusability.
1 Mark J Davison, The Legal Protection of Databases (Cambridge University Press 2008). For instance, the volume of genetic sequencing data stored at the European Bioinformatics Institute has doubled almost every year. However, these hundreds of petabytes of data represent just 10% of the tremendous amount of data stored at CERN, the Swiss particle-physics laboratory. Vivien Marx, ‘The Big Challenges of Big Data’ (2013) 498 Nature 255, 255.
2 Davison (n 1) 52.
3 See e.g. OECD, ‘Data-driven Innovation: Big Data for Growth and Well-being’ (2015).
4 Viktor Mayer-Schönberger and Kenneth Cukier, Big Data: A Revolution That Will Transform How We Live, Work, and Think (Mariner Books 2014) 99.
5 ibid.
6 ibid.
7 OECD, ‘Exploring Data-driven Innovation as a New Source of Growth: Mapping the Policy Issues Raised by “Big Data” ’ (2013) OECD Digital Economy Papers No 222 www.oecd-ilibrary.org/exploring-data-driven-innovation-as-a-new-source-of-growth_5k47zw3fcp43.pdf?itemId=%2Fcontent%2Fpaper%2F5k47zw3fcp43-en&mimeType=pdf accessed 24 August 2020.
8 OECD, ‘Data-driven Innovation: Big Data for Growth and Well-being’ (n 3) 76.
In the literature, these advances have sometimes been described as the big-data revolution. The fundamental change is reflected in two recently coined terms: data-driven9 and big data.10 These terms convey two common trends. The first trend is the existence of an extraordinarily large amount of available data. This data is too big (volume), arrives too rapidly (velocity), changes too quickly (variability), or is too diverse (variety) to be processed within a local computing structure using traditional approaches and techniques.11 Later iterations of the definition have expanded to include new characteristics such as veracity and value. ‘Value’ in particular has grown in importance as a big-data factor. Certainly, today’s discussion on big data is most often economically oriented. The burning question concerns how big data helps companies outperform competitors and how it creates value by unleashing the potential of hidden knowledge.12 This leads to the second trend: data analytics. While, traditionally, analytics has been used to find answers to predetermined questions (the search for the causes of certain behaviour, i.e. looking for the ‘why’), analytics of big data leads to finding connections and relationships between data that are unexpected and previously unknown.13 Through the use of modern analytics tools, big data makes it possible to see patterns where none actually exist because large quantities of data can offer connections that radiate in all directions.14,15 By employing sophisticated analytic techniques, data’s value shifts from its primary use to its potential future uses or, as this book calls them at times, reuses.
Data-driven companies have exhibited particular interest in personal data. While this type of data is relatively easy to monetise, e.g. through behavioural advertising, it is also strictly regulated and protected at the human rights level.16 This has been notably demonstrated in the EU, where control over personal data is considered a fundamental right.17
9 ibid.
10 Authors often write about big data in the context of a big-data revolution. Mayer-Schönberger and Cukier (n 4). See also Omer Tene and Jules Polonetsky, ‘Big Data for All: Privacy and User Control in the Age of Analytics’ (2013) 11 Northwestern Journal of Technology and Intellectual Property 240.
11 ISO/IEC JTC 1, ‘Big Data Preliminary Report 2014’ (ISO 2015) 5.
12 James Manyika and others, Big Data: The Next Frontier for Innovation, Competition, and Productivity (McKinsey Global Institute 2011).
13 Lokke Moerel, Big Data Protection: How to Make the Draft EU Regulation on Data Protection Future Proof (Tilburg University 2014) 8.
14 danah boyd and Kate Crawford, ‘Six Provocations for Big Data’ (A Decade in Internet Time: Symposium on the Dynamics of the Internet and Society, Oxford, September 2011).
15 While this book refers to big data in the sense of the ‘4-Vs definition’ as explained above, in the literature big data may also have other meanings; for example, it can be used as a synonym for data analytics. ibid.
16 See section 2.2 below.
17 European Commission, ‘How Does the Data Protection Reform Strengthen Citizens’ Rights?’ (2016) ec.europa.eu/newsroom/just/document.cfm?doc_id=41525 accessed 25 August 2020.
1.2 The individual in the data-driven (big-data) economy
In estimating the impacts of the data economy on individuals, authors in the economic field seem to contradict each other. Some believe that data can play a significant economic role to the benefit of both private commerce and national economies and see a great benefit of the data-driven economy for the well-being of citizens.18 In contrast, others warn that the big-data economy in fact decreases consumer surplus.19 Their argument is in line with those who question the benefits of the data economy on the ground that it undermines fundamental rights and freedoms20 (consistent with concerns expressed elsewhere in this book).
To depict the risks to an individual in a data-driven economy, the analysis below focuses on a limited number of core values (privacy, transparency, autonomy, equality, and power symmetry) that can be compromised as a result of data-driven business practices.21 As is demonstrated in this book, when these five values cannot be adequately pursued, individuals may be unable to effectively control personal data; hence, legal mechanisms may be necessary to mitigate the risk.
1.2.1 Compromised privacy
Privacy is a concept that allows for multiple definitions.22 Chapter 2 of this book provides a detailed analysis of the term and traces attempts to capture its meaning. For now, it suffices to understand privacy in its ordinary sense: as an attribute of things that affect or belong to private individuals, that are generally distinct from the public, and that are kept confidential and secret (e.g. not disclosed to others and kept from public knowledge or observation).23
The data-driven economy often gives the impression that privacy has been eliminated or is even dead.24 Mark Zuckerberg, Facebook’s CEO, argued that privacy has fundamentally evolved in recent years and can no longer be seen as a social
18 James Manyika and others (n 12) 1–2. See also OECD, ‘Data-driven Innovation: Big Data for Growth and Well-being’ (n 3). The benefits listed are convenience, self-expression, reduced cost, new knowledge, and security.
19 Anna Bernasek and DT Mongan, All You Can Pay (Nation Books 2015).
20 See e.g. Tal Z Zarsky, ‘ “Mine Your Own Business!”: Making the Case for the Implications Of the Data Mining of Personal Information in the Forum of Public Opinion’ (2002) 5 Yale Journal of Law and Technology 1.
21 This section takes inspiration from Richards and King’s framework of three paradoxes of big data: the transparency, the identity, and the power paradox. Neil M Richards and Jonathan J King, ‘Three Paradoxes of Big Data’ (2013) 66 Stanford Law Review Online 41.
22 Daniel J Solove, ‘Conceptualizing Privacy’ (2002) 90 California Law Review 1087, 1088.
23 The Concise Oxford Dictionary (8th edn, Oxford University Press 1990); Black’s Law Dictionary (2nd edn, The Lawbook Exchange 1910).
24 Neil M Richards and Jonathan J King, ‘Big Data Ethics’ (2014) 49 Wake Forest Law Review 393, 409.
norm.25 While it is true that privacy as a social norm has been transformed, it has not lost any of its strength. On the contrary, considering the many new types of privacy violations, some of which are mentioned below, privacy has never been more relevant. Zuckerberg himself is proof. In a photo shared via Twitter in the summer of 2016, one can see his computer, on which the camera and microphone jack are covered with tape, and his email client is Thunderbird (popular among the tech-savvy).26 Zuckerberg’s example may sound anecdotal, but it is an indicator of a wider trend, suggesting that people increasingly care about keeping their work and conversations private.
In the data-driven economy, dataveillance is what most apparently puts privacy at risk. Dataveillance is the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons.27 In the data economy, in which individuals’ behaviour and all their actions are increasingly datafied, dataveillance is easy and cheap to conduct. As a result, more individuals and larger populations can be monitored.28 Dataveillance can be particularly dangerous because it enables inferences about facts that someone would rather keep secret. For example, a person may share information about her hobbies or favourite books but not information about her sexual orientation. However, by using big-data techniques, this information can be predicted anyway. Kosinski, Stillwell, and Graepel have shown how a range of highly sensitive personal characteristics—including sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, and parental separation—can be predicted with high accuracy on the basis of Facebook likes.29
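The mechanics of such an inference can be sketched in a few lines of code. The following is only an illustration with invented data, not the pipeline used by Kosinski and colleagues (who worked with millions of like-records and applied dimensionality reduction first): a simple classifier is trained on users whose sensitive attribute is known and is then applied to a user who disclosed nothing but likes.

```python
# Illustrative sketch only: all data is invented.
# A classifier trained on users' 'likes' can predict a sensitive
# attribute that a new user never disclosed.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows = training users; columns = whether each of five pages was liked.
likes = np.array([
    [1, 0, 1, 0, 1],
    [0, 1, 0, 1, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1],
    [1, 0, 1, 1, 0],
    [0, 1, 0, 0, 1],
])
# The sensitive attribute, known for the training users only
# (e.g. self-reported in a survey).
attribute = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(likes, attribute)

# A new user shared only likes, never the attribute itself.
new_user = np.array([[1, 0, 1, 0, 0]])
print(model.predict(new_user))        # inferred attribute, e.g. [1]
print(model.predict_proba(new_user))  # the model's confidence in it
```

The point is not the particular model: any statistical learner fed enough behavioural traces can close the gap between what a person chose to reveal and what can be known about her.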
In the big-data economy, even anonymised data cannot guarantee privacy. In fact, anonymised data can be as useful as personal data in many cases.30 A typical example may involve a company that wants to personalise its marketing campaigns with the help of profiling. The use of personal data may be helpful to assess which
25 Bobbie Johnson, ‘Privacy No Longer a Social Norm, Says Facebook Founder’ The Guardian (Las Vegas, 11 January 2010) www.theguardian.com/technology/2010/jan/11/facebook-privacy accessed 25 August 2020.
26 Katie Rogers, ‘Mark Zuckerberg Covers His Laptop Camera. You Should Consider It, Too’ The New York Times (22 June 2016) www.nytimes.com/2016/06/23/technology/personaltech/mark-zuckerberg-covers-his-laptop-camera-you-should-consider-it-too.html accessed 25 August 2020.
27 Roger Clarke, ‘Introduction to Dataveillance and Information Privacy, and Definitions of Terms’ (Roger Clarke’s Website, 24 July 2016) www.rogerclarke.com/DV/Intro.html#Priv accessed 25 August 2020.
28 ibid.
29 Michal Kosinski, David Stillwell, and Thore Graepel, ‘Private Traits and Attributes Are Predictable from Digital Records of Human Behavior’ (2013) 110 Proceedings of the National Academy of Sciences of the United States of America 5802.
30 Daniel Bachlechner and others, ‘WP1 Mapping the Scene: D1.2 Report on the Analysis of Framework Conditions’ (Deliverable for the EuDEco H2020 Project, 2015) 30 www.universiteitleiden.nl/binaries/content/assets/rechtsgeleerdheid/instituut-voor-metajuridica/d1.2_analysisofframeworkconditions-v1_2015-08-31-1.pdf accessed 25 August 2020.
people are potentially interested in specific products or services, but aggregated data at the street or neighbourhood level may be similarly useful and cheaper to process. Inferring information from group profiles supports predictions about someone’s personal circumstances. As soon as ‘[t]hree or four data points of a specific person match inferred data (a profile), which need not be personal data . . .’,31 it is possible accurately to predict characteristics of individual users.32
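A minimal sketch can make this concrete; the neighbourhood profiles and figures below are invented. Note that the function consults only aggregated, non-personal data, yet it yields a prediction about one individual.

```python
# Illustrative sketch with invented numbers: reading a group profile
# back onto an individual who matches it on a few data points.

# Aggregated neighbourhood-level profiles, e.g. bought from a data broker;
# none of this is personal data in itself.
group_profiles = {
    "district_a": {"share_interested_in_product": 0.72},
    "district_b": {"share_interested_in_product": 0.18},
}

def predict_interest(observed: dict) -> float:
    """Match a person's few observed data points to a group profile and
    return the inferred likelihood that she is interested in the product."""
    profile = group_profiles[observed["neighbourhood"]]
    return profile["share_interested_in_product"]

# Three or four innocuous data points are enough to place someone in a group.
customer = {"neighbourhood": "district_a", "age_band": "30-39", "device": "mobile"}
print(predict_interest(customer))  # 0.72 -> she is targeted by the campaign
```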
The flow of data among the actors in the data-driven economy escalates the risk of privacy intrusions. This is why Nissenbaum believes that meeting individual expectations about the flow of personal information sits at the core of privacy.33 Personal data is often acquired from a number of data sources, including data brokers, by means of data combination. For example, Facebook’s own databases were being merged with detailed dossiers obtained from commercial data brokers about users’ offline lives.34 In this way, Facebook improves its own data with categories that users did not share or want to reveal on Facebook. If information is used in contexts that are at odds with individuals’ expectations, this can lead to feelings of discomfort.35
1.2.2 Lack of transparency
Transparency describes something that is easy to perceive or detect and is open to scrutiny. In contrast, non-transparency can be illustrated with the metaphor of a black box: a complex system or device whose internal workings are hidden or not readily understood.36 In the context of data-driven decision-making, the black box metaphor stands for outcomes that emerge without satisfactory explanation. Transparency is the second value at risk in the era of the data-driven economy. Although big data promises to make the world more transparent, its collection is often invisible and its tools and techniques opaque, curtained off by layers of physical, legal, and technical protection.37
Non-transparent processing of data occurs in all the stages of the so-called data-driven value chain: when data is acquired, when it is analysed, and when it is used. The ubiquitous and automated collection of data in the data-driven economy is
31 Mireille Hildebrandt, ‘Slaves to Big Data. Or Are We?’ (2013) 17 IDP Revista de Internet, Derecho y Política 27, 33.
32 Ariel Porat and Lior J Strahilevitz, ‘Personalizing Default Rules and Disclosure with Big Data’ (2014) 112 Michigan Law Review 1417, 1440.
33 Helen Nissenbaum, Privacy in Context (Stanford University Press 2010).
34 Julia Angwin, Terry Parris Jr, and Surya Mattu, ‘Facebook Is Quietly Buying Information from Data Brokers about Its Users’ Offline Lives’ Business Insider (30 December 2016) www.businessinsider.com/facebook-data-brokers-2016-12?r=UK&IR=T accessed 25 August 2020.
35 Nissenbaum (n 33) 21.
36 See ‘black box, n’ (Lexico) https://en.oxforddictionaries.com/definition/black_box accessed 25 August 2020.
37 Richards and King (n 21) 42.
opaque. Although law typically requires data collectors to explain the ways and circumstances in which personal data is collected, used, and shared, it has been shown that individuals have difficulty understanding what has been presented to them and what they have consented to.38 In the analysis stage, the key problem is that little is known about the processes used to derive all sorts of findings and that hidden algorithms are shrouded in secrecy and complexity. Hardly anyone can fully capture how algorithms work and monitor their actions. For example, online data marketplaces may be used as a source of numerous data points for an algorithm to determine a customer’s creditworthiness.39 Because of a bad credit score calculated on the basis of aggregated information, a consumer will be charged more, but she will never understand how exactly this amount was calculated or know what information the marketplaces provided.40 Sometimes, not even engineers working with the algorithms are fully able to capture their nature and monitor their actions.41
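The asymmetry the text describes can be reduced to a toy example (all weights and inputs below are invented): the controller combines purchased data points using a weighting it never discloses, and the only output the consumer ever sees is the final score.

```python
# Illustrative sketch (invented weights and inputs): an opaque score
# aggregates many data points, but the data subject sees only the result.

HIDDEN_WEIGHTS = {  # known to the controller, invisible to the consumer
    "postcode_risk": -120.0,
    "shopping_pattern": -45.0,
    "device_type": -15.0,
    "social_graph_score": -60.0,
}
BASELINE = 800.0

def credit_score(data_points: dict) -> float:
    """Combine aggregated data points into a single creditworthiness score."""
    penalty = sum(HIDDEN_WEIGHTS[key] * value
                  for key, value in data_points.items())
    return BASELINE + penalty

# Inputs assembled from online data marketplaces; never shown to the consumer.
inputs = {"postcode_risk": 0.8, "shopping_pattern": 0.5,
          "device_type": 1.0, "social_graph_score": 0.3}

print(credit_score(inputs))  # 648.5 -- the only number the consumer sees
```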
The black box problem is duplicated in the cloud computing environment, mainly due to indefinite and non-transparent storage. In most cases, individuals are unaware of what occurs in a cloud and how data can be put to (secondary) use. Data can be shared with third parties, sold to advertisers, or handed over to the government. This loss of transparency on the internet can result in a feeling of powerlessness.42
1.2.3 Limited choice and autonomy
Faden and Beauchamp define autonomy in practical terms as ‘the personal rule of the self by adequate understanding, while remaining free from controlling interferences by others and from personal limitations that prevent choice’.43 Three dimensions of autonomy stem from this definition: freedom from interference by
38 Bart Custers, Simone van der Hof, and Bart Schermer, ‘Privacy Expectations of Social Media Users: The Role of Informed Consent in Privacy Policies’ (2014) 6 Policy and Internet 268, 278.
39 See e.g. Mikella Hurley and Julius Adebayo, ‘Credit Scoring in the Era of Big Data’ (2016) 18 Yale Journal of Law and Technology 148.
40 ibid.
41 ‘[e]ven those on the inside can’t control the effects of their algorithms. As a software engineer at Google, I spent years looking at the problem from within . . .’ David Auerbach, ‘The Code We Can’t Control’ Slate (14 January 2015) www.slate.com/articles/technology/bitwise/2015/01/black_box_society_by_frank_pasquale_a_chilling_vision_of_how_big_data_has.html accessed 25 August 2020.
42 Bruce Schneier, Data and Goliath (WW Norton & Company 2015) 115. As Schneier puts it: ‘[t]rust is our only option. There are no consistent or predictable rules. We have no control over the actions of these companies. I can’t negotiate the rules regarding when Yahoo will access my photos on Flickr. I can’t demand greater security for my presentations on Prezi or my task list on Trello. I don’t even know the cloud providers to whom those companies have outsourced their infrastructures . . . And if I decide to abandon those services, chances are I can’t easily take my data with me’.
43 Bart Schermer, Bart Custers, and Simone van der Hof, ‘The Crisis of Consent: How Stronger Legal Protection May Lead to Weaker Consent in Data Protection’ (2014) 16 Ethics & Information Technology 171, 174.
others, free choice, and self-governance. The examples below show how big data undermines each of them.
Autonomy, particularly the free-choice aspect, can be restricted as a result of limited confidentiality and privacy of personal data traces on the internet. Knowing about the mass surveillance powers of the US National Security Agency (NSA) might deter us from using a US online service.44 The abstention from an action or behaviour due to the feeling of being observed is described as a chilling effect.45 However, in some cases, the feeling of being watched creates a nudge for individuals to act. For example, research has shown that people pay more for coffee on the honour system46 if eyes are depicted over the collection box.47
Another example of compromised autonomy is linked to non-transparent data processing and decision-making. In 2009, Eli Pariser noted that the news he received and search results that appeared on Google differed substantially from those viewed by his colleagues.48 He soon realised that the reason was his personalised news website. Based on his user profile and corresponding group profiles, the website was able to learn about his inferred political interests, which in turn meant that it could give more prominence to his favourite political group’s media items. He described the situation as a filter bubble: ‘a synonym for a unique universe of information for each of us’.49 The filter bubble poses a risk of seriously limiting someone’s free choice. For example, when users of such personalised services form their political ideas, they may encounter fewer opinions or political arguments.50
1.2.4 Discrimination
The key objective of (personal) data-driven decision-making is to differentiate between individuals. Clearly, such decisions can have important consequences for individuals and can work to both their advantage and disadvantage. Certain
44 Glenn Greenwald, No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State (Metropolitan Books 2014).
45 Jonathon W Penney, ‘Chilling Effects: Online Surveillance and Wikipedia Use’ (2016) 31 Berkeley Technology Law Journal 117.
46 A system of payment or examinations which relies solely on the honesty of those concerned.
47 Ryan M Calo, ‘The Boundaries of Privacy Harm’ (2011) 86 Indiana Law Journal 1131, 1147.
48 Eli Pariser, ‘Beware Online Filter Bubbles’ (March 2011) www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?language=en accessed 25 August 2020.
49 ibid.
50 Filter bubbles could even interfere with collective goods such as democracy. Harvard law professor Jonathan Zittrain explained in 2010 how ‘Facebook could decide an election without anyone ever finding out’, after the tech giant secretly conducted a test in which it was allegedly able to increase voter turnout by 340,000 votes around the country on election day simply by showing users a photo of someone they knew saying ‘I voted’. Trevor Timm, ‘You May Hate Donald Trump. But Do You Want Facebook to Rig the Election against Him?’ The Guardian (19 April 2016) www.theguardian.com/commentisfree/2016/apr/19/donald-trump-facebook-election-manipulate-behavior accessed 25 August 2020. See also Robert M Bond and others, ‘A 61-Million-Person Experiment in Social Influence and Political Mobilization’ (2012) 489 Nature 295.
practices are legally allowed, although it could be argued that they are ethically disputable. For example, some online platforms are able to use the information collected about consumers to their detriment: by setting the price as close as possible to the maximum price that someone is willing to pay, they are able to exploit consumers’ price sensitivity.51 This is an example of price discrimination, which may become increasingly aggressive given the level of dataveillance on the internet.52
In addition, data-driven decisions can also lead to discriminatory practices that cross the boundaries of what is legally acceptable. Discrimination that occurs when people are treated differently on the basis of protected grounds is, in principle, prohibited regardless of whether it occurs in a direct or indirect way.53 An employer may refuse a candidate because an internet (social media) search reveals how old she is. She may be in her 60s and therefore too close to retirement, or she may be in her 30s and therefore too likely to become pregnant. This would constitute illegal discrimination on the grounds of age or sex.54 Data-driven decision-making may also lead to hidden discrimination. Group profiles inferred from big data are often used as a tool to make decisions about the members of the group, but not every group characteristic can justify different treatment. Characteristics such as a postcode can be legitimate factors in differentiation, but they may mask ethnicity or religion—both of which are protected grounds.55
1.2.5 Power and control asymmetries
In the data-driven economy, power is linked to two dimensions: (1) access to data and control over it; and (2) the ability to carry out sophisticated data processing.56
51 Bernasek and Mongan (n 19).
52 Price discrimination and price differentiation are synonyms in economic jargon.
53 Francesca Bosco and others, ‘Profiling Technologies and Fundamental Rights and Values: Regulatory Challenges and Perspectives from European Data Protection Authorities’ in Serge Gutwirth, Ronald Leenes, and Paul De Hert (eds), Reforming European Data Protection Law (Springer 2015) 3, 19.
54 Orla Lynskey, The Foundations of EU Data Protection Law (Oxford University Press 2015) 199.
55 Bart Custers, The Power of Knowledge: Ethical, Legal, and Technological Aspects of Data Mining and Group Profiling in Epidemiology (Wolf Legal Publishers 2004) 114.
56 Mark Andrejevic and Kelly Gates, ‘Big Data Surveillance: Introduction’ (2014) 12 Surveillance & Society 185, 190. Although the power asymmetry is most apparent in the relationship between data-driven businesses and individuals, it can also be observed in relationships between other actors in the economy. Small businesses often become dependent on, and powerless in relation to, big-data holders. For example, small businesses have limited access to many valuable databases. Bart Custers and Daniel Bachlechner, ‘Advancing the EU Data Economy: Conditions for Realizing the Full Potential of Data Reuse’ (2017) 22 Information Polity 291, 293 https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3091038 accessed 11 September 2020. However, power asymmetry affects authorities too, as these authorities may struggle to understand the data-driven economy and its consequences. ‘[T]o understand what is going on we have to go for geeks,’ stated the director of the European Consumer Organisation to express her frustration with the data economy black box. Monique Goyens, ‘Welcome speech’ (EDPS-BEUC conference, Brussels, 29 September 2016) https://edps.europa.eu/data-protection/our-work/publications/events/edps-beuc-conference-big-data-individual-rights-and_de accessed 25 August 2020.
Individuals are at a disadvantage regarding both dimensions: they have trouble controlling their personal data and are unable to understand how it can be processed, let alone actually process it.
To a large extent, the asymmetry between data controllers and individuals stems from the architecture of the data-collecting platforms. Because of these platforms’ design, it is easy for them to take full ownership of users’ input, e.g. photos, comments, and texts. In such circumstances, users’ control over data fades. Until 2016, users of the dating app Tinder were asked to give away control of their pictures, videos, and chat logs forever.57 Although individuals certainly benefit from the digital economy, e.g. by being able to use the Amazon online shopping tool, they pay (often unknowingly) for these services with their non-monetary assets, and their input is not always fairly evaluated.58 In addition, the architecture of the platforms undermines transparency. As explained above, algorithms that drive the functioning of the platforms are shrouded in secrecy and complexity, and few people can fully capture how they work.
The asymmetry becomes even more apparent when personal data is processed as part of decision-making. Data controllers can leverage the collected personal data when they make commercial decisions, whereas individuals have little overview of the process. For instance, based on a personal data analysis, employers can determine employees’ performance scores. Consequently, individuals may face a lower salary or risk being fired. Because such decisions are typically made based on a multi-factor and multi-level analysis of workers’ data, an individual may have trouble identifying what exactly is included in the performance review that led to such a ‘verdict’.59
1.3 The need for enhanced data subject control and rights— regulatory response and the motivation for this book
In the early 2010s, as a response to the potentially hazardous effects of the rapidly developing information revolution, the European Commission (EC) started contemplating changes to Directive 95/46/EC of the European Parliament and of
57 In March 2016, the Norwegian Consumer Council filed a complaint regarding unfair contractual terms in the terms of use for the mobile application Tinder. As a result, Tinder later amended the disputed parts of the terms. David Meyer, ‘Tinder Is in Trouble over Its “Unfair” User Terms’ Fortune (3 March 2016) http://fortune.com/2016/03/03/tinder-norway-trouble/ accessed 25 August 2020. For a more detailed analysis of Tinder and other apps’ terms by the Norwegian Consumer Council see Forbrukerrådet, ‘Appfail? Threats to Consumers in Mobile Apps’ (2016) https://docplayer.net/38357620-Appfail-threats-to-consumers-in-mobile-apps.html accessed 25 August 2020.
58 Aleks Jakulin, ‘Why Let Google Show Excerpts of Your Content in Their Search Results without Partaking in the Lucrative Search Advertising Revenue?’ Twitter (29 April 2016) https://twitter.com/aleksj/status/725998664687206400 accessed 25 August 2020.
59 Bart Custers and Helena Ursic, ‘Worker Privacy in a Digitalized World under European Law’ (2018) 39 Comparative Labor Law & Policy Journal 323, 340.
the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (the Data Protection Directive (DPD)), the EU’s first framework on data protection, adopted in 1995.60 In 2012, the EC published the proposal for a regulation on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (the General Data Protection Regulation (GDPR)).61 The objective of the new law was to strengthen data protection and adapt it to the changed circumstances of the globalised and interconnected world. The vision that data protection is a fundamental right was one of the underpinning philosophies that drove the legislative process.62 The Regulation was adopted and published in the Official Journal of the EU in May 2016. It started to apply two years later, on 25 May 2018.63
Like many other data protection legal instruments, including the DPD, the GDPR contains a section on the rights that the law grants to data subjects (i.e. persons whose data is (re)used). These rights—including the right to be informed, the right to erasure, the right to object, and the right to access—are significant legal developments that were introduced in the 1970s when the first comprehensive data protection guidelines were adopted.64 They are underpinned by an important vision, namely that individuals’ control over their personal data is a constitutive part of the right to data protection, which must be sustained in spite of the new challenging circumstances. In fact, the idea of strong individual control over personal data has been highlighted as one of the key improvements of the GDPR. The European Commission’s information factsheet stated: ‘In this fast-changing environment, individuals must retain effective control over their personal data. This is a fundamental right for everyone in the EU and must be safeguarded.’65
For data subjects who feel overwhelmed by information overload, the GDPR’s strengthened and extended legal framework meant the promise of more
60 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L281/31.
61 European Commission, ‘Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)’ COM(2012) 11 final.
62 As acknowledged in the proposal (recital 1) and as previously set out in the EU Charter.
63 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1.
64 A 1974 report by the OECD was one of the first prominent data-protection documents that proposed a policy shift from the limited access approach to the control approach. See OECD, ‘Policy Issues in Data Protection and Privacy: Concepts and Perspectives’ (Proceedings of the OECD seminar, June 1974). At the same time, the US developed the so-called ‘fair information practice principles’, a blend of substantive (e.g. data quality, use limitation) and procedural (e.g. consent, access) principles that set standards to facilitate both individual privacy and the promise of information flows in an increasingly technology-dependent, global society. Fred H Cate, ‘The Failure of Fair Information Practice Principles’ in Jane K Winn (ed), Consumer Protection in the Age of the Information Economy (Routledge 2006) 343.
65 European Commission, ‘How Does the Data Protection Reform Strengthen Citizens’ Rights?’ (n 17).
individual control over data and a mitigation strategy against the risks of the big-data economy. In combination with severe financial penalties, the revamped right to be forgotten, the right to information and access, and the right to data portability have the potential to become a vehicle of data protection law enforcement. In fact, we have already seen some of this happening in the recent actions of privacy advocates such as Max Schrems (and the non-governmental organisation he founded: noyb.eu).66
However, there are still many uncertainties related to data subject rights as a legal response to the risks of the data-driven economy. The guidance by the Court of Justice of the EU and other EU governmental authorities is still developing (although, as is demonstrated in the following chapters, the EU Court and Member States’ courts have been actively adjudicating on these individual rights ever since the GDPR was adopted). Moreover, the applicability of data subject rights is strongly influenced by the changing economic and social (data-driven) context.
This book contributes to the existing guidance on data subject rights by providing a thorough analysis of those rights under the new GDPR framework in light of the developing case law. Its primary aim is to equip lawyers, activists, privacy professionals, law students, and any other individuals interested in the rapidly evolving area of data protection with a practical and thorough analysis of data subject rights. Secondly, the book provides an understanding of key data subject rights through the prism of the current data-driven economic setting, as it strives to point out some important limitations and inefficiencies (while not turning a blind eye to examples of efficient uses).
1.4 A cautionary remark regarding scope
This book is primarily concerned with the implications of the data-driven economy for data subject rights; it does not examine in depth other dilemmas of data protection law or other areas of law related to the data economy.
The scope of the legal analysis is limited to EU law. It does not extend to legislation outside the EEA, nor does it consider national specifics in the EU Member States, although at times it reflects on some of them to better illustrate a European provision. The European Convention on Human Rights (ECHR)67 and the related jurisprudence are understood as an integral part of EU law.
Choosing EU law as the basis of this study appears appropriate for two major reasons. First, although the EU is a union of sovereign states with their own
66 ‘EU-US Privacy Shield for Data Struck Down by Court’ BBC (16 July 2020) www.bbc.com/news/ technology-53418898 accessed 25 August 2020.
67 Convention for the Protection of Human Rights and Fundamental Freedoms (European Convention on Human Rights, as amended).
national laws, the rules at the EU level are common to all Member States and act as a reflection of the EU consensus on adequate legal standards. This approach has been confirmed by the doctrine of direct effect of EU legislation.68 Through this doctrine, the Court of Justice of the EU (CJEU) has established that EU regulations such as the GDPR are directly applicable and, subject to limited exceptions, should be interpreted coherently throughout the Union.69 Secondly, while it is clear that the European market is to some extent legally, economically, and culturally fragmented, the general perception and political tendency is to see it as a single market. In recent years, the idea of a single digital market has been placed high on the EU political agenda.70 It is believed that more harmonised legal provisions in the digital realm would reduce administrative costs for European businesses and grant more protection to citizens.
This book focuses on the application of data subject rights in the private, commercial sector. It does not specifically deal with their application in the context of law enforcement or of any other public service. It is, however, acknowledged that commercial data processing is often strongly intertwined with state intervention in data markets. The state plays several roles in this market: it sets the rules, monitors actors’ behaviour, and uses data for its own purposes, such as national security. For example, the state might collaborate with commercial actors, urging them to share the data that is generated or collected in the course of their commercial activities.71 This phenomenon, which Schneier calls a ‘public-private partnership on data surveillance’, raises some important concerns, most apparently in relation to citizens’ privacy.72
Not all control (data subject) rights listed in the GDPR are subject to analysis in this book: only six are analysed and commented on in detail. Specifically, the right to rectification in Article 16 of the GDPR and the right to restriction of processing in Article 18 of the GDPR are excluded from the scope. However, this is not to say that these rights are irrelevant to the big-data discussion. The reason for the exclusion is that they share similarities with the right to erasure (Article 17) and the
68 Damian Chalmers, Gareth Davies, and Giorgio Monti, European Union Law: Cases and Materials (Cambridge University Press 2010) 285.
69 Case C-6/64 Flaminio Costa v ENEL [1964] ECLI:EU:C:1964:66.
70 European Commission, ‘Digital Single Market: Bringing Down Barriers to Unlock Online Opportunities’ https://ec.europa.eu/commission/priorities/digital-single-market_en accessed 25 August 2018.
71 Facebook’s transparency report reveals that between January and June 2017 the company received 32,716 requests for information from different US state authorities. ‘Government Requests for User Data’ (Facebook Transparency) https://transparency.facebook.com/country/United%20States/2017-H1/ accessed 25 August 2020. In the same period, Twitter received more than 2,000 requests. ‘Information Requests’ (Twitter Transparency) https://transparency.twitter.com/en/information-requests.html accessed 25 August 2020. The US authorities are not the only ones making use of private actors’ data. As Facebook’s and Twitter’s websites demonstrate, virtually every state collaborates with the large data holders.
72 See more in Schneier (n 42) ch 6.
right to object (Article 21). Thus, their limitations and prospects are, to a large extent, reflected in the analysis of the rights in Articles 17 and 21.
Finally, this book does not (systematically) investigate legal enforcement of the rights, apart from those cases that have already been adjudicated by the CJEU and examples of judgments from different Member States. This is because a thorough analysis of legal enforcement across all Member States would require a considerably broader and longer study which goes beyond the scope of this work. That said, an in-depth analysis of the national law enforcement efforts could be the subject of important follow-up research.
1.5 Introducing the main concepts
This book operates with a few key concepts. Therefore, their meaning needs to be clarified at the outset. These concepts are data subjects, data subject rights, and data-driven economy.
Data subjects are identified or identifiable natural persons who can, in particular, be identified, directly or indirectly, by reference to an identification number or to one or more factors specific to their physical, physiological, mental, economic, cultural, or social identity.73 Chapter 2 explains each component of the definition of a data subject as provided in EU law. This book uses the term ‘data subjects’ interchangeably with the terms ‘individuals’ and ‘consumers’ as, in most situations, they overlap. When the difference is decisive, however, this is indicated.
Data subject rights (also referred to as control, micro, or subjective rights) normally refer to the group of rights that data protection law grants to individuals, such as the right to access and the right to object. These rights have formed the core of data protection law since its beginnings and have found their way into a large number of data protection statutes all over the world.74 Chapters 4 to 8 of this book describe their historical development, their ethical and economic justifications, and the current legal framework.
Data-driven economy serves as an umbrella term for businesses that perform data (re)use as their core activity. Since the term appeared only recently, its definition is not yet fully established. Nonetheless, it is important to bear in mind that the focus of the data-driven economy is always on the secondary use of data, i.e. business models that seek to utilise the existing data in innovative and profitable ways. Therefore, business activities in which data is only generated, collected, or stored are of less interest.
73 GDPR art 4(1).
74 Compare Frederik J Zuiderveen Borgesius ‘Improving Privacy Protection in the Area of Behavioural Targeting’ (PhD Thesis, University of Amsterdam 2014) 88 and 133.
The terms data-driven economy, big-data economy, and data economy are used interchangeably in this book. However, data-driven is not always a synonym for big data. Data-driven refers to business models that use data as a resource, regardless of what type of data it is (structured or unstructured, handled manually or processed by computer, personal or industry data, etc.). Big data refers specifically to vast amounts of data which are particularly useful as a source for analytics and data-driven decision-making. Thus, big data is only one part of all the possible types and uses of data. However, in today’s economic reality, big data sets are those that are most likely to accrue value and where secondary results occur most easily, e.g. as marketing predictions or sales of data.75 Big data is what data-driven companies are most interested in. For this reason, data-driven most often equates to big data.
1.6 Structure
Chapter 1 began with an introductory section on the data-driven economy and the risks this new type of economy poses to individuals, providing readers with the necessary background information. Section 1.3 explained how the regulator responded to the changes in the developing data economy, emphasising the importance of data subject rights within the updated regulatory framework; it also clarified that the aim of this book is to provide a thorough analysis of data subject rights, with a particular concern for their application in data-driven environments. Section 1.4 delineated the scope of the book. Section 1.5 introduced three key concepts that are used throughout the book: data subjects, data subject (control) rights, and the data-driven economy. This section, 1.6, outlines the structure of the book.
Chapter 2 outlines the provisions of EU law that are aimed at protecting individuals and personal data in the data-driven economy. The first part of the chapter deals with primary sources such as human rights provisions; the second part turns to secondary legislation, paying special attention to EU data protection law. As mentioned above, non-EU legal sources and national legislation are, as a matter of principle, excluded from the scope of the legal framework.
Chapter 3 discusses the concept of individual control. Most importantly, the chapter explains why data protection law is considered one of the most evident expressions of control over personal data. Furthermore, the chapter structures the GDPR’s control-related provisions and sets the stage for their further analysis.
75 Mayer-Schönberger and Cukier (n 4).