
King’s Research Portal
DOI: 10.1080/1369118X.2016.1186713
Document Version: Peer reviewed version
Link to publication record in King's Research Portal
Citation for published version (APA): Yeung, K. (2017). ‘Hypernudge’: Big Data as a mode of regulation by design. Information, Communication & Society, 20(1), 118-136. https://doi.org/10.1080/1369118X.2016.1186713

‘Hypernudge’: Big Data as a Mode of Regulation by Design
by
Karen Yeung
Abstract
This paper draws on regulatory governance scholarship to argue that the analytic phenomenon currently
known as ‘Big Data’ can be understood as a mode of ‘design-based’ regulation. Although Big Data
decision-making technologies can take the form of automated decision-making systems, this paper
focuses on algorithmic decision-guidance techniques. By highlighting correlations between data items
that would not otherwise be observable, these techniques are being used to shape the informational
choice context in which individual decision-making occurs, with the aim of channelling attention and
decision-making in directions preferred by the ‘choice architect’. By relying upon ‘nudge’, a particular form of choice architecture that alters people’s behaviour in a predictable way without forbidding any options or significantly changing their economic incentives, these techniques constitute a ‘soft’ form of design-based control. But, unlike the static Nudges popularised by Thaler and Sunstein (2008), such as placing the salad in front of the lasagne to encourage healthy eating, Big Data analytic nudges are extremely powerful and potent due to their networked, continuously updated, dynamic and pervasive nature (hence ‘hypernudge’). I adopt a liberal, rights-based critique of these techniques, contrasting liberal theoretical accounts on the one hand with selective insights from science and technology studies (STS) and surveillance studies on the other. I argue that concerns about the legitimacy of these techniques are not satisfactorily resolved through reliance on individual notice and consent, touching upon the troubling implications for democracy and human flourishing if Big Data analytic techniques driven by commercial self-interest continue their onward march unchecked by effective and legitimate constraints.
Professor of Law, Director, Centre for Technology, Ethics, Law & Society (TELOS), The Dickson Poon
School of Law, King’s College London. An earlier version of this paper was presented at Algorithms and
Accountability, an international conference hosted by the NYU Law Institute and the Department of Media, Culture
and Communications at New York University, New York, 28 February 2015. I am grateful to Joris van Hoboken
and Helen Nissenbaum for hosting my visit and providing me with an opportunity to discuss my ideas at such an
immensely stimulating forum. I am also indebted to Barbara Prainsack, Roger Brownsword, Lyria Bennett Moses,
John Coggan, Alessandro Spina and Chris Townley for comments on earlier drafts. All errors remain my own.

1. Introduction
It is claimed that society stands at the beginning of a New Industrial Revolution, powered by the engine
of Big Data. This paper focuses on how industry is harnessing Big Data to transform personal digital data
into economic value, a practice described by one leading cyberlawyer as the ‘latest form of bioprospecting’
(Cohen 2012). Although the term ‘Big Data’ is widely used, no universal definition has yet emerged.
Big Data is essentially shorthand for the combination of a technology and a process (Cohen 2012: 1919).
The technology is a configuration of information-processing hardware capable of sifting, sorting and
interrogating vast quantities of data very quickly. The process involves mining data for patterns,
distilling the patterns into predictive analytics, and applying the analytics to new data. Together, the
technology and the process comprise a methodological technique that utilises analytical software to
identify patterns and correlations through the use of machine learning algorithms applied to (often
unstructured) data items contained in multiple data sets, converting these data flows into a particular,
highly data-intensive form of knowledge (Cohen 2012: 1919). A key contribution of Big Data is the
ability to find useful correlations within datasets not capable of analysis by ordinary human assessment (Shaw
2014). As boyd and Crawford observe, ‘Big Data’s value comes from patterns that can be derived from making connections about pieces of data, about an individual, about individuals in relation to others, about groups of people, or simply about the structure of information itself’ (boyd and Crawford 2012: 662). Big Data is important, on this view, because it refers to an analytic phenomenon playing out in academia and industry, and it is this understanding of Big Data as a methodological approach and an analytic phenomenon that this paper adopts.
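To make the technology-plus-process description above concrete, the following is a minimal sketch, in Python, of the three steps Cohen describes: mining (often unstructured) data items for patterns, distilling the patterns into predictive analytics, and applying the analytics to new data. The library (scikit-learn), the toy data items and the model choice are my illustrative assumptions, not details drawn from the paper.

# Minimal illustrative sketch of the three-step process described above:
# (1) mine (often unstructured) data items for patterns, (2) distil the
# patterns into a predictive model, (3) apply the model to new data.
# The data, features and model choice are hypothetical assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# (1) unstructured data items drawn from multiple data sets (hypothetical)
data_items = ["viewed running shoes", "searched payday loans", "read salad recipes",
              "compared trainer prices", "read debt advice forum"]
outcome = [1, 0, 1, 1, 0]  # observed behaviour paired with each data item

# (2) distil correlations between the data items into predictive analytics
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(data_items, outcome)

# (3) apply the analytics to new data to predict behaviour
print(model.predict_proba(["searched trail running gear"]))

Real deployments involve vastly larger and continuously updated data flows; the point is only that the resulting ‘knowledge’ is a set of learned correlations applied predictively to fresh data.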
I argue that Big Data’s extensive harvesting of personal digital data is troubling, not only due to its implications for privacy, but also due to the particular way in which that data is being utilised to shape individual decision-making to serve the interests of commercial Big Data barons. My central claim is that, despite the complexity and sophistication of their underlying algorithmic processes, these applications ultimately rely on a deceptively simple design-based mechanism of influence: nudge. By configuring and thereby personalising the user’s informational choice context, typically through algorithmic analysis of data streams from multiple sources claiming to offer predictive insights concerning the habits, preferences and interests of targeted individuals (such as those used by on-line consumer product recommendation engines), these nudges channel user choices in directions preferred by the choice architect through processes that are subtle, unobtrusive, yet extraordinarily powerful. Characterising Big Data analytic techniques as a form of nudge provides an analytical lens for evaluating their persuasive, manipulative qualities and their legal and political dimensions. I draw on
insights from regulatory governance scholarship, behavioural economics, liberal political theory,
information law scholarship, Science & Technology Studies (STS) and surveillance studies to suggest
that, if allowed to continue unchecked, the extensive and accelerating use of commercially driven Big
Data analytic techniques may seriously erode our capacity for democratic participation and individual
flourishing.
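Read as a mechanism, the nudging described in this paragraph amounts to re-ordering, rather than removing, the options a user sees. The sketch below is a hypothetical illustration, assuming a predicted-appeal score for each option and a weighting chosen by the choice architect; the option names and numbers are my own assumptions, not examples from the paper.

# Hypothetical sketch of a personalised informational choice context:
# no option is forbidden, but options are re-ranked so that the outcomes
# preferred by the choice architect are made most salient to this user.
# Scores, weighting and option names are illustrative assumptions.

def personalise_choice_context(options, predicted_appeal, architect_preference, weight=0.5):
    """Return all options, ordered by a blend of the user's predicted appeal
    and the choice architect's own preference (a 'soft' form of control)."""
    def salience(option):
        return (1 - weight) * predicted_appeal[option] + weight * architect_preference[option]
    return sorted(options, key=salience, reverse=True)

options = ["own-brand add-on", "rival product", "no purchase"]
predicted_appeal = {"own-brand add-on": 0.4, "rival product": 0.5, "no purchase": 0.1}
architect_preference = {"own-brand add-on": 1.0, "rival product": 0.2, "no purchase": 0.0}

print(personalise_choice_context(options, predicted_appeal, architect_preference))
# ['own-brand add-on', 'rival product', 'no purchase']

Because the predicted-appeal scores can be recomputed as new data streams arrive, the re-ranking can be refreshed continuously for each user, which is what makes such nudges dynamic rather than static.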
2. Big Data as a form of design-based regulation
My analysis begins by explaining how Big Data algorithmic techniques seek systematically to influence
the behaviour of others, drawing on a body of multidisciplinary scholarship concerned with
interrogating ‘regulatory governance’ regimes and various facets of the regulatory governance process.
2.1 Design-based regulatory techniques
Regulation or regulatory governance is, in essence, a form of systematic control intentionally aimed at
addressing a collective problem. As Julia Black puts it, ‘[r]egulation, or regulatory governance, is the organised attempt to manage risks or behaviour in order to achieve a publicly stated objective or set of objectives’ (Black 2014: 2), a definition that amalgamates various refinements Black has offered over time (see Black 2001; Black 2008: 139; Black 2014: 2). Many scholars analyse regulation as a cybernetic process involving three core components that form the basis of any control system: ways of gathering information
(‘information gathering and monitoring’); ways of setting standards, goals or targets (‘standard-
setting’); and ways of changing behaviour to meet the standards or targets (‘behaviour modification’)
(Hood et al 2001). Within this literature, the techniques employed by regulators to attain their desired
social outcome are well established as an object of study (Morgan and Yeung 2007). While legal
scholars tend to focus on traditional ‘command and control’ techniques in which the law prohibits
specified conduct, backed by coercive sanctions for violation, cyberlawyers and criminologists have
explored how ‘design’ (or ‘code’) operates as a regulatory instrument (Lessig 1999; Zittrain 2007; von
Hirsh et al 2000; Clarke and Newman 2005). Although design and technology can be employed at the
information-gathering phase (e.g. the use of CCTV cameras to monitor behaviour) and the behaviour modification phase of the regulatory cycle (e.g. car alarms which trigger if unauthorised interference is detected), design-based regulation embeds standards into design at the standard-setting stage in order to
detected), design-based regulation embeds standards into design at the standard-setting stage in order to
foster social outcomes deemed desirable (such as ignition locking systems which prevent vehicle engines
from starting unless the occupants’ seatbelts are fastened), thus distinguishing design-based regulation
from the use of technology to facilitate regulatory purposes more generally (Yeung 2008; Yeung 2016).
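The cybernetic framing and the seatbelt-interlock example can be expressed as a short sketch; the sensor, standard and interlock below are hypothetical stand-ins for the three components (information gathering, standard-setting, behaviour modification), not details from the original text.

# Illustrative sketch of the three-component control system described above,
# using the seatbelt-interlock example: the standard is embedded in the
# design itself, so non-compliant behaviour is simply prevented.

SEATBELT_STANDARD = True  # standard-setting: fixed at design time

def gather_information(sensors):  # information gathering and monitoring
    return sensors["seatbelts_fastened"]

def modify_behaviour(observed):  # behaviour modification
    return observed == SEATBELT_STANDARD  # engine may start only if the standard is met

sensors = {"seatbelts_fastened": False}
engine_enabled = modify_behaviour(gather_information(sensors))
print(engine_enabled)  # False: the design forces compliance before the engine will start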
2.2 Choice architecture and ‘nudge’ as instruments for influencing behaviour
Since 2008, considerable academic attention has focused on one kind of design-based approach to shaping behaviour, the ‘nudge’, thanks to Thaler and Sunstein, who claim that a nudge is ‘any aspect of choice architecture that alters people’s behaviour in a predictable way without forbidding any options or significantly changing their economic incentives’ (Thaler and Sunstein 2008: 6). The intellectual
heritage of Nudge rests in experiments in cognitive psychology which seek to understand human
decision-making, finding considerable divergence between the rational actor model of decision-making
assumed in microeconomic analysis and how individuals actually make decisions due to their pervasive
use of cognitive shortcuts and heuristics (Kahneman and Tversky 1974; 1981). Critically, much
individual decision-making occurs subconsciously, passively and unreflectively rather than through
active, conscious deliberation (Kahneman 2013). Drawing on these findings, Thaler and Sunstein
highlight how the surrounding decisional choice context can be intentionally designed in ways that
systematically influence human decision-making in particular directions. For example, to encourage
customers to choose healthier food items, they suggest that cafeteria managers place the healthy options
more prominently, such as placing the fruit in front of the chocolate cake (Thaler and Sunstein 2008:
1). Due to the ‘availability’ heuristic and the influence of ‘priming’, customers will systematically tend
to opt for the more ‘available’ healthier items.
2.3 Big Data analytics as informational choice architecture
To understand how Big Data analytic techniques utilise nudge, we can distinguish two broad configurations of Big Data-driven digital decision-making processes (a brief sketch contrasting the two follows item (b) below):
(a) automated decision-making processes: Many common transactions rely upon automated decision-making
processes, ranging from ticket dispensing machines to highly sophisticated techniques used by some
financial institutions offering consumer credit, such as pay-day loan company Wonga
(https://www.wonga.com/loans-online). Although these decision processes vary widely in complexity and sophistication, and not all of them rely on Big Data-driven analytics, they automatically issue some kind of ‘decision’ without any need for human intervention beyond user input of relevant data (or data tokens), and thus constitute a form of action-forcing (or coercive) design (Brownsword 2006; Yeung & Dixon-Woods 2010); and
(b) digital decision guidance processes: In contrast, digital decision ‘guidance’ processes are designed so that
it is not the machine, but the targeted individual, who makes the relevant decision. These technologies
seek to direct or guide the individual’s decision-making processes in ways identified by the underlying
software algorithm as ‘optimal’, by offering suggestions intended to prompt the user to make decisions
preferred by the choice architect (Sellinger and Seager 2012).
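The contrast between configurations (a) and (b) can be summarised in the hypothetical sketch below; the credit-scoring rule, threshold and loan options are my illustrative assumptions. An automated process issues the decision itself, whereas a decision-guidance process only ranks suggestions and leaves the choice with the individual.

# Hypothetical sketch contrasting the two configurations described above.
# The scoring rule, threshold and loan options are illustrative assumptions.

def risk_score(applicant):
    # toy predictive score standing in for a Big Data-driven model
    return 0.6 * applicant["repayment_history"] + 0.4 * applicant["income_stability"]

def automated_decision(applicant, threshold=0.5):
    # (a) automated decision-making: the system issues the decision itself,
    # with no human intervention beyond the applicant's data input
    return "approve" if risk_score(applicant) >= threshold else "decline"

def decision_guidance(applicant, loan_amounts):
    # (b) decision guidance: the system merely ranks and suggests options;
    # the targeted individual still makes the relevant decision
    target = 1000 * risk_score(applicant)
    ranked = sorted(loan_amounts, key=lambda amount: abs(amount - target))
    return {"suggested": ranked[0], "alternatives": ranked[1:]}

applicant = {"repayment_history": 0.8, "income_stability": 0.5}
print(automated_decision(applicant))                   # 'approve'
print(decision_guidance(applicant, [500, 700, 1500]))  # suggests 700 first

The point of the contrast is that (b) preserves the formal freedom to choose while still steering the choice toward the outcome preferred by the choice architect.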

References
•Judgment Under Uncertainty: Heuristics and Biases.
•The Framing of Decisions and the Psychology of Choice.
•World Medical Association Declaration of Helsinki: Ethical Principles for Medical Research Involving Human Subjects (19 Dec 2000).
•Thinking, Fast and Slow.