
Chapter 3. The dark (pattern) ages


Last thing I remember

I was running for the door

I had to find the passage back

To the place I was before

‘Relax’, said the night man

‘We are programmed to receive

You can check out any time you like

But you can never leave!’

‘Hotel California’, The Eagles (1977)


3.1. Introduction

The concept of ‘design patterns’ originates from architecture and refers to documented and proven solutions to (architectural) design problems.1 The idea of design patterns has since found its way into the field of digital architectures, starting with software development2 and spreading to various fields of computer science and software engineering: distributed architectures, cybersecurity, privacy and user interface (UI) design.3 Over time, system designers started taking note of commonly used, inefficient design patterns that seemed to have more adverse consequences than desirable ones, and called them ‘anti-patterns’.4 Design practices deemed to be anti-patterns do not have to start off as such – what was once deemed a state-of-the-art design solution may simply become deprecated with time.5

Sometime in 2009, user experience (UX) expert Harry Brignull started noticing instances of ‘bad’ UI design that did not seem to be mistakes, but rather appeared to have been designed on purpose to trick the user into doing something they otherwise would not do and to benefit the company deploying the designs.6 In 2010, he proposed that such design practices could be referred to as ‘dark patterns’.7 In August 2010, he launched darkpatterns.org,8 a website devoted to educating internet users about dark patterns and shaming companies that use them. One way to distinguish between anti-patterns and dark patterns, proposed by Bosch et al., is to think of anti-patterns as the ‘Don’ts’ for good intentions, and of dark patterns as the potential ‘Dos’ for malicious intentions.9

Since 2010, researchers in the field of Human-Computer Interaction (HCI) – an interdisciplinary domain of study that draws on computer science, behavioural, sociological, anthropological and industrial design insights to investigate how we engage with technologies10 – have been documenting the use of dark patterns for various purposes on a multitude of digital platforms: social media platforms, mobile apps, video games, video-streaming platforms and e-commerce websites, to name a few.

While this is a thriving and growing field of inquiry that has produced a great wealth of knowledge about potentially problematic instances of UI design, it is much less clear what regulators could do to address the proliferation of dark patterns. To lay the groundwork for the regulatory discussion in the following chapters, this chapter will define dark patterns for the purposes of this study; explain how they work; illustrate why they may prove to be an elusive regulatory target; and provide an overview of the behavioural insights on users’ reactions to dark patterns, which are helpful for assessing whether regulators ought to be concerned about dark patterns.

The rest of the chapter is structured as follows. Section 3.2 starts by looking at dark patterns from an HCI perspective, engaging with the changing meaning of the concept and the evolving terminology in this body of literature, and defining dark patterns for the purposes of this study. It then explains how these practices work from a behavioural sciences perspective; this understanding is necessary for the assessment of whether dark patterns ought to be regulated and mapping policy solutions. Section 3.3 starts by providing an overview of prior studies that have documented dark patterns, in order to illustrate the breadth of the dark patterns problem, which may pose challenges for regulators. It then introduces the dark patterns my investigation is concerned with – the Shopping dark patterns11 that Mathur et al. (2019) found in their study of e-commerce websites.12 Section 3.4 outlines the findings of behavioural studies that probe users’ reactions to Shopping dark patterns. Section 3.5 summarises these discussions, and reflects on what the future might hold for dark patterns.

3.2. What are dark patterns?

3.2.1. The ongoing quest for a definition of dark patterns

As the chapter introduction notes, the last decade has witnessed an explosion in studies documenting dark patterns on various digital platforms. Many such studies propose their own definitions of dark patterns. There are, however, two obstacles to pinning down the meaning of the term ‘dark pattern’: conceptual inconsistencies in this body of literature (sub-section A) and evolving terminology (sub-section B). Sub-section C defines dark patterns for the purposes of my investigation.

A. Conceptual inconsistencies and ways to overcome them

While prior literature is not short of attempts to define dark patterns, it is also not short of conceptual inconsistency. In a 2021 paper, Mathur, Mayer and Kshirsagar (hereafter Mathur et al. (2021))13 compared 19 definitions of dark patterns that had been found in prior studies and government materials, with the aim of providing analytical clarity and outlining the normative foundations for research on dark patterns in HCI. Their investigation showed that the compared definitions differed in terms of the following: the UI characteristics that can affect users; the mechanism of effect for influencing users; the role of the interface designer (although most definitions require intention); and the benefits and harms resulting from the user interface design.14 The analysis also revealed within-study and across-studies discrepancies between the proposed definitions and the types of dark patterns described in prior work.15 Further, the researchers found that while the dark patterns literature has produced many rich descriptions of potentially problematic design practices, it has engaged to a lesser extent with normative considerations as to what makes a dark pattern problematic.16 The researchers argue that the lack of conceptual consistency in the field and the lack of explicit discussions of normative concerns may prevent regulatory action with regard to dark patterns;17 this concern is echoed by Gunawan et al.18 I discuss ways to provide more conceptual consistency here, and in section 3.5 I reflect on the dearth of normative engagement in this body of literature.

To help alleviate the definitional murkiness surrounding the concept of dark patterns, Mathur et al. (2021) suggest referring to the high-level, functional (i.e. referring to the mechanism of influence on the user) attributes of dark patterns developed in Mathur et al. (2019).19 According to Mathur et al. (2019), dark patterns may be asymmetric, covert, deceptive, information-hiding and/or restrictive.20 Asymmetric dark patterns impose unequal burdens on the choices available to the user by making the choices that benefit the trader more prominent. Covert dark patterns steer users’ decisions without their knowledge by weaponising their cognitive biases or through the use of colour and style. Deceptive dark patterns induce false beliefs in users either through affirmative misstatements, misleading statements or omissions. Information-hiding dark patterns obscure or delay the presentation of necessary information to the user. Restrictive dark patterns reduce the choices available to the user, or eliminate those choices altogether. Some dark patterns may exhibit several of these attributes.21 Mathur et al. (2021) propose a further categorisation based on how dark patterns modify choice architecture for users. ‘Choice architecture’ is a term coined by Thaler and Sunstein in their seminal work on behavioural nudges, and it refers to the context in which people make decisions.22 Deceptive and information-hiding dark patterns manipulate the flow of information to users. Asymmetric, covert and restrictive dark patterns modify the set of choices (decision space) available to users.23 I return to these strategies for influencing user behaviour, and discuss why they might work, in section 3.2.2.

By illustrating how the types of dark patterns identified in prior literature and policy documents can be described in terms of their attributes, Mathur et al. (2021) show how these themes (the modification of information flow or decision space) can be used to bring the current dark patterns scholarship together into a coherent whole. They propose that in order to add coherence and consistency to this body of literature, dark patterns scholarship should be viewed as a set of interrelated thematic concepts.24 Some papers that have been published since then, and that introduce new types of dark patterns, have heeded researchers’ calls for greater consistency and have used Mathur et al.’s attributes.25

Alternative ways of systematising this body of literature have also emerged since 2021. In 2023, Gray et al. presented a working ontology of dark patterns, based on policy documents and academic literature.26 Their work was motivated by discrepancies in the terminology used by academics, policymakers and regulators to describe the same practices – often without adequate citation of past work – as well as the domain specificity of many dark pattern descriptions, which may make it more difficult to find cross-domain commonalities.27 The ontology maps different concepts referring to similar practices in 10 prominent taxonomies of dark patterns, introduces unified concepts, and proposes a hierarchical ordering of dark patterns into high-, meso- and low-level pattern types.28 High-level patterns reflect general strategies used to influence user behaviour, based on Gray et al.’s influential 2018 taxonomy of dark patterns29 (I discuss this in section 3.3). High-level strategies are context and platform agnostic. One such strategy is Sneaking, which refers to hiding, disguising or delaying the disclosure of important information to the user. Meso-level patterns describe a specific, context-agnostic angle of attack. An example of a Sneaking meso-level dark pattern is Hiding Information, which refers to disclosing relevant information to the user at a late stage, potentially preventing them from making a different choice. Lastly, the term low-level dark patterns refers to specific means of execution that may influence users and are described in visual and/or temporal forms. Hidden Costs, which entails disclosing charges to the users late in the purchasing process and is used in contexts such as e-commerce, is an example of a Hiding Information strategy.30
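To make this three-level ordering concrete, the sketch below (my own illustration in Python, not code or data from Gray et al.) encodes the single Sneaking → Hiding Information → Hidden Costs chain described above as a nested mapping, with a small helper that walks from a high-level strategy down to its low-level patterns.

```python
# Illustrative only: a nested mapping encoding the Sneaking -> Hiding Information ->
# Hidden Costs chain from Gray et al.'s (2023) ontology, as summarised in the text.
# The data structure and helper are my own sketch, not the authors' materials.
ontology = {
    "Sneaking": {  # high-level: context- and platform-agnostic strategy
        "description": "Hiding, disguising or delaying the disclosure of important information",
        "meso": {
            "Hiding Information": {  # meso-level: specific, context-agnostic angle of attack
                "description": "Disclosing relevant information to the user at a late stage",
                "low": {
                    # low-level: concrete means of execution, described in visual/temporal form
                    "Hidden Costs": "Disclosing charges late in the purchasing process (e.g. e-commerce checkouts)",
                },
            },
        },
    },
}

def low_level_patterns(strategy: str) -> list[str]:
    """Return the low-level patterns recorded under a given high-level strategy."""
    meso_levels = ontology.get(strategy, {}).get("meso", {})
    return [low for meso in meso_levels.values() for low in meso.get("low", {})]

print(low_level_patterns("Sneaking"))  # ['Hidden Costs']
```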

I explain in sub-section C below, which defines dark patterns for the purposes of my investigation, how I engage with these attempts to systematise dark pattern scholarship. To sum up the discussion so far: new dark patterns and new definitions, as well as ways to systematise current descriptions, are constantly emerging. It is unlikely that a definition, taxonomy or attempt to systematise taxonomies will reign supreme for long in light of the ongoing innovation in digital architectures.31 Regulating dark patterns well is a tall order for policymakers, who have to deal with a moving regulatory target. I return to this issue in Chapter 4.

B. Evolving terminology

Another relatively recent phenomenon that is liable to further decrease conceptual cohesion in the field is the objection of some scholars to the use of the term ‘dark’ to depict the maliciousness of dark patterns; this is based on the exclusionary effects of the duality of light/dark as representing good/bad. As Sinders points out, discussions of decolonisation and identity politics were in a very different place in 2010, when the term ‘dark patterns’ was coined, from where they are today.32 There are also researchers and practitioners who argue that the term ‘dark’ is not sufficiently descriptive to convey the practices it refers to, and may be confusing, as dark patterns can involve the use of bright colours.33

Against this background, Harry Brignull has switched to using the term ‘deceptive design’ and renamed his website and Twitter/X account denouncing the use of dark patterns by companies accordingly. The Web Foundation34 and Mozilla Foundation35 have followed in his footsteps. Kat Zhou, a Spotify product designer who advocates ethical design,36 has instead started using the term ‘anti-patterns’.37 Neither term is ideal, however.

The use of the term ‘deceptive’ in relation to practices that researchers have, in the past, identified as dark patterns is problematic for two reasons. First, deceptiveness carries a normative weight, as it has a distinct, established legal meaning in various fields of law,38 and to equate dark patterns with deceptive patterns may imply that they can be deemed clearly illegal, whereas that assessment depends on the dark pattern in question and the relevant legal system. This definitional fallacy in policy-oriented research (read as: research attempting to influence policy) has been pointed out in the context of advertising research by Richards, who noted that ‘a study can have both internal validity and external validity, yet still have no legal validity’ when legal terms are adopted in non-legal literature.39 Second, as discussed in the previous section, deceptiveness is not a property of all dark patterns.

Likewise, the use of the term ‘anti-patterns’ in this context does not lead to more conceptual clarity. As discussed in the chapter introduction, ‘anti-patterns’ is also a term of art in the HCI field, and it is usually used to describe negligent bad design practices. While the question of whether an element of intention is required for a practice to amount to a dark pattern is still disputed in the literature, referring to anti-patterns instead of dark patterns does not solve this debate, as it implies that the development of dark patterns is always incidental in the process of UI design.

Changing terminology can lead to less understanding of which kinds of design choices are deemed problematic in the HCI community and why. To illustrate – on 12 September 2022, I posted a tweet denouncing Amazon’s use of Pressured Selling (a Mathur et al. (2019) dark pattern which entails urging users to buy more products)40 in its checkout process to pressure users into signing up for Prime subscriptions. The tweet was retweeted by the @DeceptiveDesign account, asking followers whether the design choice in question is ‘deceptive or “just” manipulative’.41 The question as posed by @DeceptiveDesign reflects the general disconnect between the study of dark patterns and normative considerations noted in the previous sub-section. It implies that dark patterns that are not deceptive may be less problematic or not problematic at all, whereas philosophers typically regard deception as just one type of manipulation, meaning that deceptive (and other) practices are problematic from a philosophical viewpoint because they are manipulative.42 As will be discussed in greater detail in Chapter 5, EU consumer law also acknowledges that commercial practices that are not based on the transmission of information (such as deceptive practices) may be unfair to consumers and therefore unlawful.

A further objection to moving away from the term dark pattern stems from the fact that regulators on both sides of the Atlantic have, as of late, been listening intently to researchers who are concerned about the spread of dark patterns, and have codified the term: the text of the recently adopted Digital Services Act (DSA) in the EU refers to the term ‘dark patterns’ in its Preamble,43 and the 2018 California Consumer Privacy Act (CCPA) was amended in 2020 to specify that consent to personal data collection attained by using dark patterns is invalid.44

C. Dark patterns in this study

All in all, the discussion above illustrates that it is necessary for a study investigating potentially problematic UI design practices to, first of all, choose a term with which to refer to them. As ‘dark patterns’ is now a legal term in EU law, and given its popularity in prior scholarship, this study will continue to use it, while acknowledging that there is a need to address its exclusionary effects, as well as the fact that it is not descriptively accurate; the quest for alternative terminology in HCI scholarship should continue.

It is also clear that it is a methodological necessity to define what dark patterns are for the purpose of a given study. For the purpose of this investigation, following Gunawan et al., the term ‘dark pattern’ will be used to refer to specific types of ‘dark’ UI designs documented in prior scholarship.45 Further, in light of the discussion in Chapter 2 of the technological, economic and organisational considerations behind UI design,46 I do not consider intention a necessary criterion for a practice to be considered a ‘dark pattern’ – a position that contrasts with those of many academic efforts to define dark patterns47 – while at the same time acknowledging that some dark patterns may be designed intentionally. There are two possible issues with this approach. First, the distinction between anti-patterns and dark patterns drawn in prior studies such as Bosch et al.’s might collapse.48 Second, this pragmatic approach may fail to specify what is problematic about dark patterns; this is because past scholarship has varied in the degree to which it has engaged with normative considerations. This engagement is necessary if we are to communicate to policymakers what exactly is problematic about dark patterns. To address these potential drawbacks and to bridge policy and HCI discussions, I approach dark patterns from a regulation theory perspective in Chapter 4. On this view, what we term a practice is not as important as the harms it could lead to, which are specified by regulation theory.

This does not solve the problem of conceptual inconsistency, however. As we saw in sub-section A, researchers (and policymakers) have come up with various ways to refer to the same kinds of practices, leading to conceptual inconsistency which can impede translational efforts, both across discrete studies and in the dialogue between scholars and regulators. To promote conceptual coherence, as well as to add generalisability to the discussions in the following chapters, I will, where relevant, refer to Mathur et al.’s (2021) approach in referring to dark pattern attributes. As an additional measure to ensure conceptual consistency, in section 3.3.2 I discuss the Brignull typology of dark patterns and Gray et al.’s (2018) taxonomy49 (which Gray et al.’s (2023) ontology is based on)50 in relation to the dark patterns I focus on in my study, Mathur et al.’s (2019)51 Shopping dark patterns.

3.2.2. How do dark patterns work?

As seen in the previous section, dark patterns attempt to influence users of digital services by modifying the choice architecture; accordingly, the UK Competition and Markets Authority (CMA) refers to dark patterns as instances of ‘online choice architecture’.52

Minor and seemingly trivial features of choice architecture, whether that architecture be digital or not, can impact our decisions.53 This is because we are imperfect decision-makers. Our bounded rationality54 is often explained by reference to dual process theory, which describes decision-making as the result of two interconnected cognitive systems.55 System 2 thinking is conscious and deliberative, but is also slow and effortful.56 System 1 thinking, on the other hand, is intuitive, emotional, fast and automatic.57 This dual process model is largely built on the experimental work of Kahneman and Tversky58 – work for which Kahneman was later awarded the Nobel Memorial Prize in Economic Sciences – on heuristics, the mental shortcuts we follow to arrive at decisions,59 and on cognitive biases, the systematic and predictable deviations from rational decision-making to which we are prone at various points in the decision-making process.60

System 1 thinking pervades our decision-making processes: we are faced with the need to take decisions every moment of every day, yet attention is a limited resource, so most decisions are based on automatic assessments.61 At the same time, System 1 relies on heuristics, and its cognitive devices can lead it to biased appraisals.62 These biased appraisals can be corrected by System 2, but for the most part are not, due to the effort that this would require.63 Since Kahneman and Tversky published their seminal paper on heuristics and biases in 1974,64 behavioural scientists have uncovered numerous cognitive biases that affect our decision-making; I provide some examples of these when discussing dark patterns below.

Choice architects – in our case, online businesses – can leverage this knowledge when designing online choice architectures in order to steer our decisions in a particular direction. As the UK CMA points out, ‘choice architecture’ is a neutral term in that it does not necessarily imply that the architecture may be of disservice to its users.65 Well-designed choice architecture can ensure that we are able to navigate digital spaces with intuitive ease;66 in other words, choice architecture can promote usability.67 Choice architecture can, however, also seek to weaponise our behavioural biases against us to promote decisions that serve a business’s interest. This is what many dark patterns seek to do. Several dark patterns studies and policy documents have drawn an explicit link between particular dark patterns and behavioural biases.68 Dark patterns may target well-known behavioural biases such as the default effect (our tendency to stick with defaults), the scarcity bias (our tendency to value scarce things more) and the bandwagon effect (our tendency to view something as more valuable because someone else sees it as such)69 amongst others.70 Dark patterns may also leverage the unique ways in which we interact with information online: researchers have shown that we interact with digital interfaces in a task-oriented way, which may lead us to ignore certain kinds of content;71 that we may be prone to skim – rather than read – online text; and that we may act more quickly.72
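To give a concrete (and entirely hypothetical) sense of how such bias-exploiting choice architecture can be produced in interface code, the short Python sketch below generates three interface fragments that map onto the biases just mentioned: a pre-ticked add-on (default effect), a low-stock banner (scarcity bias) and an activity notification (bandwagon effect). The markup, copy and function names are my own invented examples, not the code of any real website or of the studies cited in this chapter.

```python
# Minimal, hypothetical sketch (not taken from any cited study) of how interface
# elements can be generated to play on the biases discussed above. Each helper
# returns a fragment of interface markup; none of it reflects a real retailer's code.

def preselected_addon(label: str) -> str:
    # Default effect: the paid add-on is ticked before the user has chosen anything.
    return f'<label><input type="checkbox" name="addon" checked> {label}</label>'

def low_stock_banner(items_left: int) -> str:
    # Scarcity bias: the figure shown need not correspond to actual inventory,
    # which is what makes some low-stock messages covert or deceptive.
    return f'<p class="urgency">Hurry, only {items_left} left in stock!</p>'

def activity_notification(other_buyers: int) -> str:
    # Bandwagon effect: signalling that other users are buying the same product.
    return f'<p class="social-proof">{other_buyers} people bought this in the last hour</p>'

if __name__ == "__main__":
    print(preselected_addon("Add damage protection (+ EUR 4.99)"))
    print(low_stock_banner(2))
    print(activity_notification(37))
```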

However, the fact that dark patterns try to influence us by modifying the online choice architecture and by exploiting behavioural biases still says nothing about either the success of these techniques or their harmfulness. I return to these considerations in sections 3.4 and 3.5, respectively. Let us now look at what kinds of platforms use dark patterns, and for what purposes.

3.3. Dark patterns in e-commerce and beyond

As noted in the chapter introduction, since Brignull coined the term ‘dark pattern’, a multitude of studies documenting the use of dark patterns by digital businesses and proposing dark pattern taxonomies have been published. This section starts in sub-section 3.3.1 with a discussion, based on a review of HCI literature, of what kinds of businesses have been found to use dark patterns, and for what purposes they have done so.73 The main goals of this discussion are to illustrate the breadth of the dark patterns problem, which may pose issues for regulators, and to delineate the scope of my study. Sub-section 3.3.2 then introduces the dark patterns my study is concerned with: the Shopping dark patterns identified by Mathur et al. (2019).74 In order to promote conceptual consistency, it delineates the attributes of these practices and their relationship to other prominent dark pattern taxonomies (i.e. Brignull’s darkpatterns.org75 and Gray et al.’s (2018)),76 followed by a discussion of the potential limitations stemming from the narrow focus of my analysis.

3.3.1. The dark corners of digital markets

Based on their aims, studies documenting and categorising dark patterns found on digital platforms can be roughly divided into three categories: general purpose, purpose driven and platform driven. General-purpose studies do not distinguish between different dark pattern aims, nor the platforms on which they are present, but rather discuss instances of manipulative design that have been found ‘in the wild’ or at a high level. The goal of purpose-driven studies is to investigate dark patterns that have a specific aim, e.g. the collection or retention of personal data, without reference to a specific digital environment. Platform-driven studies research dark patterns in a given digital environment or on several platforms; the dark patterns found by these studies can therefore serve several purposes.

A. General-purpose studies

The first general-purpose dark pattern taxonomy was created by Brignull. The original dark pattern examples were sourced from various websites he had visited. Looking at the practices included in the 2011 and 2022 lists of dark patterns on darkpatterns.org reveals that Brignull’s taxonomy has remained largely unchanged over the years, with a few exceptions. These include some renamings (e.g. Silent Credit Card Roll-Over, which entails requiring a credit card for a free trial, has been renamed Forced Continuity); the removal of Forced Disclosure – i.e. requiring unnecessary registration on a website – from the list (this is most likely subsumed under Privacy Zuckering now, which, as defined by Brignull, refers broadly to techniques used to maximise data collection); and the addition of Confirmshaming (using language to guilt the user into a certain action) and Misdirection (directing users’ attention to some interface element).77 Brignull’s typology of dark patterns includes General Usage dark patterns,78 i.e. those that may be used for various purposes (such as Bait and Switch, which broadly means that an intended user action leads to an undesirable result, but also Misdirection). It also contains Shopping dark patterns, such as the practice of ‘dripping costs’ throughout the checkout process on a website (Hidden Costs) or adding unsolicited products to a virtual cart (Sneak into Basket), and Privacy dark patterns that may induce users to share (more) personal data (e.g. Privacy Zuckering).

A concurrent work by Conti and Sobiesk published in 2010, before the term ‘dark pattern’ entered common UI design parlance, catalogued types of ‘malicious interface design’ that were reported by students and participants at a hacker conference based on their use of websites, desktop software and interfaces off the desktop (e.g. gas station pumps, arcade games).79 Their taxonomy contains overlaps with Brignull’s (e.g. Distraction, which entails drawing the user’s attention away from an ongoing task, describes the same tactics as Misdirection). At the same time, their paper contains very few details on how the practices were sourced, and no examples, which might explain its limited uptake in the studies that followed. Further, some of the described practices, e.g. Silent/Invisible Behaviour, which entails installing unsolicited software on a user’s device, refer to the exploitation of security rather than user vulnerabilities (possibly due to the researchers having interviewed hackers as part of their data collection strategy).

Gray et al.’s 2018 paper represents the first attempt to systematise the study of dark patterns across platforms.80 The researchers used Brignull’s taxonomy as a starting point, extended it with additional examples of dark patterns sourced from social media websites, technology news websites and UX practitioner blogs, and collapsed several patterns into high-level categories based on strategies and potential designer motivations.81 Gray et al.’s (2018) taxonomy, due to its platform- and purpose-neutrality, has been popular amongst researchers seeking to identify dark patterns on specific platforms, as the next section shows. In light of the foundational nature of both Brignull’s and Gray et al.’s (2018) taxonomies, Table 1 below describes the dark patterns in Gray et al.’s (2018) taxonomy and how they relate to the Brignull typology, in order to give context for the studies discussed hereafter and to promote conceptual consistency.

Gray et al. (2018) categories and their sub-categories (sub-categories taken over from Brignull’s taxonomy are marked ‘(Brignull)’); each entry gives the category or sub-category name followed by its description:

- Nagging: Redirection of expected functionality that persists beyond one or more interactions. (No sub-categories.)

- Obstruction: Making a process more difficult than it needs to be, with the intent of dissuading certain action(s)
  - Roach Motel (Brignull): A situation that is easy to get into, but difficult to get out of
  - Price Comparison Prevention (Brignull): Making direct price comparisons between products difficult
  - Intermediate Currency: The use of a virtual currency

- Sneaking: Attempting to hide, disguise or delay the divulging of information that is relevant to the user
  - Forced Continuity (Brignull): Charging users after the service they have purchased expires
  - Hidden Costs (Brignull): Late disclosure of certain costs
  - Sneak into Basket (Brignull): Adding items not chosen by the user to their online shopping cart
  - Bait and Switch (Brignull): Making it appear as if a certain action will cause a certain result, only to have it cause a different, likely undesired result

- Interface Interference: Manipulation of the user interface that privileges certain actions over others
  - Hidden Information: Options or actions relevant to the user are not made immediately or readily accessible
  - Preselection: Any situation where an option is selected by default prior to user interaction
  - Aesthetic Manipulation: Any manipulation of the user interface that deals more directly with form than function; this refers to Brignull’s Misdirection. The researchers identified four specific forms of Aesthetic Manipulation:
    - Toying with Emotion: any use of language, style, colour or similar elements to evoke an emotion to persuade the user into a certain action
    - False Hierarchy: giving one or more options visual precedence over others
    - Disguised Ad (Brignull): ads disguised as different content
    - Trick Questions (Brignull): a question that appears to be one thing but is actually another, or uses confusing wording, double negatives or otherwise leading language

- Forced Action: Requiring the user to perform a certain action to access (or continue to access) certain functionality
  - Social Pyramid: Requiring users to recruit other users to use the service
  - Privacy Zuckering (Brignull): Tricking users into sharing more information about themselves than they intend to or would agree to
  - Gamification: Certain aspects of the service can only be ‘earned’ through repeated (and perhaps undesired) use of aspects of the service

Table 1. The Gray et al. (2018) taxonomy of dark patterns and its relationship to the Brignull typology82

B. Purpose-driven studies

Purpose-driven studies documenting dark patterns have so far investigated Privacy and Engagement dark patterns (tactics to get users to spend more time using a digital product), without reference to specific digital environments.

Bosch et al. created a taxonomy of Privacy dark patterns in 2016.83 Their taxonomy is based on Hoepman’s privacy design strategies;84 for every privacy strategy, they conceptualised corresponding Privacy dark patterns, and provided examples from popular online services. With regard to Engagement dark patterns, the first work investigating these practices was published in 2022. The researchers self-monitored their mobile and web interactions with Facebook and YouTube, and observed five Engagement dark patterns in the course of this experience.85 In a follow-up study, based on a systematic literature review of 43 HCI publications, the same research team proposed a taxonomy of 11 Engagement dark patterns, which they refer to as ‘Attention Design Capture Patterns’.86

C. Platform-driven studies

As the chapter introduction notes, dark patterns have been studied in the context of various digital platforms – video games, privacy notices, shopping websites, mobile apps, social media and video-streaming platforms – and researchers have also compared the use of dark patterns across platforms.

i. Video games

The first platform-driven inquiry concerned dark patterns in video games, with Zagal et al.’s study paving the way.87 The researchers grouped the practices they identified in several ‘contemporary games’ (the study was published in 2013, and at least some of the games have since become unavailable) into three broad categories according to their aims: making a player spend more or less time playing (Temporal), extracting money from players (Monetary) and exploiting a player’s social capital (Social Capital). The Social Capital category includes Brignull’s Friend Spam (a Privacy dark pattern) and Social Pyramid Schemes, an Engagement dark pattern that gets users to invite their social contacts to a game – both so that they feel obliged to continue playing themselves and to increase the user base. The practices the researchers categorised as Monetary dark patterns are what I refer to as Shopping dark patterns, and I will return to them in section 3.3.2.

More recently, researchers have started looking at dark patterns in children’s mobile games. Fitton and Read developed a framework based on Zagal et al.’s taxonomy for assessing the dark-design aspects of free-to-play children’s mobile games, which they supplemented with instances of age-inappropriate designs.88

ii. Privacy notices

Privacy dark patterns in GDPR-mandated privacy notices have received a lot of attention from scholars in recent years. Utz et al. manually inspected 1000 consent notices found on popular websites and found that the majority of the analysed banners used dark patterns and were likely not compliant with the GDPR.89 In the same vein, Nouwens et al. scraped and manually investigated the cookie banners provided by the top five Consent Management Platforms (CMPs) on 10,000 websites popular in the UK.90 They found that dark patterns were prevalent, and only 11.8% of the 680 successfully scraped websites complied with the minimum requirements under EU data protection law. A study by Soe et al. investigated the prevalence of dark patterns in cookie banners used by news websites.91 As this industry heavily relies on advertising in its business model, there is a high level of incentive to collect user data. The authors found that approximately 200 of the 300 Scandinavian- and English-language news websites analysed had at least one dark pattern from the Gray et al. (2018) typology. The researchers further refined the Gray et al. categories of dark patterns based on how they manifest in cookie notices. Kyi et al. analysed 474 websites that included ‘legitimate interest’ as a ground for personal data processing in their privacy notices, and found various Bosch et al. and Brignull dark patterns in most of the studied notices.92
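Measurement studies of this kind have to spot dark patterns in the markup of consent notices, often at scale. As a rough illustration (and not the pipeline used by any of the cited studies), the sketch below flags Preselection in an invented consent banner: optional purpose checkboxes that are already ticked before the user has interacted with the notice. The HTML and attribute names are hypothetical; real consent-management markup varies widely.

```python
# Illustrative sketch (not the cited studies' actual methodology) of flagging
# Preselection in a consent notice: purpose checkboxes that are pre-ticked
# before any user interaction. The HTML below is invented for the example.
from bs4 import BeautifulSoup  # pip install beautifulsoup4

consent_banner_html = """
<div class="consent-banner">
  <label><input type="checkbox" name="necessary" checked disabled> Strictly necessary</label>
  <label><input type="checkbox" name="analytics" checked> Analytics</label>
  <label><input type="checkbox" name="advertising" checked> Personalised advertising</label>
  <label><input type="checkbox" name="social"> Social media embeds</label>
</div>
"""

def preselected_purposes(html: str) -> list[str]:
    """Return the names of optional purposes whose checkboxes are pre-ticked."""
    soup = BeautifulSoup(html, "html.parser")
    flagged = []
    for box in soup.select('input[type="checkbox"][checked]'):
        if box.has_attr("disabled"):  # strictly necessary purposes are usually fixed
            continue
        flagged.append(box.get("name", "unnamed"))
    return flagged

print(preselected_purposes(consent_banner_html))  # ['analytics', 'advertising']
```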

iii. Shopping websites

Moser et al.’s study was the first to look at design practices on shopping websites.93 Their investigation focused on practices that may promote impulse buying, which they define as ‘unplanned buying with little deliberation’. They found 64 such practices by manually investigating 200 shopping websites. Their study does not, however, consider any dividing lines between practices that merely facilitate online shopping – such as withdrawal periods, which in some jurisdictions, like the EU, are mandatory, or online reviews, which are essential for building trust in a distance-contracting setting94 – and those that may be against the user’s interest. Nevertheless, it could serve as a good starting point for a discussion about how such features could affect particularly vulnerable users, e.g. compulsive shoppers, and the extent of the retailers’ responsibility with regard to such online customers.95

Mathur et al. (2019) performed the first – and, to date, the only – large-scale measurement of dark pattern prevalence on popular e-commerce websites.96 Their study is discussed in detail in section 3.3.2 below.

iv. Mobile apps

Di Geronimo et al.’s study was the first to investigate the prevalence of dark patterns in mobile apps.97 The researchers screen-recorded five minutes of interaction with 240 popular Android apps and manually annotated the dark patterns they encountered using Gray et al.’s (2018) taxonomy. They found that 95% of the studied apps had one or more dark patterns.

More recently, Radesky et al. investigated the use of dark patterns in mobile apps used by children.98 The researchers found four types of Engagement and Shopping dark patterns; only 20% of the apps had no dark patterns.

v. Social media platforms

Mathur et al. (2018)99 were the first to look at dark patterns on social media. They sampled approximately 500,000 YouTube videos and 2.1 million Pinterest pins to investigate the prevalence of undisclosed affiliate links (a subtype of Brignull’s Disguised Ads). Their findings revealed that only around 10% of affiliate marketing content on both platforms contains any disclosures.

Mildner and Savino zoomed in on Facebook’s use of Privacy dark patterns, and found two examples of Gray et al.’s (2018) Interface Interference.100

Schaffner et al. investigated dark patterns found in the social media account-deletion process.101 The researchers created and attempted to delete accounts on the mobile app, mobile browser and desktop browser versions of 20 popular US social media platforms, and documented the Brignull, Gray et al. (2018) and Bosch et al. dark patterns they found in this process. Their study shows that some platforms employ dark patterns in both their account-deletion user interfaces and their account-deletion policies regarding data retention.

Mildner et al. screen-recorded their use of the Facebook, Twitter, TikTok and Instagram mobile apps on iOS and Android devices, and analysed the recordings for the presence of dark patterns identified in prior studies.102 They found 41 different types of dark patterns on Facebook, 39 on Instagram, 35 on Twitter and 37 on TikTok. The researchers also described two new Engagement dark patterns, and proposed a new high-level category, Governing, which refers to dark patterns that are designed to control or govern user behaviour.

vi. Video-streaming platforms

Chaudhary et al. proposed a taxonomy of Engagement dark patterns that they encountered on four popular video-streaming platforms – Netflix, Amazon Prime Video, YouTube and Disney Plus.103

vii. Cross-platform comparisons

Gunawan et al. built upon Di Geronimo et al.’s study by comparing the use of dark patterns across the mobile app-, mobile browser- and web browser-based versions of 105 popular online services.104 Their study was motivated by the fact that these modalities have different affordances, capabilities and design norms. The codebook they used was based on the Di Geronimo et al. library of dark patterns, and supplemented with additional practices from Mathur et al.’s (2019) taxonomy of Shopping dark patterns, as well as a selection of Privacy dark patterns. During their preliminary examination of services, they found 12 new dark patterns, bringing the total number of practices they investigated to 50. They found that all of the services in their corpus included at least one type of dark pattern, and the majority included seven or more unique dark pattern types. Dark pattern usage frequently differed across the versions of a service, with apps containing the largest number of unique dark patterns – ones that, moreover, differed from those used on the websites.

D. Section summary

The use of a multitude of dark patterns relying on a variety of design vectors (visual, interactive, textual, linguistic) in many digital environments is flourishing. HCI researchers have shown that dark patterns may be used for three main commercial purposes: to induce users to share more personal data (Privacy), to spend more money (Shopping) and to spend more time using a service (Engagement). The last purpose, Engagement, reinforces the former two: users who spend more time using a service may share more personal data, see more (personalised) ads and spend more money.105 The types of dark patterns used will likely depend on the digital business model.106 For example, using Privacy and Engagement dark patterns may be profitable in the context of ‘free’ digital services, such as social networking. The link between various types of dark patterns and business models is under-explored in the literature to date.107 We have evidence, however, of the kinds of practices e-commerce websites, which my investigation focuses on, rely on, to which I now turn.

3.3.2. Shopping dark patterns

This study is concerned with UI design practices that aim to influence users’ purchasing decisions: i.e. the Shopping dark patterns that have been found by Mathur et al. (2019) on e-commerce websites.108

Mathur et al. (2019) performed the first – and, to date, the only – large-scale study of dark pattern prevalence on e-commerce websites. Analysing over 11,000 e-commerce websites, the researchers found a total of 15 types of dark pattern, split across seven broader categories. They achieved this by automating users’ primary interaction path with shopping websites – making a purchase – and collecting data in the process.109 To pinpoint dark patterns in this data, the researchers relied on prior literature on dark patterns and impulse buying, and on media coverage of high-pressure sales and marketing tactics.110
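As a rough illustration of what such textual analysis can involve, the snippet below extracts the visible text segments from a piece of product-page markup and flags phrases that resemble Urgency or Scarcity messages. This is a deliberately simplified sketch of my own: the keyword patterns and sample markup are invented, and it is not Mathur et al.’s actual crawling and classification pipeline.

```python
# Deliberately simplified sketch of the kind of textual analysis described above:
# pull the visible text segments from product-page markup and flag phrases that
# resemble Urgency or Scarcity messages. The keyword lists and sample markup are
# my own illustrative assumptions, not Mathur et al.'s actual pipeline.
import re
from bs4 import BeautifulSoup  # pip install beautifulsoup4

CANDIDATE_PHRASES = {
    "Urgency": re.compile(r"(offer ends|sale ends|only today|hurry|\d+\s*:\s*\d+\s*:\s*\d+)", re.I),
    "Scarcity": re.compile(r"(only \d+ left|low stock|selling fast|in high demand)", re.I),
}

def flag_segments(page_html: str) -> list[tuple[str, str]]:
    """Return (category, text segment) pairs for segments matching candidate phrases."""
    soup = BeautifulSoup(page_html, "html.parser")
    hits = []
    for segment in soup.stripped_strings:
        for category, pattern in CANDIDATE_PHRASES.items():
            if pattern.search(segment):
                hits.append((category, segment))
    return hits

# Example with invented markup:
sample = "<div><span>Only 3 left in stock</span><span>Sale ends in 02:14:09</span></div>"
print(flag_segments(sample))
```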

Table 2 describes the Shopping dark patterns the researchers found, explains how they relate to the Gray et al. (2018) and the original Brignull categories of dark patterns and delineates the functional attributes of these practices in order to facilitate conceptual consistency and add generalisability to my analysis.

Mathur et al. (2019) categories and their sub-types; for each sub-type, the corresponding Gray et al. (2018)/Brignull category (where one exists) and the functional attributes are given:

- Urgency: Imposing a deadline on a sale
  - Countdown Timers: Dynamic indicators of a deadline. Gray et al./Brignull: NA. Attributes: partially deceptive, partially covert
  - Limited-time Messages: Static urgency messages not indicating a deadline. Gray et al./Brignull: NA. Attributes: information-hiding, partially covert

- Scarcity: Signalling the limited availability of or high demand for a product
  - Low-stock Messages: Messages indicating limited product quantities. Gray et al./Brignull: NA. Attributes: partially covert, partially deceptive, partially information-hiding
  - High-demand Messages: Messages indicating high demand for a product. Gray et al./Brignull: NA. Attributes: partially covert

- Social Proof: Indicating other users’ activities and experiences shopping for products and items
  - Testimonials of Uncertain Origin: Customer testimonials whose sources are not disclosed or clearly specified. Gray et al./Brignull: NA. Attributes: partially deceptive
  - Activity Notifications: Static or dynamic, sometimes recurring messages about other users’ activity (e.g. viewing a product, adding it to their cart or purchasing it). Gray et al./Brignull: NA. Attributes: partially deceptive, partially covert

- Sneaking: Misrepresenting user actions or hiding/delaying relevant information
  - Sneak into Basket: Adding additional products to users’ shopping carts without their consent. Gray et al./Brignull: Sneak into Basket. Attributes: information-hiding, partially deceptive
  - Hidden Costs: Late disclosure of certain costs. Gray et al./Brignull: Hidden Costs. Attributes: information-hiding, partially deceptive
  - Hidden Subscription: Charging users a recurring fee under the pretence of a one-time fee or a free trial. Gray et al./Brignull: Forced Continuity. Attributes: information-hiding, partially deceptive

- Obstruction: Making a certain course of action hard to pursue
  - Hard to Cancel: Making cancelling a service harder than signing up for it. Gray et al./Brignull: Roach Motel. Attributes: restrictive, partially information-hiding

- Misdirection: Using visuals, language and emotion to steer users towards or away from a particular choice
  - Pressured Selling: Pre-selecting or pressuring the user to accept more expensive variations of a product and related products. Gray et al./Brignull: overlaps with Gray et al.’s Aesthetic Manipulation and Brignull’s Misdirection. Attributes: partially asymmetric, partially covert
  - Confirmshaming: Using language and emotion to steer users away from a certain choice. Gray et al./Brignull: overlaps with Gray et al.’s Toying with Emotions, a broader category of linguistic and visual elements; Brignull’s taxonomy includes Confirmshaming. Attributes: asymmetric

Table 2. Mathur et al. Shopping dark patterns and their attributes, and their relationship to the Brignull and Gray et al. (2018) taxonomies

Several dark patterns that Mathur et al. (2019) found are excluded from both the overview in Table 2 and my analysis in future chapters. Visual Interference is a residual category in the Mathur et al. (2019) taxonomy, and is therefore excluded from my analysis. The researchers did not set out to include style information in their analysis, but captured it during the textual analysis of the other dark patterns they found. Instead of focusing on the handful of instances of Visual Interference affecting other dark patterns that Mathur et al. (2019) identified, my analysis will account for the fact that dark patterns may generally also exploit visual user interface design vectors (e.g. colour, dynamicity, size, positioning). I completely exclude Forced Enrollment and Trick Questions from my analysis because Mathur et al. (2019) found that the main purpose of these practices (on the analysed websites) was to facilitate the collection of personal data, whereas I am concerned with Shopping dark patterns. For now, it suffices to say that e-commerce websites may use other kinds of dark patterns, which may lead to different kinds of harms; while these definitely merit attention from researchers, they are beyond the scope of my investigation.

Further, there may be other types of Shopping dark patterns on e-commerce websites. To find dark patterns on e-commerce websites, Mathur et al. (2019) automated the primary interaction path of websites (making a product purchase), and extracted and analysed textual interface elements present in this path. This approach has some limitations. First, they were only able to look at textual information, whereas dark patterns may use non-textual features. Second, they only crawled product and checkout pages, whereas dark patterns may also be present on other types of pages (e.g. home pages, product search result pages). Third, the researchers did not make any purchases, whereas some dark patterns may be used post-purchase. Accordingly, their findings represent a lower bound on the total number of dark patterns on the analysed websites.111

While the scope of my in-depth analysis in future chapters is relatively narrow, in light of its focus on the Shopping dark patterns found by Mathur et al. (2019) on e-commerce websites, I compensate for this by referring to dark pattern attributes, and formulating a relatively technologically neutral theoretical framework in Chapter 4, which could be applied to other types of dark patterns. Further, Shopping dark patterns may be found on other kinds of platforms. As Rieger and Sinders note, dark patterns that target users’ purchasing decisions can be found in any digital environment where purchases are made or contracts are concluded.112 This includes both accommodation-booking and airline websites. Both of these sectors are characterised by obscure dynamic pricing strategies, which have led to numerous urban myths about what the best time or strategy for booking a service is (for example, the belief that Tuesday is the best day to book a flight).113 Given consumers’ concern with scoring the best price for a hotel room or flight, as well as the limited supply of these products, it makes sense for such websites to implement dark patterns playing on the user’s fear of missing out or hiding the real cost of the service.114 Indeed, the Ryanair website was the one that inspired Brignull to dig deeper into manipulative design practices.115 Other sectors in which Shopping dark patterns have been noted to occur are news websites (which may make it difficult to cancel subscriptions) and domain-name registrars.116 Further, as noted in the previous sub-section, video-game designers have devised their own Shopping dark patterns. My conclusions with regard to Shopping dark patterns may therefore have relevance outside of the e-commerce sector.

3.4. User reactions to Shopping dark patterns

While it is not unreasonable to assume that companies deploy dark patterns because they work,117 we now also have some incipient evidence of this being the case.
Luguri and Strahilevitz conducted the first large-scale controlled experiments assessing the effectiveness of Shopping dark patterns.118 Their first experiment119 consisted of telling the participants of an online survey on privacy preferences that they had been signed up for an identity-theft protection plan. The experimental manipulation varied the types of acts that consumers needed to carry out in order to accept/decline the plan – some participants were exposed to ‘mild’ dark patterns, others to ‘aggressive’ patterns and the control group to none. When mild dark patterns were deployed, the acceptance rate more than doubled, and the aggressive dark pattern condition nearly quadrupled the acceptance rate. They also found that variations in the price of the product had no significant effects on the rates of acceptance, and that the participants in the aggressive condition were significantly more likely to express anger. The researchers also conducted a second experiment120 using the same setup, in order to isolate the effect of different dark patterns. The participants were exposed to different 'content' (offer framing) and 'form' (how it could be accepted/declined) conditions; 50% of the participants were assigned to a Trick Question asking them to confirm their decision to purchase or not purchase the service. They found that the most effective dark pattern strategies were Hidden Information, Obstruction, Trick Questions and Activity Notifications; Confirmshaming and Preselection were also found to be effective. Countdown Timers and Pressured Selling did not significantly increase purchases. These findings are consistent with those of a recent study commissioned by the European Commission, which found that Hidden Information and Toying with Emotion both significantly increased the likelihood of a consumer’s not acting in accordance with their own stated preferences.121

Participants in Luguri and Strahilevitz’s second experiment were also asked about their mood and exposed to different price variations. Participants exposed to the most effective content condition – Hidden Information – were in a significantly better mood than those in the control group, which, in the researchers’ view, might suggest that consumers misunderstood the terms of the contract they had entered into. As in the first experiment, price variations did not significantly alter consumers’ decisions. Overall, based on their findings, the researchers conclude that ‘dark patterns are strikingly effective in getting consumers to do what they would not do when confronted with more neutral user interfaces’.122
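To show how effectiveness claims of this sort are typically established (without reproducing any of the study’s actual data), the sketch below runs a standard two-proportion z-test comparing acceptance rates between a control arm and a dark pattern arm; the counts are invented placeholders chosen only so that the example runs, not figures from Luguri and Strahilevitz’s experiments.

```python
# Illustrative sketch: testing whether acceptance differs between a neutral interface
# and a dark pattern condition. The counts are invented placeholders, not data from
# Luguri and Strahilevitz's experiments.
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(accepted_a: int, n_a: int, accepted_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided pooled z-test for a difference between two acceptance rates."""
    p_a, p_b = accepted_a / n_a, accepted_b / n_b
    pooled = (accepted_a + accepted_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical arms of 500 participants each; acceptance roughly doubles under the dark pattern.
z, p = two_proportion_ztest(accepted_a=26, n_a=500, accepted_b=55, n_b=500)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```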

Another recent study by Sin et al. set out to investigate the effectiveness of Social Proof (Testimonials) and Scarcity (Low-stock/High-demand Messages) dark patterns in a hypothetical single-product online shopping scenario.123 The researchers showed participants variations of the product page containing these practices, and asked them to rate their urge to purchase the product. They found that all the tested dark patterns increased purchase impulsivity, albeit with small effect sizes. It should be noted, however, that the dark patterns literature does not take issue with Testimonials per se, but rather with Testimonials of Uncertain Origin, i.e. those for which a source cannot be identified (see Table 2 above). The authors use a review that is marked as a ‘Verified purchase’ on Amazon in one of their treatment conditions. Their findings in this regard are therefore better read as attesting to the effectiveness of positive customer reviews – a commercial practice that is not typically viewed as problematic – rather than that of a dark pattern.

Motivated by Luguri and Strahilevitz’s findings with regard to their participants’ mood, Di Geronimo et al. hypothesised that users may not be aware of dark patterns (they term this ‘dark pattern blindness’), especially in mild cases.124 To study this, they set up an online survey in which participants watched videos of app usage that either did or did not involve dark patterns.125 The participants were asked whether they noticed any instances of malicious design, which they defined as ‘user interfaces crafted to trick the users in [sic] doing things they do not want to do, or try to manipulate the user in some way’. If the users answered positively or were not sure, they were also asked whether the dark patterns present in the app were the instances of malicious design they had noted. Their results led Di Geronimo et al. to conclude that users are generally not aware of and cannot correctly recognise dark patterns. A 2022 study by Keleher et al. similarly found that end users often do not recognise dark patterns, even after being presented with a definition of them.126 A study by Bongard-Blanchy et al. investigated whether users’ awareness of dark patterns renders them less likely to be influenced by them, using examples of Shopping, Privacy and Engagement dark patterns. They found that awareness had no significant effect on the (self-reported) likelihood that users would ‘fall’ for the dark patterns.127 The findings of studies that rely on self-reported measures, such as Bongard-Blanchy et al.’s, may overestimate users’ ability to recognise dark patterns. Keleher et al. also conducted online surveys with experts to investigate how well experts understand end users’ perceptions of dark patterns. They found that users’ descriptions of dark patterns were significantly more positive than experts assumed.128

Dark pattern user studies have some other limitations. In the context of their study of users’ account-deletion experiences and perceptions, Schaffner et al. highlight that there may be ethical reasons not to expose users to some practices.129 Further, the researchers point out that some real-life scenarios are harder to reproduce in an experimental setting than others (with regard to account deletion, they note that participants’ reactions may not be authentic when their actual accounts are not at stake).130 They also note that even ‘gold standard’ experiments, like those conducted by Luguri and Strahilevitz, have their limitations in terms of their ability to replicate real-life user behaviour.131 As Luguri and Strahilevitz themselves point out, the participants in their study were told that they had already been enrolled for a fictitious identity-theft protection service, and that they would not be charged for the first six months; their results therefore likely overestimate demand for a service like the one tested.132 Further, it is well known that not all people are always affected by the same biases to the same extent.133 Therefore, context matters when it comes to interpreting the results of a behavioural study. For example, the effect of some dark patterns may be product- and context-specific. Luguri and Strahilevitz found that the effect of Countdown Timers (which attempt to exploit the scarcity bias) was negligible in the context of their fake identity-theft protection plan.134 Other studies show, however, that Scarcity dark patterns may be effective with regard to products that have a limited supply (e.g. hotel rooms in a particular city for a particular date) and when businesses use generous cancellation policies.135 Further, individual user characteristics may make some users more susceptible to dark patterns. Some of the studies discussed in this section have investigated how the effect of dark patterns, or users’ awareness thereof, may vary across different consumer categories. Luguri and Strahilevitz found that users with a lower education level were significantly more susceptible to mild dark patterns.136 Di Geronimo et al. did not find any significant differences in the ability to detect dark patterns amongst users of different ages, education levels (although they note that their participants had a relatively high education level, with 23% holding doctoral degrees) or employment statuses.137 Bongard-Blanchy et al. found that younger users and users who had completed at least the first stage of university-level education were more likely to detect dark patterns, yet just as likely to be influenced by the design as older or less educated users.138 The behavioural study commissioned by the Commission found that older participants and those with lower education levels were more likely to be affected by dark patterns.139 Something that is more difficult for a user study to gauge is situational vulnerability – (some) people may (temporarily) be (more) vulnerable in particular contexts or situations. For example, studies have found that life events like divorce can provoke a ‘scarcity mindset’, i.e. the experience of insufficient resources such as time, money or food, leaving consumers temporarily more vulnerable to practices that exploit the scarcity bias.140 The Commission’s behavioural study placed half of the participants under time pressure and the others in a state of motivated delay in order to assess the effects of situational vulnerability on users’ choices; (situationally) vulnerable consumers were found to be more likely to make inconsistent choices when exposed to dark patterns.141

To sum up, here is what we know so far. Some Shopping dark patterns are highly effective in influencing user decisions; there is some evidence suggesting that users are not able to flag the use of dark patterns, and that even when they can, they could still be influenced by UI design. Further, some users, such as older or less educated ones, may be more likely to be affected by dark patterns and less able to detect them, and users may also be temporarily more susceptible to dark patterns under certain circumstances. I discuss what these insights mean for consumer policy in the following chapters.

3.5. Summary and next steps

Prior studies show that the dark side of the digital economy is thriving and growing. Dark patterns are being used across a wide array of digital services for various purposes. They can also influence user behaviour. It may be time for regulators to step in.

This chapter has already hinted at several problems that regulators might encounter when trying to regulate the use of dark patterns. As section 3.2 shows, dark patterns are both conceptually and terminologically fluid. I address this concern by adopting a pragmatic and narrow approach in my investigation: for the purposes of this study, ‘dark patterns’ are practices that HCI researchers have termed as such in prior studies, and the dark patterns I analyse in depth are the Shopping dark patterns found by Mathur et al. (2019) on e-commerce websites, discussed in section 3.3.2. The question of how to define dark patterns for the purpose of regulating their use is still open, however. Further, as section 3.3 shows, a large, varied and constantly evolving range of dark patterns are being used on digital platforms, including on e-commerce websites. Attempts to regulate dark patterns may need to account for both the breadth and the evolving nature of this phenomenon. Lastly, as noted in section 3.2, the discussion of dark patterns in HCI literature has been largely disconnected from normative discussions. While Mathur et al. (2021) have articulated several normative dimensions that HCI researchers could engage with in studying dark patterns to ensure that policymakers can act on their insights,142 their contribution approached this topic from a general perspective; it did not connect it to specific goals that policymakers, in particular those in the field of consumer protection, may seek to achieve, nor did it provide any directions for policy development. To fill this gap, the next chapter turns to regulation theory to discuss the normative concerns dark patterns may give rise to from the perspective of consumer policy, and to map ways in which regulators may address the difficulties associated with regulating these socio-technical artefacts.
