
Chapter 1. Introduction


Consumers, by definition, include us all.

John F. Kennedy (1962)

1.1. Context: Market manipulation, then and now

Market manipulation1 is as old as markets themselves. The Romans knew that far from all of the Falernum wine sold by merchants and taverners was authentic – sometime in the first century AD, Pliny the Elder complained that not even the nobility were able to enjoy genuine wines any more.2 Where swindlers go, the law typically follows. Legal responses to manipulation in the market can be found even in the oldest recorded codifications of laws; the Code of Hammurabi (despite its flaws) prohibited tavern-keepers who did not accept payment in corn from overcharging customers for drinks in money (shekels), upon penalty of death by drowning.3 Again, the Code was not perfect, but it does serve to illustrate two points. First, if it is profitable to charge customers in shekels rather than the weight of a beer glass in corn due to the room for price distortion – or to, say, sell snake oil as a miraculous cure-all medicine4 – there will be market participants who will jump at the opportunity.5 Second, where and how the law draws the line between acceptable and unacceptable practices deployed by market participants is a hallmark of the times.

Naturally, in Hammurabi’s time (c. 1792–1750 BC), and even in the later times of the Roman Empire for that matter, markets were relatively small, local and on-premises. Today, the widespread use of the internet and connected personal devices like smartphones and computers means that businesses and consumers from across the globe can establish connections with each other within seconds. It also entails that many business-to-consumer transactions are mediated by technology. According to Eurostat data, 68% of consumers in the EU had shopped online in 2019, the year I started working on this book.6 Post-COVID-19 restrictions, this number had risen to 75% by 2022 across the EU, and was as high as 92% in the Netherlands.7

While technological progress has made shopping a lot more convenient, it has also enabled market manipulation to reach new heights. The digital age has brought with it a wide host of tools that can be used to influence consumers’ decisions on an unprecedented scale. Advances in computational data analysis entail that some businesses are able to engage in A/B testing,8 running thousands of consumers through user interfaces that are identical except for one feature,9 for the purpose of optimising the user experience. However, this process also enables them to establish which design choices may coerce, deceive or steer a consumer into making a purchase, thus maximising revenue. These user interface design choices are called dark patterns.10

Computer scientists specialising in Human–Computer Interaction (HCI) have been aware of traders’ use of dark patterns since 2010,11 and began to propose dark pattern taxonomies in the years since.12 With the release of Mathur et al.’s 2019 study13 measuring the prevalence of dark patterns, it also became clearer how widespread the use of dark patterns is, at least on shopping websites: using semi-automated measurement techniques, the researchers found ~1800 instances of dark patterns representing 15 types and seven broader categories on more than 11% of the ~11,000 most popular shopping websites worldwide. As the researchers themselves point out,14 these numbers represent a lower bound on the total number of dark patterns on e-commerce websites in light of the measurement techniques used.15 Indeed, subsequent studies paint a much more worrying picture. A 2022 study commissioned by the European Commission found that 97% of the most popular websites and apps in the EU used at least one dark pattern.16 A 2023 sweep of nearly 400 shopping websites found dark patterns on approximately 40% of the screened websites.17 While it would not be unreasonable to assume that businesses implement dark patterns because they work, recent user studies have produced evidence that some dark patterns are indeed effective at swaying users’ decisions in directions that benefit a business’s bottom line.18 There is also some evidence pointing to the magnitude of the consumer harms resulting from the use of dark patterns. For instance, the use of dark patterns to keep consumers trapped in auto-renewing online subscriptions they no longer want is estimated to cost consumers millions19 – and according to other estimates, billions – of euros on a yearly basis.20

Dark patterns are therefore a hallmark of today’s digital markets and there may be reasons for us to be concerned about their proliferation. But do they merit the attention of the legal system, and if so, what kind of attention? These are, in broad terms, the questions this work tackles.

These questions do not arise in a legal vacuum, however. The EU already has a system of protection in place to guard consumers against exploitative conduct by traders. The Unfair Commercial Practices Directive (UCPD)21 is a horizontal, maximum harmonisation, technologically neutral instrument that prohibits unfair commercial practices. The Consumer Rights Directive (CRD)22 imposes information disclosure requirements on online and offline traders alike. The proliferation of dark patterns in digital markets is placing this system of protection under pressure, however. The challenges posed by dark patterns concern where the law draws a line between acceptable and unacceptable B2C commercial practices in digital markets, as well as how it draws that line and the mechanisms that ensure that the line is not crossed. In the following sub-sections, I elaborate on these challenges.

1.1.1. Level of online consumer protection

When it comes to ensuring that markets work for consumers, legal measures can be roughly divided into those that empower consumers, i.e. put them in a better position to act in their own interest, and those that protect them, e.g. by prohibiting a certain commercial practice when a consumer is on its receiving end.

The EU consumer acquis relies on consumer empowerment measures to a great extent.23 This means that the legal instruments in question protect consumers who are generally able to protect themselves, but who sometimes need a bit of an extra boost in order to do so, mostly in the form of access to truthful, transparent and comprehensive information.24 These are so-called ‘average consumers’. Average consumers are able to absorb the information they are offered, and are able to see through the motives of commercial practices and, to some degree, withstand their pressure.25 The legal framework does envisage some lenience for particularly vulnerable consumers – who are defined based on certain personal characteristics, such as age26 – but overall, as Micklitz puts it, ‘the average consumer is the measure of all things’.27 These two corollaries – information disclosure and a consumer who is able to process this information – form the core of the so-called ‘information paradigm’, which is the protective logic underpinning much of the EU consumer acquis.28

The average consumer of EU consumer law bears a striking resemblance to the homo economicus of neoclassical economics – a perfectly rational market agent.29 The image of the homo economicus, and by extension that of the average consumer, has been repeatedly tarnished by the findings of behavioural economists over the past decades.30 It turns out that we are nowhere near as rational as the law and traditional economics assume us to be. Our rationality is bounded, at best;31 that is, we are not incapable of rational decision-making, but we are also systematically affected by behavioural biases.32 In the wrong hands, this is lucrative knowledge. While some dark patterns operate on an informational dimension, being reminiscent of the snake-oil-selling and shekel-reaping practices of the past, others capitalise on the exploitation of behavioural biases.

Admittedly, marketeers have long been aware of behavioural economics’ insights about us being predictably irrational, and have been incorporating this wealth of knowledge into their selling tactics33 – this is the reason why IKEA stores are a maze,34 and why supermarkets place pricier wines at eye level.35 As Hanson and Kysar put it in 1999, ‘once one accepts that individuals systematically behave in nonrational ways it follows from an economic perspective that others will exploit those tendencies for gain’.36 Whereas the novelty of behavioural exploitation may be questioned, its current scale is arguably unprecedented. The widespread use of dark patterns could accordingly suggest that it is high time to redraw the line between acceptable and unacceptable market conduct, and perhaps loosen the expectations that the law has of (average) consumers. Until fairly recently, the Commission resisted calls from academia and civil society organisations to review the current level of protection afforded to consumers in digital markets and otherwise. The 2017 Fitness Check of consumer and marketing law concluded that consumer law was, by and large, fit for purpose, and that the main source of harm for consumers stemmed from the under-enforcement of current laws.37 The Fitness Check led to the adoption of the 2019 Omnibus Directive,38 which, at a substantive level, introduced some provisions enhancing the transparency of online marketplaces and online reviews, but did not bring about any major overhaul of the logic underlying the system of protection or of its core provisions. Ten days before the Omnibus Directive became applicable in May 2022, the Commission initiated a Fitness Check on Digital Fairness, which purports to assess whether ‘action is needed to ensure an equal level of fairness online and offline’.39 At the time of completion of this study (April 2024), the Fitness Check is still ongoing, and, for the first time since the adoption of the UCPD, there appears to be a real prospect of line-redrawing.

1.1.2. Regulatory design

Redrawing the line prompts the question: what shape should the line take, or, in other words, how can we design effective regulation? Regulating behavioural phenomena effectively is a notoriously hard and uncertain task.40 Dark patterns are not just behavioural phenomena, however; they are also a socio-technical artefact – that is, they may result from the use of technology in the process of user experience optimisation. If regulating behavioural phenomena well is hard, the effective regulation of behavioural phenomena that take the form of socio-technical artefacts is bound to be even harder. One of the chief questions in this regard is whether regulation ought to be technology neutral or technology specific. Technology-neutral regulation is regulation that does not engage with a particular technology or its design and uses, and instead applies across the (analogue and digital) board. Technology neutrality in technology regulation is a long-standing and largely unchallenged mantra in policy41 and academic circles.42

Technology neutrality also won the battle when it came to the design of EU consumer protection rules. While e-commerce was nowhere near as popular when the UCPD was adopted in 2005 as it is nowadays, it was nevertheless gaining traction; yet the UCPD was set up as a technology-neutral instrument. The CRD, which became applicable in 2014, when e-commerce was neither new nor uncommon, contains some provisions that are specific to digital environments, but it still leaves traders a lot of wiggle room when it comes to the technical translation of these rules into the design of consumer-facing digital products such as user interfaces. At the same time, as the next sub-section discusses in detail, the rates of non-compliance in digital markets have historically been high, and remain so. While technology neutrality won the legislative battle, the war to ensure the effectiveness of consumer rights in digital markets is still ongoing. It cannot be ruled out that technology neutrality is a contributing or driving factor of non-compliance. Accordingly, it is high time to explore, on the occasion of line-redrawing in digital markets, whether the line ought to assume a more technology-specific shape.

1.1.3. Digital enforcement

Whichever regulations are adopted and whatever they look like, ultimately consumer rights are only as effective as their enforcement.

Public institutions face considerable challenges in addressing consumer harms in digital markets arising from the use of dark patterns and beyond. With very scarce resources, they need to oversee a dynamic and vast landscape of potential harm, where the traditional voices of complaint – consumers themselves – are rendered largely ineffective in raising the alarm against abusive business practices. Against this background, the levels of compliance with consumer law instruments in digital markets have historically been low. As early as 2011, the Commission stated: ‘you need only browse the internet for a short while to see that compliance with transparency and consumer information requirements is insufficient in relation to the placing of orders’.43 The 2014 impact assessment conducted by the Commission prior to the adoption of the new Consumer Protection Cooperation (CPC) Regulation44 showed that 37% of websites did not respect consumer rights, based on a conservative estimate.45 Subsequent website compliance checks by the CPC network have revealed that online consumer law violations continue on a rampant scale: over 50% of investigated websites were found to deploy non-compliant practices in 2019 and 2020.46 These figures strongly suggest that current enforcement efforts in the digital space are suboptimal.47

Naturally, in order to enforce the law on digital markets, authorities need first to be able to detect infringements. Market-monitoring efforts in the digital space typically take the shape of yearly sweeps, i.e. covert, coordinated and simultaneous compliance checks on consumer markets.48 On a practical level, sweeps typically entail several national consumer authority representatives manually combing through several hundred e-commerce websites in search of certain commercial practices at the same time. Sweeps are typically followed by targeted enforcement actions, and, according to the Commission, can lead to a significant improvement in compliance rates – from 20–40% at the screening phase to above 80% after a year of enforcement actions.49 At the same time, the potential of manual sweeps to combat breaches of consumer law on digital platforms is severely limited, as they are time- and labour-intensive, and therefore can only cast light on the compliance landscape amongst a small number of online traders at a particular point in time. Some authors warn that the limited policing of digital markets may, in time, ‘lead to the lack of credible deterrence’,50 and others argue that it has already ‘allowed a sense of impunity in e-commerce’.51 Meanwhile, web measurement studies that rely on web scraping techniques to uncover digital user harms can include from several thousand52 to over 100,000 websites in their analysis,53 and allow for regular and continuous market monitoring. Leveraging computational methods like these in the public interest could further the effective enforcement of the consumer protection acquis in a digital setting by opening up a significantly larger portion of the market to scrutiny. Technology does not have to serve only the interests of businesses; it could also be used to fortify consumers’ institutional defences against unfair commercial uses of technology by automating the detection of infringements.54 Acknowledging this, in late 2022 the Commission launched the EU e-Lab, a project dedicated to ‘developing and applying advanced digital technologies to online consumer investigations in support of the Consumer Protection Cooperation (CPC) network’.55 It is therefore both worthwhile and timely to explore what the interaction of the substantive legal framework and the current state of technology means for the task of automatically detecting dark patterns on e-commerce websites.
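To give a flavour of what automated market monitoring might involve in practice, the sketch below fetches a single page and flags text snippets that resemble common Shopping dark patterns (low-stock, countdown and high-demand messages). The URL, the regular expressions and the detection logic are illustrative assumptions on my part, not a description of the Commission’s e-Lab tooling or of any existing study’s pipeline; real web measurement pipelines rely on crawlers that render JavaScript, operate at the scale of thousands of websites and apply far more robust detection methods.

```python
import re

import requests
from bs4 import BeautifulSoup

# Hypothetical sketch of an automated compliance check: fetch a product page
# and flag text that is suggestive of common Shopping dark patterns. The
# patterns, URL and thresholds below are invented for illustration only.

SUSPECT_PATTERNS = {
    "low-stock message": re.compile(r"only\s+\d+\s+left", re.IGNORECASE),
    "countdown message": re.compile(r"offer ends in|sale ends in", re.IGNORECASE),
    "high-demand message": re.compile(
        r"\d+\s+people are (?:viewing|looking at)", re.IGNORECASE
    ),
}

def flag_suspect_text(url: str) -> dict[str, list[str]]:
    """Return snippets of visible page text matching each suspect pattern."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    text = BeautifulSoup(response.text, "html.parser").get_text(" ", strip=True)
    findings: dict[str, list[str]] = {}
    for label, pattern in SUSPECT_PATTERNS.items():
        matches = pattern.findall(text)
        if matches:
            findings[label] = matches[:5]  # keep a few example snippets per pattern
    return findings

if __name__ == "__main__":
    # Placeholder URL; a real sweep would iterate over a list of shop domains.
    print(flag_suspect_text("https://example.com/product/123"))
```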

1.2. Research questions and aims

To recap, the use of dark patterns by online traders challenges the level of protection we have traditionally afforded to consumers, the technology-neutral way we design rules to further consumer interests and the way we tackle encroachments on consumer rights. The effectiveness of EU consumer protection measures in digital markets is in question. Against this background, the main question this book deals with is: how can we regulate dark patterns effectively within the realm of EU consumer protection law, considering both legal and technical solutions? Answering this overarching question requires addressing several sub-questions:

  1. Do dark patterns need to be regulated?

  2. How could dark patterns be regulated effectively? What is the optimal balance between technology neutrality and technology specificity in technology regulation?

  3. How does the current substantive legal framework apply to dark patterns, and is it effective at tackling them?

  4. How could the effectiveness of the substantive legal framework be improved?

  5. What lessons can be learnt from the interaction of the substantive legal framework and computational methods for the task of automatically detecting unlawful dark patterns?

In answering these questions, this study’s aim is threefold. First, in light of the ongoing Fitness Check, it seeks to find policy solutions to the flourishing of dark patterns in digital consumer markets, and on e-commerce websites56 specifically.

Policy is, however, only one part of the solution to any regulatory problem – compliance with policy needs to be monitored, and where non-compliance is detected, policy needs to be enforced. In digital markets, for monitoring efforts to reflect the potential scale of non-compliant practices, public authorities may want to deploy technical solutions. My second aim is therefore to offer some insights as to the feasibility of developing computational methods for the automated detection of unlawful dark patterns on e-commerce websites. This is a question both of technical capabilities and of the translation of legal goals into computational tasks.

Third, in the course of mapping these solutions, the book offers a view into a world where the regulation of socio-technical artefacts takes on a more technology-specific shape than it has to date, and so challenges the current policy and academic assumption in favour of technology neutrality. My ambition in this regard is not that we entirely abandon this regulatory approach, but that we open up the room to debate what degree of technology neutrality would be appropriate for different kinds of regulatory problems. E-commerce is turning 30 this year,57 and the European Commission is going back to the drawing board, so this is an opportune moment to reflect on what we could have done and could still do differently, and perhaps better.

1.3. Theoretical lens and key terms

This book is about effective technology regulation. In what follows, I elaborate on how I view ‘technology’, ‘regulation’ and ‘effectiveness’.

1.3.1. Technology

When it comes to the regulation of technology, law and technology scholars can approach the issue from two broad perspectives: technological exceptionalism and the theory of socio-technical change. These outlooks, according to Jones, shape ‘how sociotechnical legal problems are imagined and shaped and how they are answered’.58 Adherents of technological exceptionalism seek to identify the essential characteristics of a new technology that distinguish it from what we have seen before, justifying its regulation.59 There are several issues with technological exceptionalism when it is used as a theory of techno-regulation. First, its regulatory agenda is driven by newness; this could lead to a regulatory blind spot when the technology is not sufficiently new, as well as when it is not exceptional enough.60 It is also not entirely clear what it means for a technology to be exceptional – as Wu points out, exceptionalism ‘depends on what you might think it is an exception to’.61 Second, it focuses on the properties of technology qua technology, i.e. its capabilities.62 As Jones explains, technology is far more than that – it ‘includes the physical objects, know-how, personnel, organisations and systems, and political and economic power’.63 The narrow exceptionalist understanding of technology also means that exceptionalist analysis tends towards technological determinism: if the technology that concerns us is ‘new’, then societal changes, including changes to the law, must necessarily follow technology. On this reactive view of the law, it is doomed to play catch-up with technology, and constantly miss its mark due to the relative slowness of the legal system compared to the speed of technological innovation.64 This perspective also disregards the fact that technology rarely, if ever, emerges in a legal vacuum. The societal and, by extension, legal context in which a technology emerges or gains traction may have played a role in its invention, adoption or shaping.65 Crootof and Ard warn us that the exceptionalist approach ‘fosters siloed and potentially incomplete analyses, masks the repetitive nature of the underlying questions, and thereby results in the regular reinvention of the regulatory wheel’.66 Indeed, legal scholars have been (re)addressing the issues surrounding driverless cars since the 1980s.67 What is more, technological exceptionalism not only has shortcomings in prescriptive terms, but also may not actually hold in descriptive terms: in 2018, Jones juxtaposed ‘revolutionary’ technologies from various time periods with the legal frameworks of those times and showed that the latter were not necessarily influenced by the essential qualities of the former.68

An alternative way of looking at technology is offered by the proponents of the theory of socio-technical change, championed by Bennett Moses.69 Rather than focusing on what is new or exceptional about technology, scholars following this line of inquiry zoom in on how a particular technology is used, why, by whom and to what effect. On this view, it is therefore not technology per se that needs regulation, but rather particular uses of technology that may lead to harm. The potential or presence of harms is assessed with reference to well-established rationales for intervention.70 The theory of socio-technical change also foresees a bidirectional relationship between law and technology; it acknowledges the fact that the legal context in which a particular socio-technical practice emerged may have played a role in its emergence or shaped it.71 As Maas argues, this perspective facilitates a more fruitful analysis of when and why new technologies or novel applications of old technologies require regulation.72 Throughout this book, following this outlook, I look at dark patterns as a socio-technical artefact – one of the potential outcomes of the user experience design process, which may be influenced by technical, organisational, economic and regulatory considerations. This means that I start my inquiry by first looking into the technological, organisational and economic factors that may lead to the development of dark patterns, and take these factors into account when I talk about their regulation and its effectiveness. That being said, in what follows, I may employ the term ‘technology regulation’ when referring to the regulation of socio-technical artefacts for the sake of brevity.

1.3.2. Regulation

Talking about regulation also requires me to clarify how I view it. There is no one universal way to think about this. Baldwin et al. outline three broad perspectives on regulation: as a specific set of commands, i.e. binding sets of rules applied by a body devoted to this purpose; as deliberate state influence, i.e. all state actions (command-and-control, as well as measures like economic incentives and communication) deliberately aimed at influencing behaviour; and as all forms, state-based or otherwise (e.g. markets), of social or economic influence, whether deliberate or not.73 Morgan and Yeung (rightly) point out that ‘[...] scholars have sought to classify regulatory instruments in many ways, none of which can claim pre-eminence’74 and that ‘no scheme of classification is watertight’.75 At the same time, hardly anyone will disagree with the statement that for lawyers (me included), command-based regimes are a paradigmatic example of regulation. In devising legal solutions in this study, command-and-control rules are what I am mostly concerned with. That being said, there is more to any control system than mere rule-promulgation. A system of control that is worth its salt, according to Hood et al., requires:

[...] some capacity for standard-setting, to allow a distinction to be made between more or less preferred states of the system. There must also be some capacity for information-gathering or monitoring to produce knowledge about current or changing states of the system. On top of that must be some capacity for behaviour-modification to change the state of the system.76

In other words, once we have legal policies in place, regulatees’ compliance with them needs to be monitored (information gathering) and enforced (behaviour modification). I adopt a less conventional perspective, looking beyond legal solutions, when it comes to my vision of how public authorities could exercise their market-monitoring function. More specifically, I argue that if public authorities turned to computational tools to monitor (non-)compliance in digital consumer markets, this might increase the effectiveness of their investigative efforts, and, by extension, lead to more effective enforcement of consumer rights.

1.3.3. Effectiveness

This brings me to the last term – ‘effectiveness’. If there is, amongst legal scholars, a more avidly debated term than ‘regulation’, ‘effectiveness’ must be it. In simple terms, effectiveness is about regulation being able to achieve some desired aims.77 Whether it actually does so is ultimately an empirical question, which can be answered using the kind of methods the Commission advertises in its (aptly named) Better Regulation Guidelines.78 My aims in this regard are more modest. I approach effectiveness through a theoretical (more on this below) and relative lens.79 As Brownsword explains, if we approach effectiveness as an all-or-nothing matter of complete control, most regulatory interventions will fail the test.80 Instead, we could treat effectiveness as a spectrum, and ask how effective an intervention is.81 On this view, a regulatory intervention could be judged to be more effective where it constitutes an improvement over the ex ante state of affairs.82 When discussing potential adjustments to the way the EU consumer law regime currently addresses dark patterns, I am therefore referring to a specific regulatory approach potentially being more effective than what we have been doing so far in achieving the desired aims.

1.4. Methodology and structure

As the previous sub-section explains, I view dark patterns as a socio-technical artefact. This prompts me to start the book with Chapter 2’s interdisciplinary investigation of the technical, organisational and economic considerations behind user interface design. This chapter also introduces some key terminology that a legal reader may find helpful when it comes to understanding the tech-savviest part of this book – Chapter 8, which deals with computational methods for market monitoring. For this (dual) purpose of contextualising dark patterns and explaining central technological terms, I draw on computer science literature and resources, specifically from the fields of web development and HCI, as well as marketing resources related to e-commerce.

In Chapter 3, I zoom in on dark patterns as discussed in HCI literature and identify several factors that may make attempts to regulate their use difficult. In recent years, HCI scholars have generated an impressive number of taxonomies of dark patterns. While this work is descriptively rich, some conceptual difficulties remain when trying to pin down what exactly dark patterns are. The chapter therefore starts by surveying definitions of dark patterns and defining them for the purpose of this study. Pinpointing a definition for regulatory purposes remains a difficult task, however. I then explain how dark patterns work from a behavioural sciences perspective. This understanding is necessary for discussing why dark patterns may be problematic from a consumer policy perspective. Next, I delve into the descriptive, taxonomical landscape of HCI scholarship, summarising the findings of prior studies that have documented dark patterns.83 This exercise allows me to illustrate the breadth and constant evolution of the dark patterns problem, which may pose issues for regulators, as well as to select the dark patterns I will focus on from this vast landscape of potential consumer harms. More specifically, I zoom in on the Shopping dark patterns84 found by Mathur et al. in their 2019 crawl of popular shopping websites.85 The added value of focusing on this particular taxonomy is twofold. First, Mathur et al.’s was the first, and, to this day, remains the only taxonomy of Shopping dark patterns that is based on the findings of a large-scale web measurement study, and it therefore reflects practices that are prevalent ‘in the wild’. Second, Mathur et al. have developed semi-automated methods for the detection of dark patterns on shopping websites, and their study therefore provides a strong foundation to build on when it comes to exploring whether the detection of unlawful dark patterns can be fully automated. I then survey behavioural studies that probe users’ reactions to Shopping dark patterns. This evidence helps me assess, in Chapter 4, whether regulators ought to intervene to tackle the use of dark patterns in consumer markets. Chapter 3 concludes with a reflection on the current state of the art regarding dark patterns, the issues this may pose for regulators and a discussion about where digital traders may take dark patterns next.

At this point I switch focus from technology and its artefacts to regulation. I start this exercise by constructing a theoretical framework in Chapter 4 that allows me to establish whether dark patterns should be regulated (sub-question 1) and how we could regulate their use effectively (sub-question 2). I then apply this framework in later chapters to gauge the effectiveness of the current EU consumer law regime as it applies to dark patterns, and to propose how lines could be redrawn. The starting point of my theoretical framework is the theory of socio-technical change. As the previous sub-section discusses, adherents of the theory of socio-technical change do not treat new technology as problematic in and of itself, but rather assess whether regulation is necessary in light of established regulatory rationales. There are two conceptual frameworks for incorporating behavioural insights into consumer policy – behavioural law and economics and autonomy theory. I assess the arguments for regulatory intervention from both of these perspectives; the question of whether consumer law is concerned with well-functioning markets or fairness is a long-standing and ongoing debate amongst consumer law scholars that will outlast this study and perhaps does not even need arbitrating, as both sides bring something valuable to the table. The chapter then explores what shape regulatory efforts around dark patterns could take. As Kaminski puts it, ‘regulatory design is a perennially central issue for law and technology’.86 One of the most important questions regulators have to answer in this regard is whether regulation ought to assume a more technology-neutral or technology-specific shape. The theory of socio-technical change does not operate on a prescriptive dimension, but the literature has produced rich explanations of why, and for what purposes, either strategy could work – or not – which I apply to the governance of consumer markets in the EU in the following chapters.

Chapter 5 describes the logic underpinning EU consumer law and how that logic translates into the protective provisions of the UCPD and CRD; it also discusses why dark patterns are challenging this logic. I chose to focus on these instruments due to their horizontal (i.e. cross-sectoral) and maximum harmonisation (i.e. leaving Member States little to no leeway to deviate in their national laws) character, as well as their applicability to commercial practices occurring at the pre-contractual stage, which is where consumers will encounter most Shopping dark patterns. While other EU law instruments may be relevant to curbing the use of dark patterns on e-commerce websites – such as the e-Commerce Directive (ECD),87 which applies to information society services, including services provided by website operators that allow consumers to buy products – the ECD is not part of the consumer acquis. Since I embarked on this project, a substantial body of EU digital acquis has emerged and some of these instruments deal explicitly with dark patterns: Art. 25 of the Digital Services Act (DSA)88 prohibits the use of dark patterns by online platform providers. However, the personal scope of the DSA is restricted to online platforms, whereas I deal with both online shops and online platforms (marketplaces), albeit only in their consumer-facing, selling (rather than intermediating) function.

In Chapter 6, I look at Mathur et al.’s dark patterns through the lens of these instruments. I therefore engage in doctrinal legal analysis to explore how the current legal framework applies to dark patterns and what bottlenecks arise in this process (sub-question 3). I use a variety of sources to facilitate this analysis – European case-law and national enforcement decisions and cases; preparatory texts, guidelines and studies conducted or commissioned by relevant public bodies (e.g. consumer authorities and organisations); and legal scholarship. At the end of the chapter, I also report on the level of protection afforded to consumers, as well as the interpretative issues that I encounter in this exercise. I further explore how these issues relate to the relative technological neutrality of the current legal framework,89 and how they may be linked to the low levels of compliance amongst digital market participants.

In Chapter 7, I envision what a more effective substantive legal framework could look like (sub-question 4). To chart the policy possibilities in this regard, I look at recent EU digital platform regulation (the DSA); recently proposed consumer law instruments (the proposed Distance Marketing of Consumer Financial Services Directive90 and the proposed Directive Empowering Consumers for the Green Transition);91 national solutions; and studies and policy documents produced by various actors (the Commission, national policymakers and enforcement authorities, consumer organisations, academics) to inform the ongoing Digital Fairness Fitness Check.92 I review the solutions proposed in these documents in light of the framework I developed in Chapter 4, and formulate some future directions for consumer policy that my normative analysis has led me to.

Chapter 8 then looks at what the current legal framework and technological state of the art mean for the task of automatically detecting unlawful dark patterns (sub-question 5). To be clear, this study does not purport to actually test the readiness of computational methods to automate infringement detection or develop tools to that effect. Instead, I engage in a qualitative analysis of technical feasibility, which is based on a survey of prior dark pattern detection studies and more general web measurement literature.93 Web measurement is a relatively new subfield of privacy and cybersecurity studies, which is concerned with observing websites and services at scale to detect, characterise and quantify web-based phenomena. In the course of assessing technical feasibility, I also probe the relationship between the way substantive policy is formulated (in more or less technology-specific terms) and the development of computational methods for detecting unlawful dark patterns. The chapter ends with a reflection on the role of technology specificity for both effective policymaking and market monitoring, followed by some recommendations for policymakers and enforcement authorities.

Chapter 9 summarises the conclusions of my analysis, reflects on the study’s contributions and limitations and puts forward some questions for future research.

1.5. Contribution

This book offers a novel way to think about the effectiveness of EU consumer law in digital markets by building bridges between consumer law scholarship and the literature on technology regulation in law and technology scholarship, which remains a relatively underexplored intersection amongst consumer law scholars.94 In doing so, it constructs a theoretical framework that could help establish whether the use of dark patterns ought to be regulated, and how that may be achieved to improve the level of protection afforded to consumers while maximising online traders’ compliance with consumer laws. It then uses this theoretical framework to offer a new analysis, informed by regulation theory, of the effectiveness of EU consumer law instruments in regulating dark patterns. While consumer law scholars have engaged with the behavioural dimension of the dark patterns problem,95 the regulatory issues (and solutions) that stem from the fact that dark patterns are a socio-technical artefact remain underexplored. Based on the lessons learnt from how we have regulated these socio-technical artefacts in the past, I offer some policy recommendations, which could inform the ongoing Digital Fairness Fitness Check.

However, whatever laws we may devise, compliance may not improve if infringements go undetected and subsequently unpunished. I therefore go beyond exploring just policy solutions and look – from a technical perspective – at whether and how web measurement methods could be used to monitor compliance with EU consumer laws on e-commerce websites. The theoretical positioning of this study also allows me to assess how regulatory design may affect the feasibility of developing computational tools for this purpose. In doing so, I show how technology specificity in policy may be a dimension of effective compliance monitoring, which is a contribution to the academic discourse about regulatory design in socio-technical environments.

1.6. Assumptions and limitations

Throughout my analysis I take it as a given that public authorities, be they policymakers or enforcement authorities, are well-intended and want to act in pursuit of the public interest. This stance, if applied across the whole body of civil servants and elected officials, is arguably a stretch. It is also possible, for example, to look at elected officials as predominantly interested in re-election and therefore susceptible to ideological capture by industrial lobby groups.96 The truth is likely somewhere in the middle, and possibly in the eye of the beholder – as Baldwin et al. posit: ‘different aspects of a regulatory development will resonate with different regulatory theories’.97 Taking the ‘blue pill’98 by assuming that regulators are benevolent is a compromise I deem acceptable in order to keep my investigation within manageable bounds. Further, we may still learn some valuable lessons from this imperfect depiction of a perfect world.

The scope of my study is also constrained in other ways. Dark patterns may be deployed on many platforms (e.g. social media, video games, mobile apps) for a variety of purposes: the maximisation of personal data collection (Privacy), as well as that of the time (Engagement) and money (Shopping) users spend on a platform. However, my investigation is concerned with Shopping dark patterns on e-commerce websites, by which I mean online stores – i.e. businesses that sell products directly to consumers – and online marketplaces in their capacity as consumer-facing sellers (rather than as intermediators of B2C transactions). My narrow focus does not, however, mean that my conclusions would not be applicable to other dark patterns, as they are underpinned by a relatively technologically neutral (pun intended) theoretical framework.

Perhaps a more fundamental constraint is that the space of regulatory solutions I explore is limited. As I explain above, I deliberately opt in favour of a state-centric and legally binding view of regulation, but it is possible to look at regulation in broader terms, as stemming from any source, and encompassing any attempt to influence another’s behaviour. On this view, there is a much broader scope for intervention: we could also look at what actors on the ground, such as individual technologists (user experience and interface designers, and web developers), professional associations and tech companies,99 as well as computing educators100 could do to combat the use of dark patterns. Even in state-centric terms, there is a broader array of policy solutions that could be considered. For example, regulators could target their interventions somewhere upstream of B2C interactions – they could tackle issues in the Software-as-a-Service landscape, where a lot of the elements of consumer-facing user interfaces are offered to traders as a service, or they could regulate technologists in the same way that other service providers who could cause harm while carrying out their work, such as doctors and lawyers, are regulated. Any and all of these solutions could lend a helping hand in tackling dark patterns, but they are not within my purview as a consumer lawyer.

Lastly, this is desktop research based on an interdisciplinary survey of the relevant behavioural, computer science, regulation-theory and legal literature and resources. My conclusions herein will therefore need to be tested in the real world, but what is technology regulation if not an experiment?101 Now, let us move on to how we may regulate dark patterns effectively in EU consumer law.
