Dark Patterns, Digital Dominance, and the Resurgence of Unfair Trading Conditions: A New Era for EU Competition Law?
The Revival of a Dormant Doctrine
The digital economy is a marvel of innovation, but it casts a long shadow. Beneath the sleek interfaces of apps and platforms lurk dark patterns and deceptive designs—manipulative tactics that exploit user behaviour for profit, often at the expense of autonomy, privacy, and trust. From subscription traps to privacy-invasive defaults, these practices are not mere design quirks but systemic tools of market power. A groundbreaking working paper by Professor Anne C. Witt, “Unfair trading conditions - the revival of a dormant concept of EU competition law,” published in May 2025, offers a rigorous framework for confronting these practices under Article 102(a) of the Treaty on the Functioning of the European Union (TFEU). This post synthesises Witt’s insights, categorises their implications, and applies her reasoning to dark patterns, with a deep dive into three pivotal cases: Autorité de la concurrence v Apple — App Tracking Transparency (2025), Bundeskartellamt v Meta — Excessive Data Collection (2019), and Autorité de la concurrence v Google — Neighbouring Rights (2020).
Article 102(a) TFEU prohibits dominant firms from imposing unfair prices or trading conditions. Historically, European competition agencies prioritised exclusionary abuses—practices that harm competitors—over exploitative ones, which directly harm consumers or business partners. Exploitative abuses, particularly non-price-based unfair trading conditions, were rarely pursued due to the scarcity of judicial precedent and the perceived vagueness of “fairness”. Witt’s paper documents a seismic shift: a resurgence of Article 102(a) enforcement targeting the exploitative practices of Big Tech in digital markets.
Recent cases underscore this revival:
Apple (2024): The European Commission fined Apple €1.8 billion for anti-steering provisions in its App Store. These provisions prevented developers from informing users about cheaper alternatives and were deemed unfair because of the disproportionate harm they caused consumers.
Meta (2024): The Commission found Meta’s use of user data to favour its Facebook Marketplace unfair, as it disadvantaged competing online classified ad services (OCAS).
National Cases (2019–2025): France’s Autorité de la concurrence and Germany’s Bundeskartellamt targeted Google, Apple, and Meta for practices ranging from data collection to contractual terms, leveraging Article 102(a) alongside other laws, such as the GDPR and press rights.
Witt argues that this revival reflects enforcement gaps, the complexity of exclusionary abuse analyses, and a broader reintegration of fairness as a goal of competition law, moving beyond the narrow consumer welfare paradigm of the “more economic approach.”
Witt’s Analytical Framework: The Proportionality Test
Despite apparent differences between the agencies’ approaches, Witt’s central thesis is that European competition agencies have converged on a proportionality test to assess unfair trading conditions. This test, articulated in the Commission’s Apple (music streaming) decision, evaluates whether a condition:
is imposed by a dominant firm;
causes detriment to trading partners or consumers; and
is unnecessary or disproportionate to the firm’s legitimate objective.
This framework echoes the United Brands test for excessive pricing, which deems a price unfair if it lacks a reasonable relation to a product’s economic value. Both tests hinge on proportionality, balancing the dominant firm’s aims against the harm inflicted. Witt reconciles the two by framing unfair conditions as practices that inflict disproportionate disadvantages, whether through pricing or non-pricing terms, such as data collection or interface design.
The proportionality test is intellectually robust but raises challenges:
Causality: Must agencies prove that the dominant position enabled the unfair condition? The Commission says no, but the French and German approaches suggest some link is required, creating uncertainty.
Harm: Should the analysis focus on harm to consumers, to business partners, or to both? Economic harm (e.g., financial loss) is straightforward to establish, but non-economic harm (e.g., privacy violations, democratic impacts) remains contested.
Legal Certainty: The subjective nature of “fairness” risks unpredictable enforcement, necessitating more precise guidance.
Categorising Witt’s Insights
Witt’s paper offers four conceptual pillars:
Doctrinal Reanimation:
Exploitative abuse, sidelined by the 2009 Guidance on Article 102, is re-emerging as a flexible tool to address digital market failures. Witt traces its roots to Continental Can (1973), where the European Court of Justice recognised that Article 102 targets direct consumer harm, not just competitive foreclosure.
Institutional Motivations:
Agencies are filling enforcement gaps in consumer protection, data protection, and press freedom, where other frameworks lack sufficient enforcement. The complexity of exclusionary abuse analyses, intensified by the more economic approach, also prompts agencies to adopt simpler, exploitative theories. Additionally, fairness is re-emerging as a legitimate aim, reflecting societal concerns about digital concentration and inequality.
Comparative Convergence:
The Commission, Bundeskartellamt, and Autorité de la concurrence apply variants of the proportionality test, aligning national and EU-level enforcement. This convergence suggests a pan-European standard for exploitative abuse, distinct from the Chicago School’s price-centric focus.
Legal Certainty and Policy Implications:
The proportionality test’s flexibility threatens predictability, but Witt argues it aligns with excessive pricing case law (Aspen, DSD). She advocates for Commission guidelines to define legitimate objectives and clarify harm, enhancing enforcement consistency and democratic legitimacy.
Dark Patterns: Exploitative Design as Unfair Trading Conditions
Dark patterns—interface designs that manipulate users into unintended choices—are pervasive in digital platforms. Examples include subscription traps (e.g., complex cancellation processes), privacy-invasive defaults (e.g., pre-ticked consent boxes), and misleading urgency (e.g., fake countdown timers). These practices exploit cognitive biases, extracting time, money, or data in ways users cannot easily avoid, especially on dominant platforms.
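To make the mechanism concrete, the following sketch shows how one canonical dark pattern, the privacy-invasive default, is implemented at the interface level. It is a minimal, hypothetical illustration rather than any platform’s actual code: the element names and wording are invented, and the only difference between the manipulative and the neutral design is the default state of a single checkbox.

```typescript
// Hypothetical sketch of a consent checkbox; all names and wording invented.
function buildConsentCheckbox(preTicked: boolean): HTMLInputElement {
  const box = document.createElement("input");
  box.type = "checkbox";
  box.id = "tracking-consent";
  // The entire dark pattern lives in this one assignment: when the default
  // is "checked", user inertia (never touching the box) is silently
  // converted into "agreement" to tracking.
  box.checked = preTicked;
  return box;
}

// Neutral design: consent requires an affirmative act by the user.
const neutralBox = buildConsentCheckbox(false);

// Privacy-invasive default: doing nothing already counts as consent.
const darkPatternBox = buildConsentCheckbox(true);

console.log(neutralBox.checked, darkPatternBox.checked); // false true
```

The asymmetry of effort is the point: opting out costs the user a deliberate action, while the platform’s preferred outcome costs nothing.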
Witt’s proportionality test offers a powerful lens for addressing dark patterns as unfair trading conditions (a schematic sketch of the test follows this list):
Imposed by Dominance: Due to their market power, dominant platforms, such as social media giants or app store operators, can dictate terms, leaving users with no viable alternatives.
Detriment: Dark patterns harm users through financial loss, privacy violations, or reduced autonomy, often without transparent justification.
Disproportionate: Manipulative designs often exceed what is necessary for legitimate aims (e.g., profit, service quality), as less coercive alternatives exist.
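Read together, the three limbs operate as a cumulative checklist. The sketch below is a conceptual illustration only, not a legal algorithm: all type, field, and function names are invented, and it simply encodes the cumulative structure of the test and applies it to a pre-ticked tracking default of the kind described above.

```typescript
// Conceptual encoding of the three-limb proportionality test; illustrative only.
interface TradingCondition {
  imposedByDominantFirm: boolean;     // limb 1: dominance enables imposition
  causesDetriment: boolean;           // limb 2: harm to users or trading partners
  necessaryAndProportionate: boolean; // limb 3: tailored to a legitimate objective
}

function isUnfair(c: TradingCondition): boolean {
  // A condition is unfair only when all three limbs align: it is imposed,
  // it is harmful, and it is not justified by a proportionate objective.
  return c.imposedByDominantFirm && c.causesDetriment && !c.necessaryAndProportionate;
}

// Example: a pre-ticked tracking default on a dominant platform.
const preTickedDefault: TradingCondition = {
  imposedByDominantFirm: true,        // no viable alternative platform
  causesDetriment: true,              // privacy loss, reduced autonomy
  necessaryAndProportionate: false,   // an unticked box would serve the same aim
};

console.log(isUnfair(preTickedDefault)); // true
```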
This framework shifts the focus from pricing to systemic exploitation, making competition law a potent tool against dark patterns where consumer protection laws fall short. To illustrate the applicability of Witt’s reasoning, I analyse three landmark cases, focusing on their treatment of unfair trading conditions and their implications for dark patterns.
Autorité de la concurrence v Apple — App Tracking Transparency (ATT) (2025):
Context: In March 2025, France’s Autorité de la concurrence found Apple’s App Tracking Transparency (ATT) framework unfair under Article 102(a). ATT requires apps to obtain user consent for tracking via a pop-up prompt, but the Autorité argued that Apple’s implementation favoured its own advertising services while imposing restrictive conditions on third-party advertisers. The prompt’s design and default settings allegedly nudged users toward denying tracking for competitors, reducing their ad revenue, while Apple’s own data collection faced fewer hurdles.
Proportionality Analysis: The Autorité applied a proportionality test, finding that ATT’s restrictions were imposed by virtue of Apple’s dominant position in iOS app distribution. The detriment was apparent: third-party advertisers faced reduced access to data, which harmed their competitiveness. While Apple claimed ATT protected user privacy—a legitimate aim—the Autorité deemed the framework disproportionate, as Apple’s own services benefited from less restrictive data access, undermining the privacy rationale. The prompt’s design, with its suggestive wording and default bias, resembled a dark pattern, manipulating user choice in Apple’s favour.
Dark Pattern Implications: ATT’s consent interface illustrates how dark patterns can manifest as unfair trading conditions. The nudge toward denying competitors’ tracking, embedded in a dominant platform’s architecture, exploits user inertia and distorts the competitive landscape. Witt’s framework validates this approach, as the Autorité balanced Apple’s privacy claims against the harm to advertisers, finding the design unnecessarily restrictive.
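To visualise the asymmetry the Autorité described, the sketch below contrasts two hypothetical consent flows. It is emphatically not Apple’s actual prompt, API, or wording; the function names and mechanics are invented solely to show how identical low-effort user behaviour can produce opposite outcomes for a platform and for its competitors.

```typescript
// Hypothetical asymmetric consent flows; names and mechanics are invented.
type ConsentOutcome = "granted" | "denied";

// Third-party flow: a blocking prompt whose most prominent option is refusal.
function thirdPartyTrackingConsent(tapped: "Allow" | "Do Not Track"): ConsentOutcome {
  // One negative tap forecloses the competitor's data access.
  return tapped === "Allow" ? "granted" : "denied";
}

// First-party flow: collection is enabled by default and stops only if the
// user locates and disables a toggle buried deep in the settings hierarchy.
function firstPartyPersonalisation(toggleDisabled: boolean): ConsentOutcome {
  return toggleDisabled ? "denied" : "granted";
}

// The asymmetry in one comparison: the path of least resistance denies the
// competitor but grants the platform.
console.log(thirdPartyTrackingConsent("Do Not Track")); // "denied"
console.log(firstPartyPersonalisation(false));          // "granted"
```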
Bundeskartellamt v Meta — Excessive Data Collection (2019):
Context: In February 2019, Germany’s Bundeskartellamt found that Meta (then Facebook) abused its dominant position in social networking by imposing excessive data collection terms on users. Meta collected data from third-party websites and apps and combined it with users’ Facebook profiles without valid consent, contrary to GDPR principles. The agency argued that these terms were unfair trading conditions, as users had no realistic alternative to Meta’s platform given its market dominance.
Proportionality Analysis: The Bundeskartellamt’s approach aligned with Witt’s proportionality test. Meta’s dominance enabled it to impose non-negotiable data terms (condition 1). The detriment was twofold: users lost privacy and autonomy, and competitors were disadvantaged by Meta’s data advantage (condition 2). While Meta argued that data collection was necessary for personalised services, the agency found it disproportionate, as GDPR offered less invasive alternatives (condition 3). The agency’s reliance on the GDPR as a benchmark for fairness was innovative, framing excessive data collection as a dark pattern that exploits user consent.
Dark Pattern Implications: Meta’s opaque consent mechanisms—buried in complex terms or presented as take-it-or-leave-it choices—function as dark patterns, coercing users into sharing their data. Witt’s analysis supports the Bundeskartellamt’s approach, as it balanced Meta’s service objectives against user harm, using GDPR to ground the proportionality assessment. This case highlights how competition law can address non-economic harms, such as privacy violations, thereby expanding the scope of dark pattern enforcement.
Autorité de la concurrence v Google — Neighbouring Rights (2020):
Context: In April 2020, the Autorité de la concurrence issued an interim measures decision against Google for imposing zero-price licences on French press publishers for the display of their content in search results. This followed France’s implementation of EU Directive 2019/790, which grants publishers “neighbouring rights” to negotiate compensation for the use of their content. Google’s refusal to negotiate and its unilateral imposition of zero-price terms were deemed unfair trading conditions under Article 102(a).
Proportionality Analysis: The Autorité found that Google’s dominance in search enabled it to dictate terms to publishers (condition 1). The detriment was significant: publishers lost revenue and bargaining power, undermining their ability to fund journalism and contribute to democratic discourse (condition 2). Google claimed its terms ensured free access to content, but the Autorité deemed them disproportionate, as negotiation was feasible without compromising Google’s service (condition 3). The agency emphasised non-economic harm, linking Google’s conduct to democratic values.
Dark Pattern Implications: While not a classic dark pattern, Google’s imposition of non-negotiable terms resembles a coercive design choice, forcing publishers into an unfair exchange for search visibility. Witt’s framework supports the Autorité’s focus on non-economic harm, balancing Google’s objectives against the systemic impact on publishers and democracy. This case illustrates how Article 102(a) can address exploitative terms beyond user interfaces, targeting the power of platforms in content ecosystems.
Why Competition Law Offers a Unique Avenue for Addressing Dark Patterns
Although Professor Witt does not engage directly with dark patterns, her analysis provides a valuable foundation for extending Article 102(a) TFEU to their regulation. Her central contention—that competition agencies have converged on a proportionality-based approach to unfair trading conditions—offers a credible legal test for scrutinising manipulative design practices imposed by dominant firms. This is not merely a doctrinal innovation but a necessary adaptation to the realities of platform power in the digital economy.
Dark patterns epitomise the kind of exploitative conduct that eludes correction by market forces. They are deliberately embedded within user interfaces by firms with sufficient market power to make those designs effectively unavoidable. These manipulative architectures—whether pre-ticked boxes, misleading urgency cues, or coercive consent flows—undermine user autonomy and extract economic or behavioural concessions that would not occur under conditions of genuine choice. In this respect, they constitute a textbook case of market failure: users cannot exit, meaningfully negotiate, or discipline these practices through consumer behaviour alone. Consumer protection laws, although relevant, are often reactive, fragmented across jurisdictions, and procedurally ill-suited to address these systemic manipulations at scale.
Competition law, particularly through the lens of Article 102(a), is uniquely positioned to address this regulatory gap. First, it brings robust investigatory powers and procedural instruments that most consumer authorities lack. As the Meta and Google cases illustrate, competition agencies can conduct detailed, factual, and economic assessments of how data terms or interface architectures harm competitors and users. Second, and perhaps more profoundly, EU competition law is undergoing a normative recalibration. The emergence of fairness and non-economic values—such as informational self-determination, media pluralism, and the integrity of democratic processes—as legitimate regulatory aims marks a shift away from the narrow consumer welfare paradigm that defined the “more economic approach.” Witt captures this shift precisely, identifying the return of fairness not as a rhetorical flourish but as a structural evolution of enforcement logic.
Dark patterns sit at the intersection of these concerns. They are inherently exploitative, frequently unfair, and disproportionately intrusive relative to any legitimate business objective. As such, they are amenable to the proportionality test Witt distils from recent jurisprudence: they are imposed by dominant firms, they cause detriment to users, and they are neither necessary nor proportionate to the commercial aims they purportedly serve. Whether analysed in terms of degraded user experience, coerced consent, or the suppression of effective competition through interface asymmetries, dark patterns meet the criteria for exploitative abuse.
Anticipated Challenges and Pathways Forward
The application of Article 102(a) to dark patterns is not without difficulty. Chief among the challenges is the inherent indeterminacy of the proportionality test itself. While flexible and adaptable, it risks inconsistent application without clear benchmarks. Witt rightly identifies the need for interpretative guidance from the European Commission to articulate what constitutes a legitimate objective and how disproportionality should be measured in digital contexts. This need is particularly acute for dark patterns, where the harm is often psychological, non-monetary, or epistemic, and where traditional economic indicators offer limited traction.
Moreover, concerns about the expansion of competition law into adjacent regulatory domains—such as data protection and democratic integrity—are legitimate. Witt acknowledges the danger of regulatory overreach but persuasively argues that cross-referencing parallel legal frameworks, such as the GDPR or the Platform-to-Business Regulation, can enrich the proportionality analysis without displacing the core function of competition law. This is not about importing alien values into Article 102(a) but about recognising that exploitative conduct in digital markets often materialises through practices that straddle multiple legal regimes.
Judicial review will also play a decisive role in shaping the durability of this doctrinal expansion. The pending appeals in the Apple and Meta cases will likely clarify the evidentiary thresholds and legal contours of unfair trading conditions. Until then, cautious but assertive enforcement—grounded in reasoned analysis and supported by cross-agency cooperation—remains prudent.
In the interim, three regulatory strategies merit consideration. First, the Commission should issue formal guidelines clarifying the application of the proportionality test to manipulative design practices, drawing on the Autorité’s analysis in Apple ATT and the Bundeskartellamt’s data-driven reasoning in Meta. Second, enforcement agencies should formalise collaboration with consumer and data protection regulators to ensure coherence and avoid fragmented remedies. Third, where legal uncertainty remains high, agencies should consider commitments, such as interface redesigns or data use restrictions, as a proportionate and constructive means of remediation, rather than defaulting to punitive sanctions. This mirrors the remedy logic in Apple ATT, where structural changes were arguably more effective than monetary penalties in altering platform behaviour.
In sum, while competition law is not a panacea, it is one of the few legal instruments capable of confronting the systemic harms posed by dark patterns. Witt’s doctrinal scaffolding provides the tools. It is now up to regulators, courts, and scholars to operationalise them in ways that make the digital economy competitive and fair.
The Bigger Picture: Fairness in the Digital Age
Witt’s analysis reflects more than a doctrinal refinement; it signals a broader reorientation of EU competition law toward fairness as a normative and functional organising principle. For decades, fairness was sidelined by the “more economic approach,” dismissed as too vague or too political for rigorous competition enforcement. Today, that view is increasingly untenable. In the context of the digital economy—defined by platform dominance, behavioural manipulation, and data asymmetries—fairness is re-emerging not merely as a policy ambition but as a legal imperative.
The proliferation of dark patterns—deceptive consent mechanisms, coercive defaults, and interface-level manipulations—exemplifies the structural harms of digital concentration. These are not isolated design choices; they are engineered strategies deployed by firms with gatekeeping power to entrench dominance and extract value from users. While the Digital Markets Act (DMA) attempts to address this structural asymmetry through per se obligations imposed on designated gatekeepers, its scope is necessarily limited. As Witt observes, Article 102(a) retains critical value for addressing exploitative conduct by dominant firms falling outside the DMA’s threshold criteria and for tackling abusive practices—like dark patterns—that are not expressly captured within the DMA’s rulebook.
Witt’s proportionality-based framework offers a means to reassert the role of competition law in addressing the lived realities of digital exploitation. This approach does not abandon economic analysis but enriches it by integrating considerations of informational asymmetry, behavioural coercion, and user autonomy. In doing so, it aligns competition enforcement with broader societal concerns about privacy erosion, democratic vulnerability, and the distributional effects of platform capitalism.
This reorientation is mirrored in the evolving political discourse. From Executive Vice-President Margrethe Vestager’s insistence that competition law must reflect “what fairness means in people’s lives,” to the EU’s “Clean, Just and Competitive Transition” agenda, fairness has re-entered the regulatory lexicon not as a nostalgic nod to ordoliberalism, but as a forward-looking response to the systemic failures of digital markets. At stake is not merely transactional harm, but the architecture of choice, consent, and participation in the digital public sphere.
Yet despite this rhetorical and institutional momentum, the European Court of Justice has not squarely addressed the contours of fairness under Article 102(a). Judicial clarification will be crucial in determining whether the proportionality test withstands legal challenge and whether “unfair trading conditions” can encompass the full spectrum of digital exploitation, including dark patterns. Until then, the doctrinal scaffolding Witt offers remains both promising and provisional.
In this context, fairness should not be misunderstood as a vague or emotive standard. Properly operationalised, it functions as a constitutional principle of digital market governance: a check on excessive platform power, a lens through which exploitative practices can be disciplined, and a normative anchor for interpreting the scope of dominant firms’ responsibilities in a data-driven economy. Article 102(a) thus becomes not merely a residual clause, but a site of regulatory renewal—capable of addressing the exploitation beneath the interface, where the visible elements of user choice are carefully choreographed to serve invisible business imperatives.
Conclusion
Professor Witt’s work does not explicitly discuss dark patterns or deceptive design. However, her analysis of the proportionality-based revival of Article 102(a) TFEU provides a compelling legal scaffold for addressing these manipulative interface practices through the lens of exploitative abuse. I argue that dark patterns—coercive defaults, deceptive prompts, choice architecture distortions—can and should be recognised as non-price-based unfair trading conditions when imposed by dominant platforms in digital markets.
These manipulative designs, while historically policed under consumer protection or data law, are structural expressions of digital market power. They are rarely negotiable, disproportionately burden users, and serve no function that less intrusive alternatives could not fulfil. In other words, they satisfy, both factually and normatively, the proportionality test Witt identifies across recent enforcement actions by the Commission, the Bundeskartellamt, and the French Autorité de la concurrence.
Applying this framework to dark patterns repositions competition law as a credible and necessary tool for closing systemic regulatory gaps. While consumer and data protection regimes often target outcomes, such as misleading consent or excessive data collection, Article 102(a) permits a more systemic inquiry into how such conditions arise: not from isolated design quirks but from power asymmetries encoded in platform architecture. The Apple ATT case, for instance, showcases how interface design can simultaneously restrict competitor access, manipulate user choice, and entrench a dominant firm's market position, making it a quintessential example of potentially exploitative abuse.
Yet challenges remain. As Witt rightly stresses, legal certainty in this area is fragile. The proportionality test offers flexibility, but its application to novel digital harms—like dark patterns—demands doctrinal precision, judicial endorsement, and inter-institutional coherence. Enforcement risks becoming erratic without Commission guidelines on what constitutes a legitimate objective or a proportionate design choice in digital contexts.
Nonetheless, the promise is substantial. As competition law re-embraces fairness, not merely as an economic artefact but as a normative principle, Article 102(a) offers a unique avenue to tackle the behavioural manipulation and psychological extraction that undergird platform dominance. My claim is not that Witt makes this argument herself, but that her doctrinal framework provides a principled and intellectually defensible basis for doing so.
If adopted, this approach would mark a conceptual evolution, shifting from pricing excesses to exploitative design excesses, and from unfair terms to unfair interfaces. As dominant firms increasingly engineer consent, attention, and trust through non-transparent and coercive design, the revival of Article 102(a) may prove essential for restoring competition and safeguarding the integrity of user autonomy in the digital age.