Excerpt
Deceptive designs, often called dark patterns, are everywhere. These are user-interface designs that manipulate, confuse, disorient, or deceive individuals into surrendering personal data, subscribing to services, or purchasing goods. According to the 2024 Digital Fairness Fitness Check, manipulative designs cost consumers in the European Union 7.9 billion euros per year. A recent study demonstrated that ninety-seven of the most popular consumer apps in the European Union deployed deceptive designs. While no comparable figures are available for the United States, where regulation is lighter, it is reasonable to assume the numbers are staggering.
Crucially, deceptive designs produce more than economic harms. Built on large-scale data extraction, hidden psychometric profiling, micro-targeting, and hyper-nudging, they actively undercut individuals’ ability to think, to protect their mental privacy, and to shape their own personalities. So far, most legal remedies have come from downstream consumer protection law, which is ill-suited to address the systemic, ubiquitous, and inescapable nature of rampant surveillance capitalism. Yet an upstream solution may lie in human rights doctrine, which would enable individuals to reclaim their epistemic rights and, ultimately, safeguard the essence of their humanity in the face of unprecedented technological surveillance and manipulation.