
by Johanna Gunawan, Privacy Summer Intern


Manipulative designs, better known as "dark patterns," are, in Harry Brignull's words, "tricks used in websites and apps that make you do things that you didn't mean to, like buying or signing up for something." Other researchers define them more loosely as designs that use behavioral knowledge against users' best interests or that adversely impact their decisions.

An FTC workshop in early 2021 covered a wide range of important manipulative design topics, including why these designs arise, what consequences they can have, and how they can affect teens and kids. A manipulative design is not necessarily intentional; in many cases, designers follow industry norms or make design decisions based on other priorities without realizing the potential adverse effects. If you've used the internet recently, you've likely encountered a manipulative design and perhaps even recognized some -- researchers discovered at least one manipulative design in 95% of the apps studied in one experiment, while others have observed that people are often aware of how interfaces manipulate them but still succumb to the designs.

What Should You Know About Manipulative Design? 

Listing every possible manipulative design on the web or in apps would be an incredibly difficult task, and memorizing all of them in order to protect your digital experience is perhaps impossible. Researchers have begun to measure and categorize these patterns, but capturing and enumerating every possible example is challenging given the fast pace of technological change across industries, and even within a single app.

What's most important to know is that manipulative design uses people's behavior against them. One way to think about this is to consider what's at risk if you fall for a specific design. Do you lose time, money, privacy, or something else? Do you end up providing more data than you normally would, or paying more in fees? Some manipulative designs pull your attention away from your current task and toward something the service wants you to look at or do. Others nudge you to agree to or select an option that benefits the app's creator but leads to overcollection of your data, or they quietly add easy-to-miss items to your shopping cart. In the offline world, manipulative design can make it difficult to compare prices of items in a supermarket, leading you to select what seems like a deal but is actually more expensive.
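To make that last supermarket example concrete, here is a minimal sketch in Python, using entirely hypothetical prices, of the unit-price arithmetic that manipulative shelf labeling counts on you not doing: comparing cost per unit rather than the headline price or the "deal" framing.

```python
# Minimal sketch with hypothetical prices: the "family size" package is framed
# as the better deal, but the price per ounce tells a different story.

def unit_price(total_price: float, quantity_oz: float) -> float:
    """Return the cost per ounce for a given package."""
    return total_price / quantity_oz

# Hypothetical shelf options for the same product.
options = {
    '"Family size!" 24 oz': unit_price(6.49, 24),  # ~$0.270 per oz
    'Regular 16 oz': unit_price(3.99, 16),         # ~$0.249 per oz
}

# Sort from cheapest to most expensive per ounce.
for label, per_oz in sorted(options.items(), key=lambda item: item[1]):
    print(f"{label}: ${per_oz:.3f} per oz")
```

The arithmetic itself is simple; the point is that manipulative packaging and labeling are designed to keep you from pausing to do it.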

We deal with behavioral tricks, called nudges, in both physical and digital spaces every day. Some of these can be helpful, like placing healthier food options at eye level on a supermarket shelf, or within easier reach than less healthful options. Nudges facilitate decision-making by reducing the effort or friction required to evaluate all possible options. Manipulative designs can be seen as nudges in the "wrong" direction: away from your interests and toward someone else's. Know that manipulative designs are all around you, and that you've likely encountered some already.

Why Does Manipulative Design Matter for Privacy and Privacy Policies?

Our Privacy Evaluations help parents and educators better understand the risks of using a certain app or educational technology. We firmly believe that transparent policies, whether these disclose good or bad data practices, empower people and help them make better decisions about what products to use. 

Manipulative design can interfere with the promises set out in a privacy policy, potentially misrepresenting the controls or agency an app user really has while interacting with the technology. In the wake of privacy regulations like Europe's GDPR, many vendors began to improve how they articulated their data practices and promises, but manipulative design can circumvent those promises by making privacy-forward features harder to find, or by over-encouraging behaviors within a service that erode privacy. When manipulative designs are employed to push people toward accepting terms and conditions they don't read, toward accepting all cookies rather than only the minimum, or toward other decisions that trap users into data collection, people lose the ability to make truly informed decisions.

Such manipulative designs reflect design strategies that compromise privacy, called "dark strategies" by some researchers. These strategies are the mechanisms by which manipulative designs trick users, and they are often directly at odds with privacy-forward design strategies and the Privacy by Design principles outlined in the GDPR. For example, manipulative designs that target privacy can employ strategies that encourage data maximization (like oversharing or overcollection), which counters the more protective data minimization principle. Other manipulative designs, like hidden legalese stipulations, obscure information from people rather than helping them learn what's being done with their data.
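As a rough illustration of the contrast between those two strategies, here is a minimal sketch in Python, with hypothetical field names and not any particular vendor's code, of what the same signup flow ends up storing under a data-minimization design versus a data-maximization design.

```python
# Minimal sketch with hypothetical field names: what a signup flow keeps under
# a data-minimization strategy versus a data-maximization strategy.

REQUIRED_FOR_SERVICE = {"email", "password"}  # what the feature actually needs
MAXIMIZING_DEFAULTS = REQUIRED_FOR_SERVICE | {
    "birthdate", "location", "contacts", "browsing_interests",
}

def collect(submitted: dict, allowed_fields: set) -> dict:
    """Keep only the fields the design chooses to request and store."""
    return {field: value for field, value in submitted.items() if field in allowed_fields}

form_input = {
    "email": "student@example.com",
    "password": "hunter2",
    "location": "auto-detected",
    "contacts": "[imported address book]",
}

print(collect(form_input, REQUIRED_FOR_SERVICE))  # minimization: only what's needed
print(collect(form_input, MAXIMIZING_DEFAULTS))   # maximization: everything on offer
```

A privacy-forward design defaults to the first behavior; a dark strategy defaults to the second and makes the protective choice harder to find.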

Transparency is necessary for informed decision-making, which is why it's such an integral part of our evaluation process and question set. Manipulative designs interfere with transparency at the level of user interaction, which undermines vendors' efforts to improve their privacy policy disclosures and ultimately disadvantages the kids and educators using these technologies. Even so, transparent policies do not guarantee that the app's design itself is free of these manipulative tactics. We need both good policies and good practices.

What Can Individuals Do About Manipulative Design? 

For the average internet user, avoiding manipulative design can seem as impossible as regulating it. Recognition is one part of fighting back: When you're better able to recognize manipulative designs and the mechanisms behind them, you're better able to make informed decisions about your internet use. Manipulative designs can be found in every type of interface, whether a website, an app, a VR headset, or another smart device.

Learn More to Build Awareness

To start learning about manipulative design, you can visit the resources provided in this article, beginning with darkpatterns.org. It can also be useful to learn about the evolution of manipulative design through concepts like nudges; resources come from fields like law, cognitive psychology, human-computer interaction/computer science, philosophy, and more. More immediately, concerned parents can and should sample their kids' technologies with an eye out for potential traps that are especially attractive to kids and teens; listening to the FTC panel on how manipulative designs impact youth is an excellent starting point. For more recent examples, the Twitter account run by Harry Brignull frequently retweets manipulative designs found by other consumers and provides an up-to-date look at patterns you may have already encountered.

Vendors can also benefit from these resources to help identify designs that unintentionally harm the user. Designers of apps and technologies can keep up to date with the evolving discourse in UX design and human-computer interaction by digging deeper into interaction design's potential pitfalls and considering how their work fits into value-sensitive or ethical design. 

Contribute to Manipulative Design Identification

Another way to contribute to a collective understanding of manipulative design is to report the manipulative designs you find in your daily life to the Dark Patterns Tip Line, run by Consumer Reports. Building a robust, real-world body of examples is integral to giving policymakers the knowledge they need to understand manipulative design and to draft appropriate regulations or set industry standards that reduce its presence and potential for harm in everyday technologies.

Support Regulatory Efforts

Regulating design decisions is a daunting task, and often a contested one. In an environment where eyeballs translate into revenue, companies are incentivized to keep designing applications and services that increase user engagement, shopping, and monetization. At the same time, those companies must continue innovating and pushing design boundaries -- so what can be done to ensure that businesses survive without burdening users?


Researchers, lawmakers, nonprofits, and other advocacy groups are working on initiatives to help regulate manipulative design. The California Consumer Privacy Act (CCPA) was recently updated with a provision that specifically prohibits dark patterns, described as user interfaces designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice. Federal bills, like the SMART and DETOUR Acts, have also been proposed to address manipulative designs. Some groups have submitted comments to the FTC specifically on the topic of children and dark patterns, including Common Sense Media's comments on the benefits of digital citizenship education in combating children's susceptibility to manipulative design. By learning more about manipulative design and letting your representatives and advocacy groups know that you're concerned, you can help these efforts build momentum toward prohibiting manipulative designs that undermine our privacy and freedom of choice.

Jill Bronfman

Jill Bronfman served as Privacy Counsel for Common Sense. She has taught law, graduate, and undergraduate students.