A dark pattern is a user interface design that has been intentionally crafted to deceive or mislead users into performing actions they would not otherwise take. Examples include hidden opt-out buttons, misleading language, and fake countdown timers. These patterns are often used to trick users into signing up for services they don’t want or into revealing personal information.
In software engineering, in addition to dark patterns, we also come across flawed design concepts known as anti-patterns. These appear to be solutions to common problems but are usually ineffective and may create new problems of their own. Anti-patterns are not limited to software development; they can appear in other fields as well.
Dark patterns versus anti-patterns
Dark patterns are intentionally designed to deceive the user and benefit the designer or the business. Anti-patterns, on the other hand, are often the result of a lack of knowledge or experience. They can cause problems for the user or the system, but they are not necessarily malicious. The key difference between a dark pattern and an anti-pattern lies in the intent behind its creation.
Common causes of anti-patterns in software development include:
- Copying and pasting code instead of creating reusable functions
- Use of a relational database to store hierarchical data
- Excessive use of global variables
- Use of comments to explain bad code
- Over-engineering a solution intended to solve a simple problem
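Two of the anti-patterns above can be made concrete in code. The sketch below (hypothetical function names, Python chosen purely for illustration) shows copy-and-paste duplication combined with a global variable, followed by a refactored version that extracts one reusable function:

```python
# Anti-pattern: copy-and-paste code plus a global variable.
TAX_RATE = 0.08  # mutable global state silently shared by every caller

def book_total(price):
    return price + price * TAX_RATE  # tax logic written here...

def dvd_total(price):
    return price + price * TAX_RATE  # ...and pasted again here

# Refactored: one reusable function, with the rate passed explicitly.
def total_with_tax(price, tax_rate=0.08):
    """Return price plus tax, with no hidden global state."""
    return price + price * tax_rate

print(total_with_tax(100))       # 108.0
print(total_with_tax(100, 0.2))  # 120.0
```

With the refactored version, a change to the tax rule happens in exactly one place, and the dependency on the rate is visible in the function signature instead of hidden in a global.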
Anti-patterns can also appear in organizational, business, and architectural designs. This may happen due to a lack of:
- Clear definition of task completion
- Clear vision or goal
- Communication and coordination between teams
- Clear decision-making process
- Clear ownership or accountability
Common types of dark patterns
- Misdirection: The purpose of misdirection is to direct the user’s attention away from important or relevant information. An example of this would be a website that prominently displays a ‘Sign Up’ button but makes it difficult to find the ‘Cancel’ button.
- Disguised ads: This type of dark pattern is used to disguise advertisements as a part of the content or interface. For example, a website that appears similar to a search engine but instead displays advertisements.
- Trick questions/misinformation: As the name suggests, this technique is used to trick the user into answering questions in a certain way. A survey that starts with easy questions and then gradually becomes more personal or sensitive best fits this description.
- Forced continuity: This is usually the modus operandi of subscription-based websites, which quietly keep charging the user after a trial or introductory period ends. Typical examples include automatically enrolling the user in a paid service without their knowledge or consent.
- Bait-and-switch: This pattern is used to entice the user to take a certain action but instead produces an undesirable action or different outcome. An example would be a website offering a free trial but automatically charging the user’s credit card after the trial period ends.
- Sneak into basket: This trick adds extra items or charges to the user’s order without explicit consent, often by hiding or obscuring crucial information. Examples include pre-checked boxes, fine print, and hidden fees.
- Pressure: Websites or apps employ this tactic to create a sense of urgency and push the user into taking a certain action. Examples include fake urgency messages, artificial scarcity, and limited-time offers.
- Confirm-shaming: Making users feel guilty or ashamed for not taking a certain action is typical of confirm-shaming. Examples include flashing messages such as, ‘Are you sure you want to leave?’ or ‘Don’t you want to support our cause?’
- Obstruction: Deliberately making it difficult for users to find or use certain information or features is the obstruction dark pattern in action. Examples include hiding navigation menus, disabling the Back button, and redirecting users to different pages.
- Friend spam/growth hacking through spamming: Asking for access to a user’s contacts list or social media account and then sending messages or invites on their behalf is classified as Friend spam. Many apps have this dark pattern design built into them.
- Hidden costs: This dark pattern hides or obscures the actual cost of a product or service, tricking the user into thinking it is cheaper or more affordable than other available options. A typical example is when the displayed price of a product or service jumps sharply (after taxes and delivery fees are added) once the user proceeds to checkout.
- Price comparison prevention/obscured pricing: This is used to prevent users from comparing prices or shopping for the best deal. Techniques used include hiding the price of the product, making it difficult to compare prices, and using misleading language to make it seem like a product or service is a better deal than it really is. The goal of this dark pattern is to trick the user into purchasing a product or service without considering other options.
- Privacy zuckering: This dark pattern is named after Mark Zuckerberg, the CEO of Facebook, who has been criticized for his company’s privacy practices. Zuckering is used to trick the user into sharing personal information, such as their location, contacts, or browsing history, without their knowledge or consent. It is often accomplished by using misleading language, confusing settings, or hiding important information in the fine print.
- Roach motel: When used, this strategy makes it difficult or impossible for users to unsubscribe or cancel a service. Thus, it keeps users subscribed for as long as possible, even if they no longer want the service. Techniques used to accomplish this include hiding the unsubscribe link, requiring multiple confirmations to opt out, automatic renewals, and ‘free trials’ that convert into automatic subscriptions.
- Asking more than intended: Asking for more personal information than an individual intends to share is one of the quickest ways to lose a user’s or customer’s trust. This dark UX pattern exists online in the name of ‘knowing your users.’
- Triggering fear: A typical example of this is warning a user that opting out of a subscription or a feature can lead to negative consequences. For instance, Facebook has relied on ‘intrusive default settings’ and ‘misleading wording,’ warning users not to disable the ‘facial recognition’ feature as doing so could let someone else impersonate them.
- Social proof: In this technique, success stories of users (paid or in-house members) are shared to influence a user’s actions and behavior. Brands are often seen promoting such content on websites and social media to garner more visitors and increase sales.
- Triggering FOMO (Fear of Missing Out): Most eCommerce websites resort to this practice to increase orders from customers. Users are shown captions such as ‘Only a few left’ to trigger a purchase action.
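To make the fake-urgency and FOMO tactics above concrete, the hypothetical sketch below (invented function names, not taken from any real site) contrasts an honest low-stock badge with a deceptive one that ignores real inventory:

```python
import random

def honest_badge(stock: int) -> str:
    """Show a scarcity message only when stock is genuinely low."""
    return f"Only {stock} left!" if stock <= 5 else "In stock"

def fake_badge(stock: int) -> str:
    """Dark pattern: a random 'low stock' number shown regardless of inventory."""
    return f"Only {random.randint(1, 3)} left!"

print(honest_badge(500))  # In stock
print(fake_badge(500))    # e.g. "Only 2 left!" despite ample inventory
```

The honest version conveys real information; the fake one fabricates scarcity to trigger a purchase, which is exactly what makes it a dark pattern rather than a legitimate urgency cue.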
While these are a few common dark patterns that we usually come across, new and more effective variations are being introduced. So, as a user, it is important to be aware of the signs of dark patterns and exercise caution when sharing personal information or making decisions online.
To learn more about dark patterns, see Part II.