Dark Patterns: How Weaponized Usability Hurts Users
Fresh Legislation Targets Deceptive, Privacy-Shredding Interface Design
"Dark patterns" are out to get you, but you may not even be able to see them.
What are dark patterns? The term might bring to mind the vampire-focused camp-Gothic soap opera of a similar name that was on TV decades ago.
This threat, however, is not only real but insidious, in that it weaponizes usability to trick users.
In the web world, usability refers to the practice of designing easy-to-use user interfaces.
But what if designers don't have a user's best interests at heart, but instead want to trick them? That's the shadowy push behind dark patterns, which information security veteran Wendy Nather defines as being "the abuse of what we have been conditioned to understand as being user interface rules."
Unfortunately, this type of abuse can be tough to spot, as Nather demonstrated by showing numerous examples of such practices in a session at the 10th annual IRISSCERT Cyber Crime Conference in Dublin last November (see: Cybercrime Conference Returns to Dublin).
Tactics include "bait and switch" prompts, where a user clicks one thing and something unexpected happens - such as clicking an "X" to close a Microsoft prompt, only to trigger the installation of Windows 10, she said. Misdirection is another common ploy, involving design that focuses users on a big, bold option while making it difficult to see other options that might be more in line with what a user actually wants to do.
Senators Target Malicious Design
Now, efforts are afoot to banish dark patterns, which at their mildest trick users into doing things they wouldn't normally do, and at their worst compromise the security of their personal data as well as their privacy.
On Tuesday, U.S. senators Mark Warner, D-Va., and Deb Fischer, R-Neb., introduced legislation called the Deceptive Experiences To Online Users Reduction Act.
A summary of the DETOUR bill draft says that it would "prohibit large online platforms from using deceptive user interfaces, known as 'dark patterns,' to trick consumers into handing over their personal data" as well as "promote consumer welfare in the use of behavioral research by such providers."
Warner, a former technology executive who's now vice chairman of the Senate Intelligence Committee, says the goal of the legislation is simple: Improve transparency to enable users to be better informed before they share their personal information.
"For years, social media platforms have been relying on all sorts of tricks and tools to convince users to hand over their personal data without really understanding what they are consenting to," he says. "Some of the most nefarious strategies rely on 'dark patterns' - deceptive interfaces and default settings, drawing on tricks of behavioral psychology, designed to undermine user autonomy and push consumers into doing things they wouldn't otherwise do, like hand over all of their personal data to be exploited for commercial purposes."
"More and more consumers are being targeted with manipulative online practices that try to trick them into handing over personal information - and this problem is not going away on its own. That's why I've introduced a bill to protect user privacy and increase transparency online." - Mark Warner (@MarkWarner), April 9, 2019
In a related legislative effort, Sen. Ed Markey, D-Mass., plans to introduce a bill on Thursday that would ban dark patterns in sites or services that target children, which he says are too often used for marketing purposes as well as to direct children to inappropriate websites.
Markey's bill, called the Kids Internet Design and Safety Act, would specifically ban "manipulative and damaging design features," restrict the types of advertising that can be used to target children as well as require organizations to create guidance for parents on "kid-healthy content," among other stipulations.
"Just like there are rules for children's television, we need rules for kids on the internet. I'll be introducing the #KIDSAct to make sure children aren't being manipulated by dangerous features, inappropriately advertised to, or exposed to harmful content." - Ed Markey (@SenMarkey), April 4, 2019
In Europe, meanwhile, any abuse of user interface design that leads to users inadvertently sharing personal information could be grounds for users to file a General Data Protection Regulation complaint with their country's data protection watchdog. Regulators could launch reviews of websites and fine organizations for using such practices to obtain private data.
"Thank goodness for GDPR, because it is causing people to look at what they have and why they have it," said Nather, who's head of advisory CISOs for Cisco's Duo Security group. "We need a willingness to correct mistakes, which is another problem we have with breach disclosure ... and we need empathy."
Guide to Spotting Evil Design
One challenge with dark patterns, however, can be detecting them. "The problem with malicious design is that it's not necessarily extra code; it's not malware," Nather said.
Drawing on the work of the Dark Patterns website, for example, Nather said at IRISSCON that such tactics may fall into multiple camps, including:
- Bait and switch: Users think they're clicking one type of prompt, only to be sent elsewhere. One example was Microsoft trying to entice Windows users to upgrade to Windows 10, at one point running a pop-up window that would begin the upgrade even if a user clicked the "X" in the upper-right corner of the box, which should have closed the window.
- Confirmshaming: Using guilt to drive someone into accepting something, aka the "yes, I want to let the orphans starve" dialog, Nather said. Such tactics are often used to push users into signing up for mailing lists.
- Disguised ads: Content that appears to be legitimate, but which is really disguised display advertising, designed to get users to click on it.
- Misdirection: Purposefully focusing a user on one thing so they don't see another.
- Roach motel: The design of a site makes it easy for a user to end up in a situation but difficult to get out. Ticketmaster, for example, has been known to sneak magazine subscriptions into a customer's shopping cart unless they explicitly opt out.
- Trick questions: Trick phrasing leaves users unclear about whether clicking "no" might mean "yes," or follows a checkbox for opting out with a checkbox for opting in.
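The "trick questions" tactic in the list above is easy to sketch in code. The function below is a hypothetical illustration - not drawn from any real site - of how a form that pairs an opt-out checkbox with an opt-in checkbox can be resolved so that any ambiguity defaults in the business's favor:

```python
# Hypothetical sketch of the "trick questions" dark pattern: two adjacent
# checkboxes with inverted meanings, resolved so a user who skims and
# ticks both boxes (or neither) still ends up opted in.

def resolve_marketing_consent(ticked_opt_out: bool, ticked_opt_in: bool) -> bool:
    """Return True if the user ends up receiving marketing email.

    Checkbox 1: "Please do NOT send me offers"   (opt-out phrasing)
    Checkbox 2: "I would like to receive offers" (opt-in phrasing)
    """
    if ticked_opt_in and not ticked_opt_out:
        return True   # clear opt-in
    if ticked_opt_out and not ticked_opt_in:
        return False  # the only combination that actually opts out
    # Both ticked, or neither ticked: ambiguity favors the business.
    return True

# A user who skims past both boxes is still subscribed:
assert resolve_marketing_consent(False, False) is True
# Only the careful combination escapes the mailing list:
assert resolve_marketing_consent(True, False) is False
```

A straightforward design would use a single, plainly worded opt-in checkbox that defaults to unchecked; the inverted pairing exists only to harvest consent from inattentive users.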
Et Tu, Facebook?
Privacy expert Ashkan Soltani, who previously served as the Federal Trade Commission's CTO, says Facebook is a frequent dark patterns offender.
"April 2019: @Facebook prompts users to enter their username/password to their external email accounts for security reasons, then - as confirmed by @EFF - uses that information to download contacts (address book) without consent. #ftchearing" - ashkan soltani (@ashk4n), April 9, 2019
Earlier this month, for example, multiple news outlets reported that as part of its new-account "confirmation flow," Facebook was requiring users who signed up for the social network using an email address from one of several providers - including AOL, Comcast, Hotmail, GMX, Yahoo and Yandex - to give Facebook the password to their email account.
Digital rights group Electronic Frontier Foundation says users should never share the password for one service with another service. And using a dummy account to study the Facebook confirmation flow, it found that after users shared the password, Facebook began harvesting their contacts on the webmail service without asking first.
EFF reported: "When we clicked 'Connect to yandex.com,' an overlay with a status bar appeared. 'Authenticating,' it said. But wait - 'Importing contacts?' When did that happen? What? How? Why??"
Following negative press, EFF says Facebook appears to have stopped harvesting contacts in this manner.
Facebook didn't immediately reply to a request for comment.
Good Design: Security Upsides
In the information security realm, good usability can forestall a host of security problems by making it easier and more intuitive to use everything from passwords and two-factor authentication to anti-phishing tools and the security features built into operating systems.
Nather used utensils as a simile for the benefit of good design: "Security should be as easy to use as a spoon - after you're a toddler, you rarely pick it up wrong."
For designers, however, crafting things that seem simple to use can be deceptively difficult, requiring intense time and labor.
Apple products such as the iPhone are widely lauded because they seem intuitive to use. The same goes for many leading apps and websites, ranging from eBay to ESPN to Zillow.
Good, usable design is inevitably the result of an untold amount of iteration and testing.
Unfortunately, some organizations seem to have been devoting similar resources to honing their dark patterns.
But organizations that use usability as a force for good have the opportunity to not only foster better data security, but also greater trust with users, Nather said, especially if they do so in an open, transparent and ongoing manner.