Have you ever booked a holiday and felt the pressure of seeing how many other people are looking at it? Spent an hour working out how to unsubscribe from a newsletter? Or been made to feel guilty about not signing up to a website’s feature (also known as ‘confirm shaming’)?
That means you’ve experienced dark patterns, which are defined as “tricks used in websites and apps that make you do things that you didn’t mean to”. In short, they favour the commercial interests of the company rather than the consumer, and can often be misleading.
“Am I being a tin foil hat person?”
The term was coined by user experience (UX) specialist Harry Brignull in 2010. Brignull, who is now head of UX innovation at Smart Pension, tells Design Week that things have changed since he put a name to these tricks a decade ago.
“While I was doing it, I thought I was reaching for it,” Brignull recalls. “I thought: am I being a tin foil hat person? Am I creating a false narrative?”
The narrative, it turned out, was not false. Dark patterns exist. And when identified, and acted upon, they can cause real legal trouble for companies. In 2016, professional networking site LinkedIn agreed to pay $13m to settle a class action lawsuit over its ‘Add Connections’ feature.
The website was accused of accessing users’ accounts without their permission and sending email invitations to contacts in their address books. The court found that while LinkedIn members consented to importing their contacts, they did not consent to the two additional “reminder emails” that the company sent about those requests.
Brignull says that when you examine a company that seems a “bit unethical” and look at its design principles — “if they have any” — what you usually find is an “utter contradiction”. Like LinkedIn, Facebook has also faced legal (and financial) consequences from these practices. In 2018, the Norwegian Consumer Council published a report entitled Deceived by Design, highlighting the way that Facebook, Google and Microsoft employ “nudging to exploitation through dark patterns”.
‘Roach Motel’ and ‘Forced Continuity’
There is now a dark patterns website, explaining different types of deceptive design. Roach Motel is when “you get into a situation very easily, but then you find it is hard to get out of it”. Forced Continuity, meanwhile, is when “your free trial with a service comes to an end and your credit card silently starts getting charged without any warning”. Evocative branding abounds in the movement, and the shadowy term Dark Patterns has won out over various alternative names, including Nobel Prize winner Richard Thaler’s suggestion of ‘sludge’.
Users can submit examples to the dedicated Dark Patterns Twitter account, from banking start-ups and mobile networks to huge tech companies. Brignull does say that it’s important not to lump all these together. We need to “retain clarity”, he says. Some design elements may be “manipulative” but they’re “mainly annoying”. For it to qualify as a dark pattern, it “must have an outcome of harm”. That’s a point he has had to clarify for legal audiences during his consultancy work.
Could dark patterns get darker?
As well as increased awareness, another change since the term was defined is the extent to which technology is now embedded in our lives. Brignull says that he realised the extent of this dependence when he was brushing his teeth and his Face ID wasn’t working on his phone. That was one moment of his day that didn’t need to be filled – but smartphones have filled those gaps in our lives.
What does that reliance mean for dark patterns? The prevalence of apps means dark patterns have spawned a new term: dark UX. “A lot of digital products rely on you being somewhat addicted to it,” Brignull says. “They use the same sorts of tricks as dark patterns to achieve that.”
Ross Sleight is chief strategy officer at Somo, a digital consultancy which works on product and experience design for clients like HSBC, Audi and Vodafone with a focus on mobile technology. He says that one of the problems is how “personal” mobiles are. “We run our lives from them.”
One of the biggest dark patterns is the infinite scroll, Sleight says. That’s had a huge impact on how we use apps in particular. It’s also “probably a very good design pattern; you can read a lot of information very quickly,” he says. But it comes with consequences: “we end up not reading things fully”.
On a behavioural level, Sleight questions the effect it has on users’ wellbeing — what happens, for example, when we don’t get the number of likes we want on an Instagram post? He adds: “These can all be seen as negative to customers. They’re issues that we as designers have to take into account now.”
These kinds of patterns have a different kind of impact, more behavioural and with less clear commercial outcomes for companies. But they’re driven by the same kind of metric: success to social networks is the amount of time users spend on the platform. “It’s not the dark pattern that’s the issue,” Sleight says. “It’s the business model that drives that dark pattern.”
More “humane” uses of technology
Once dark patterns have been identified — and properly delineated — what do we do about them? Brignull says it would be hard for designers to self-regulate, as they often hold junior positions. Many industries have regulating bodies, Brignull says, pointing to the supermarket, financial and meat industries. The Financial Conduct Authority’s principles on the fair treatment of customers are a “really good form of regulation”.
This model is useful because it’s more flexible. “It’s about outcomes for designers,” Brignull says, “instead of simply: don’t do this.” “You can adapt what you’re doing around the principles, which are squishy and relatively timeless, whereas design features — like confirm shaming — change at a rapid pace”. (Another problem is that these design features are often widely applied, even if they aren’t appropriate for a specific service.)
Both Brignull and Sleight mention the Center for Humane Technology, which was founded in 2016 by Tristan Harris, a former design ethicist at Google. Through pressure groups, thought leadership and ‘inspiration’, the group aims to instil a sense of well-being into technology. At a Senate hearing in January, Harris spoke about his experience working at the Stanford Persuasive Technology Lab with the founders of Instagram: “I know the culture of the people who build these products and the way they’re intentionally designed for mass deception.” He argued that design doesn’t just have dark patterns, but “dark infrastructure”.
What can designers do?
“Dark infrastructure” might sound a little overwhelming — and could highlight Brignull’s concern about labelling any annoying feature a dark pattern. Within the movement, there is a focus on what designers can do to promote more “ethical” principles. UX Collective, for example, curates stories about UX, visual and product design, and is written by designers themselves. One article listing ethical design principles features Airbnb’s warning about bookings that don’t have carbon monoxide detectors.
UX Collective’s founder, Fabricio Teixeira, tells Design Week that one of the problems is that the industry is “obsessed with metrics”, some of which need rethinking. As Teixeira points out, do traditional metrics like time spent on a site make sense for an ecommerce website? And while Brignull believes that change has to come from the top down, Teixeira suggests it could be worth “convincing C-levels” by “illustrating the risk of negative PR”. Part of the change might also involve “designing our own tech diet”, he says, such as muting notifications, deleting apps and limiting the amount of time we spend on our phones.
As well as “anticipating unhealthy behaviours” and publicly shaming companies that do use dark patterns, it also comes down to respect, Teixeira says. “When designing, it’s about making sure we’re choosing respectful design patterns that won’t create rabbit holes for users,” he adds.