Dark patterns versus behavioural nudges in UX - Where do we draw the ethical line when designing choices?

Introduction

There’s a powerful force that flows through every user interface (UI): psychology.

From websites and mobile apps to bleeding-edge extended reality (XR) devices, psychology defines the user experience — and it’s what UX practitioners will always design for.

But like a certain force from a galaxy far, far away, psychology can be used for good or evil. On the light side we have behavioural nudges; on the dark side we have, well, dark patterns.

At least, that’s the clean and simple explanation.

In reality, things are a bit fuzzier. And we’ll get to that. But first, let’s start by looking at the difference between dark patterns and behavioural nudges — and why one is considered ethical and the other underhanded.

What are dark patterns?

Dark patterns are deceptive techniques designed to manipulate user behaviour.

Welcome to the dark arts of UI design. There are at least 20 identified dark patterns that are used to trick us into decisions we probably don’t want to make — ranging from the mildly annoying to the downright egregious.

Here are some examples of dark patterns that you’ve probably come across:

  • Disguised Ad — content that looks like independent editorial but is actually paid promotion of a product or service.
  • Fake Urgency — an artificial time limit or misleading availability that manipulates you into purchasing quicker.
  • Hard Opt-Out — it’s easy to start a subscription, but you go around in circles trying to work out how to cancel (also known as the ‘Roach Motel’).
  • Hidden Costs — you get lured in by an attractive offer, but then find extra fees and charges added that you weren’t expecting.
  • Bad Default — an option is recommended or pre-selected for you that doesn’t represent best value or your best interests (a short illustration of this follows the list).
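
To make that last pattern concrete, here’s a minimal sketch — my own invented example, not taken from the original post — of how a Bad Default combined with Hidden Costs plays out at checkout. The product names, prices and field names are all hypothetical.

```typescript
// A paid add-on arrives pre-selected in the basket, so its cost is included
// unless the user notices and opts out. All names and prices are hypothetical.

interface LineItem {
  name: string;
  price: number;      // in pounds
  preselected: boolean;
}

const basket: LineItem[] = [
  { name: "Flight ticket", price: 120, preselected: true },     // what the user actually chose
  { name: "Premium insurance", price: 18, preselected: true },  // added by default: the dark pattern
];

// The total silently includes every pre-selected item.
const total = basket
  .filter(item => item.preselected)
  .reduce((sum, item) => sum + item.price, 0);

console.log(`Total charged unless the add-on is unticked: £${total}`); // £138
```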

Deceptive patterns go by different names, but make no mistake: these techniques are the dark side of the force. They cause annoyance, frustration and anger, leaving people feeling manipulated and borderline scammed out of their money.

What are behavioural nudges?

Behavioural nudges are psychological effects used to influence user choices.

There are some interchangeable terms for nudging — some call it behavioural science or behavioural insights — but they all amount to the same goal: behaviour change.

You can find different models and playbooks explaining how to do behaviour change, but personally I like the MINDSPACE framework: it’s a clear, intuitive system that identifies nine psychological effects which can be used to influence human behaviour.

Here’s an accessible summary of each psychological effect in the MINDSPACE framework:

  • Messenger — we’re influenced by who’s communicating information (trust and authority being key factors).
  • Incentives — we’re motivated by a strong desire to avoid losses (more than acquiring gains).
  • Norms — we’re heavily influenced by what others do (perhaps following the ‘wisdom of the crowd’, or simply seeking to fit in).
  • Defaults — we ‘go with the flow’ of pre-set options (like highlighted recommendations).
  • Salience — our attention is drawn to things that seem relevant to us (such as our location).
  • Priming — our actions are often influenced by subconscious cues.
  • Affect — our actions can be shaped by emotional associations.
  • Commitments — we seek to be consistent with our public promises (and to reciprocate actions).
  • Ego — we act in ways that make us feel better about ourselves.

Examples of MINDSPACE behavioural nudges include phrases like ‘Most people apply online’ on a government website (norm) or being asked to share a health or environment pledge on social media when engaging with a campaign (commitment).

A famous behavioural nudge study found that people were most likely to join the UK Organ Donor Register after seeing a message that drew on reciprocity: “If you needed an organ transplant would you have one?” This motivates people to do right by others because they’d want others to do the same for them. (It’s interesting that the most effective nudge plays upon self-interest.)

MINDSPACE effects can also be combined for a greater impact, e.g. being told by a trusted brand (messenger) that most people (norm) who stay in your hotel room (salience) reuse their towels to reduce their impact on the environment (ego). You can experiment with this mix-and-match approach using multivariate testing.
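
As a rough idea of what that experimentation could look like in practice, here’s a minimal sketch — an assumption of one possible approach, not something described in the original post — that crosses hotel-towel message variants into a simple multivariate test. The message wording and the hash-based bucketing are purely illustrative.

```typescript
// Cross several nudge variants so each user sees exactly one combination,
// then compare conversion per combination. All wording is illustrative.

type Variant = { messenger: string; norm: string; salience: string };

const messengers = ["A message from the hotel team", "A message from an eco charity"]; // messenger
const norms = ["", "75% of our guests reuse their towels."];                           // norm
const saliences = ["", "Most guests who stayed in this room reused their towels."];    // salience

// Build every combination (2 x 2 x 2 = 8 experiment cells).
const cells: Variant[] = [];
for (const messenger of messengers) {
  for (const norm of norms) {
    for (const salience of saliences) {
      cells.push({ messenger, norm, salience });
    }
  }
}

// Deterministically bucket a user into one cell by hashing their ID,
// so the same person always sees the same combination.
function assignCell(userId: string): Variant {
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return cells[hash % cells.length];
}

console.log(assignCell("user-123")); // the combination this user would see
```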

Behavioural nudges are used by policymakers to influence individual and social behaviour change (like quitting smoking or reducing littering). But they’re also used by companies to influence consumer purchasing decisions. For example, we’re more likely to buy a discounted product because we’ve been primed to the higher ‘normal’ price which makes the discount look attractive.

Okay, so what’s the difference?

You’ll notice that the underlying mechanisms of dark patterns and behavioural nudges are not light-years apart: they both use psychology as a force to affect user behaviour, taking advantage of how we subconsciously use mental shortcuts to make choices.

But what makes behavioural nudge practitioners ethical Jedis and dark pattern disciples devious Sith Lords? After all, aren’t all of these effects sneaky mind tricks — why aren’t nudges counted as deceptions if they’re interfering with our free choices?

The main reason why dark patterns are considered unethical and nudges aren’t boils down to intent. To understand intent in this context, we need to consider Chris Kiess’s three categories of UX ethics:

  • Existential values — your personal values as a designer (which may automatically preclude you from working in a harmful industry).
  • Ill or misdirected intent — this is where user needs are placed below other priorities, like business goals.
  • Benevolent intent — the ideal state to aim for, which prioritises the user’s needs and best interests.

Dark patterns and intent

Dark patterns represent ill or misdirected intent: the design does not prioritise the user’s needs or interests in any way. Instead, it’s motivated entirely by other goals — usually company profits.

It’s important to note that influencing consumer choices doesn’t automatically make something a dark pattern. It’s absolutely possible to achieve an ethical balance where businesses are persuasive, but still prioritise a usable, helpful and satisfying experience for the customer.

It’s when design crosses the line from persuasion to cynical manipulation, tricking customers into actions that go against their interests or intentions, that a pattern is considered dark.

Behavioural nudges and intent

Behavioural nudges, on the other hand, are typically motivated by benevolent intent through the philosophy of libertarian paternalism: the idea that users should always have the autonomy to make the choices they want, but that designers should seek to influence those choices in the best interests of the user and wider society, e.g. nudging us to be healthier, safer and more environmentally sustainable.

While the motivations of a business are obviously not ‘pure’ benevolence compared to a health service trying to increase organ donors, appropriately designed ecommerce websites and apps that provide a good UX still fall under the umbrella of behavioural nudges rather than dark patterns.

So in a nutshell, you could say the ethical line between dark patterns and behavioural nudges works like this:

Dark patterns are unethical because they cynically manipulate and trick us with ill intent; behavioural nudges are ethical because they seek to influence us through benevolent intent.

The uncomfortable grey areas

One of the appealing things about Star Wars is that it’s a straightforward story of good and evil. Black and white morality is reassuring. Comforting. Unfortunately, the real world is less binary, and the galaxy of user experience is more complex than evil empires and benevolent UX wizards.

For example, if a social media site makes it difficult to delete your account, is that a Hard Opt-Out dark pattern or an Incentive behavioural nudge to help us avoid the loss of everything we’ve built up over the years on that platform — the photos, videos, connections, memories, etc?

Another example: charities. When a charity pre-selects a donation amount/frequency is that a Default behavioural nudge or a Bad Default dark pattern? What about when a charity uses highly emotive imagery to encourage donations — is that a good Affect nudge or a Guilt Tripping dark pattern?

In many cases, the intent behind design decisions isn’t clear-cut, and as a result the line between persuasion and manipulation is often blurred in UX. Is a company highlighting their ‘Most popular’ (paid) package helpfully persuading you or cynically manipulating you? Is it a nudge or a dark pattern? It’s open to interpretation.

Another consideration is that some businesses may promote positive behaviour change which appears benevolent but is actually self-serving. For example, the ‘When the fun stops, stop’ GambleAware campaign had little effect, but probably improved the public perception of an industry that profits from addiction. How do we reconcile pro-social nudges which paper over unethical practices?

Finally, we should consider that not all dark pattern practitioners are gleefully deploying Death Stars of deception across the web. Many designers with the best of intentions are restricted in their capacity to act ethically due to commercial pressures, and end up manipulating users. Real people are flawed, and are neither the heroes nor the villains of morality tales.

The illusion of neutrality

You might be wondering: Isn’t the most ethical thing to simply avoid influencing people’s choices at all?

Well, consider this hypothetical: if you design the layout of a canteen, you can decide to put the food that’s healthiest and most sustainable at the start (nudging people to choose meals that are better for them and the environment), at the end, or in some other arrangement. All of these choices are active decisions — even the ‘default’ layout. That means that, as the designer, there is no ethically neutral option: you’re always making a decision one way or the other that affects people.

Also, remember that humans are lazy, and that avoiding nudges means users have to expend time and energy — which are precious resources — on every trivial decision. This isn’t helpful. People don’t want to think about literally every choice. For a lot of things, we’re prepared to outsource some of that decision-making; we want someone else to make the choice for us.

For example, when I’m downloading some new software, I don’t want to spend ages working out what initial settings to start with. I want the developer to recommend settings to use out of the box — I can always adjust them later.
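
For instance, a developer shipping recommended settings might do something like the following minimal sketch (the setting names and values are hypothetical, not taken from any specific product): sensible defaults work out of the box, but the user can override any of them later.

```typescript
// A Default nudge done benevolently: recommended settings apply out of the
// box, while every choice remains open to the user. Names are hypothetical.

interface Settings {
  theme: "light" | "dark";
  autoUpdate: boolean;
  telemetry: boolean;
}

// Defaults chosen in the user's interest (e.g. telemetry off unless opted in).
const recommendedDefaults: Settings = {
  theme: "light",
  autoUpdate: true,
  telemetry: false,
};

// Anything the user explicitly changes takes priority over the defaults.
function resolveSettings(userChoices: Partial<Settings>): Settings {
  return { ...recommendedDefaults, ...userChoices };
}

console.log(resolveSettings({ theme: "dark" })); // user keeps full control
```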

Of course, there are scenarios where the appropriate thing to do is take a ‘ballot paper’ approach and avoid trying to influence people’s choices as far as practically possible. But the point is that in many situations staying neutral is not only impossible as a designer, it’s not actually the right thing to do. In fact, people are often so happy to be influenced that nudges can still work on us even when we’re told we’re being nudged.

Key takeaways

  • Dark patterns cynically manipulate and trick users into decisions that benefit others.
  • Behavioural nudges influence users to make better choices for themselves and wider society while still respecting user autonomy — the concept of libertarian paternalism.
  • Both dark patterns and nudges harness psychology, but the crucial difference comes down to intent — dark patterns are motivated by ill or misdirected intent, whereas nudges aim for benevolent intent (prioritising user needs).
  • To be an ethical practitioner in UX, start with a foundation of good existential values. (Have a North Star guiding your design decisions — not a Death Star.)
  • It’s important to recognise that there are grey areas — whether something is considered a nudge or dark pattern can be subjective.
  • Remember there’s often no neutral option — every design decision is an active choice to act in the user’s best interests or against them.

Sources

Brignull, H. (2023) Deceptive patterns: exposing the tricks tech companies use to control you. 1st edn. Eastbourne: Testimonium Ltd.

Cara, C. (2019) ‘Dark patterns in the media: a systematic review’, Network Intelligence Studies, 7(14), pp. 105-113. Available at: https://www.researchgate.net/publication/341105338_dark_patterns_in_the_media_a_systematic_review (Accessed: 2 April 2024).

Chamorro, L. S., Bongard-Blanchy, K. and Koenig, V. (2023) ‘Ethical tensions in UX design practice: exploring the fine line between persuasion and manipulation in online interfaces’, DIS ’23: Proceedings of the 2023 ACM Designing Interactive Systems Conference, pp. 2408-2422. Available at: https://doi.org/10.1145/3563657.3596013

Monteiro, M. (2017) A designer’s code of ethics. Available at: https://deardesignstudent.com/a-designers-code-of-ethics-f4a88aca9e95 (Accessed: 2 April 2024).

Sapio, D. (2020) 10 principles for ethical UX designs. Available at: https://uxdesign.cc/10-principles-for-ethical-ux-designs-21faf5ab243d (Accessed: 2 April 2024).

Singer, P. (2024) Ethics: origins, history, theories, & applications. Available at: https://www.britannica.com/topic/ethics-philosophy (Accessed: 2 April 2024).

Suresh, I. (2024) Ethical considerations in UX design. Available at: https://bootcamp.uxdesign.cc/ethical-considerations-in-ux-design-10ef2d089e85 (Accessed: 2 April 2024).

About the author

Andrew Tipp is a lead content designer and digital UX professional. He works in local government for Suffolk County Council, where he manages a content design team. You can follow him on Medium and connect on LinkedIn.

Read the Full Post

The above notes were curated from the full post uxplanet.org/dark-patterns-versus-behavioural-nudges-in-ux-e79633970b3f.
