Nudge theory

From increasing organ donation registrations to more recent examples of boosting vaccination rates through prompts and reminders, nudge theory is still regarded as one of the chief tools in a behavioural scientist’s toolkit. And with the release of the revised edition of Nudge, there is no better time to reflect on the behavioural change mechanism that quickly became the poster child for the field of behavioural science. But questions around the personalization of nudges, and the ethics that come with it, are on many people’s minds. So, what does the future of nudges look like, and will we still embrace them in years to come?

How did we get here?

For almost 15 years, nudging has been met with enthusiasm by individuals working across both the private and public sectors. Characterized by their ability to target low-hanging fruit, nudges have been used by practitioners to steer individuals towards a particular decision whilst preserving their freedom of choice. The aim is to make the “better” option more salient, which hopefully leads to healthier, smarter and more financially stable citizens.

After first finding their footing within government processes, nudges, or “nudge marketing”, quickly spread to commercial settings. Companies realized that they could craft personalized nudges by using existing customer data to show individuals products and services that they already like, or probably will. And in general, research shows that personalization is viewed positively, especially within the commercial domain; prime examples are Netflix’s “Because you watched this…” recommendations and TikTok’s “For You” feed.

But to make nudges personal, relevant and useful, companies need data to learn exactly who their customers are, how they differ from others, and by how much. Personalized nudges mean that organizations can curate specific messages for different groups within a population or target audience, something that is becoming increasingly easy to do in today’s digitalized world.
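To make that mechanic concrete, here is a minimal, purely illustrative sketch of segment-based nudge personalization: customer data feeds a coarse segmentation, each segment maps to a tailored message, and a consent check sits in front of it all. Every name in it (Customer, assign_segment, NUDGE_MESSAGES, pick_nudge) is hypothetical and not drawn from any real product or dataset; real systems rely on far richer behavioural models.

```python
# Illustrative sketch only: segment a customer from simple browsing data,
# then pick a tailored nudge message for that segment (with a consent gate).
from dataclasses import dataclass


@dataclass
class Customer:
    name: str
    recently_viewed: list[str]   # product categories the customer browsed
    opted_in_to_marketing: bool  # consent flag: no consent, no personalized nudge


# Hypothetical mapping from a coarse segment to a tailored prompt.
NUDGE_MESSAGES = {
    "fitness": "Because you browsed running shoes, here are this week's trail picks.",
    "savings": "You're close to your savings goal. A small top-up would get you there.",
    "default": "Here are our most popular items this week.",
}


def assign_segment(customer: Customer) -> str:
    """Place a customer into a coarse segment based on browsing history."""
    if "running shoes" in customer.recently_viewed:
        return "fitness"
    if "savings account" in customer.recently_viewed:
        return "savings"
    return "default"


def pick_nudge(customer: Customer) -> str | None:
    """Return a personalized nudge only if the customer consented to data use."""
    if not customer.opted_in_to_marketing:
        return None  # fall back to no personalized nudge rather than ignore consent
    return NUDGE_MESSAGES[assign_segment(customer)]


if __name__ == "__main__":
    alice = Customer("Alice", ["running shoes", "water bottles"], opted_in_to_marketing=True)
    print(pick_nudge(alice))  # -> the "fitness" message
```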

How personal is too personal?

There is a fine line between a helpful nudge and one that feels like an invasion of privacy. Take, for example, when Target used customer data to send coupons to the individuals it judged most likely to be pregnant. Or worse, think back to the 2018 Facebook-Cambridge Analytica data scandal, where personalized political advertisements were crafted from intimate user data. Examples like these have sparked a growing discussion about the ethical application of nudges and about data sharing in general.

It’s worth taking a step back to consider what is really happening in situations like these. Is nudging the problem here? Or is it really about data protection laws and how we engage with them? Or perhaps a mix of both?

Ethical nudging – let’s draw a line in the sand

To begin with, it’s important to keep in mind that individuals will always, in some shape or form, be nudged: neutral environments don’t exist, so one choice is always going to be more salient than others. The question isn’t “to nudge or not to nudge?” but rather how to present choices in a way that supports the core principle of nudge theory, i.e. making it easier for individuals to select the option that is in their best interest. Clearly, nudges can be placed along an ethical spectrum, ranging from “dark nudges” and “sludges” to nudges in their original sense: those that make it easier for individuals to engage in behaviours that align with their values. Currently, there are only best practice recommendations on how to nudge ethically. What we really need is for regulators to introduce clear guidelines and frameworks detailing how to nudge individuals ethically (e.g. nudges remaining transparent, easy to opt out of, and so on).

Strip out the friction from privacy policies

Let’s help individuals actually understand where their data goes and how it is used.

In turn, personalized nudges need to be fed data that customers are comfortable sharing. It comes as no surprise that individuals are more concerned about their data privacy than ever before. In fact, according to GWI’s British audience data, only 22% of British adults feel in control of their personal data when engaging in online activities, and 40% worry about how companies use their data. Ultimately, individuals don’t know where their data goes or how organizations use it.

But we’re all presented with privacy policies that we accept before using social media and accessing websites, so surely we’ve given consent? Not quite. It’s well known that only a minority of individuals actually read online privacy policies. We don’t need a study to prove that privacy policies are difficult to read, but there are studies that do show this. One in particular found that it would take an individual 154 hours per year to skim-read the privacy policies of every new site they visited. What’s more, it would take 244 hours per year to read them properly: that’s roughly 20 hours per month, or almost 5 hours per week, spent just reading privacy policies!

So where do we go from here? For starters, it’s clear that we need to make it easier for individuals to understand privacy policies. Companies also need to do a better job of explaining what data they collect and why they do so. Ultimately, this means that individuals can give informed and voluntary consent to their data being shared or processed by companies.

Looking forward

Introducing clear guidelines for ethical nudging and data sharing would reduce the ambiguity around both. The knock-on effect is twofold: increased trust and more positive sentiment towards brands, which in turn make for a sustainable business. Ultimately, this transparency is good for businesses and, more importantly, for customers too.

Author: Lauren Stephenson - Behavioural Analyst

Lauren studied Psychology at the University of Cape Town and completed a master’s in Behavioural Economics at the University of Bath. At behave, Lauren uses leading behavioural technologies to understand consumer attitudes, motivations and behaviours, closing the gap between what individuals say they do and what they actually do.