
Nudge, sludge, and test before you judge: ANZSOG’s The Bridge webinar explores the ethics of using behavioural insights

5 November 2025

News and media


Humility, engagement with the community and a sense of public value are vital traits for public servants looking to set ethical boundaries for the use of Behavioural Insights (BI). 

A live panel, brought together as part of ANZSOG’s research translation project The Bridge, looked at how BI should be, and are being, used, how to clear away the ‘sludge’ that stops people using government services, and the possible impact of AI. 

The Bridge’s curator Maria Katsonis introduced the debate by outlining how Behavioural Insights, also known as Nudge Theory, draws on different strands of psychology, behavioural economics and social science to understand how people actually make decisions. 

“It’s been used everywhere – from healthier eating to tax payments to energy conservation. The debate has been evolving about what it means to use behavioural insights, and ‘nudges’ ethically in relation to autonomy, transparency and also fairness,” she said.  

The panel represented academia and current public sector practice, and consisted of:  

  • David Trudinger, Director, NSW Government Behavioural Insights Unit 
  • Dr Sarah Ball, Lecturer in Public Policy at the University of Queensland 
  • Dr Matthew Davies, Principal Advisor, New Zealand Justice Sector.  

Why has nudging gained such traction in public policy?

Mr Trudinger said that human behaviour was at the core of the business of government, because governments have always sought to shape and influence it through techniques including coercion, force, shame, morality and persuasion. 

“There has been at least 100 years, if not more, of governments engaging with researchers, practitioners, and knowledge creators around psychology and economics. So that continuity I think is really important because it reminds us that a conversation about behavioural insights is a conversation about public policy,” he said.  

“When people ask me about behavioural insights, I talk about it in terms of the purpose being to make behaviour visible. That gets you away from saying there’s a particular solution you have to have in mind, and helps us see a bit more clearly what questions we might ask.” 

Dr Ball said that BI was an incredibly humble approach to policy, based on saying ‘we don’t know and we should find out’. 

“One of the things that encouraged me to start working in this space was exactly that very human-centred approach to policy from the bottom up. It’s a very effective sort of tool. But like all policy instruments, how it’s used really matters,” she said. 

Mr Davies said that Aotearoa New Zealand has a slightly different model to a lot of other countries: rather than a centralised BI function in a cabinet office or equivalent, there are smaller BI functions within individual agencies that have grown organically over time. 

“A lot of behavioural science is grounded in literature that has come from ‘WEIRD’ (Western, Educated, Industrialised, Rich and Democratic) populations, who look very different to the sorts of populations that governments should be thinking about in New Zealand,” he said. 

“In the justice context, we know, for example, that Māori are massively over-represented in all of the justice statistics that we see, and so it’s really incumbent on us to think about the implications of any of our work for Māori. That’s not just about co-designing solutions, it’s about doing the step before and co-deciding with communities which behaviours we should be focusing on.”  

When does ‘nudging’ become manipulating?

While the use of nudging does not compel citizens, the structure of certain nudges verges on manipulation, leading to a long-standing debate on the ethics of nudging. 

Dr Ball outlined a typology of three kinds of nudges, first developed by Professor Robert Baldwin: 

  • ‘First Degree nudges’, such as simple warnings or reminders which respect the decision-making autonomy of the individual and enhance reflective decision-making.  
  • ‘Second Degree nudges’, which are more serious and typically build on behavioural limitations to bias a decision in the desired direction, such as a default rule with an opt-out. 
  • ‘Third Degree nudges’, which involve behavioural manipulation to an extent that other nudges do not. A cigarette pack might show a graphic display of a corpse – a message with an emotional power that aims to block consideration of all options and threatens the agent’s ability to act in accordance with their own preferences. 

She said that once policy makers had decided on their desired outcomes, they needed to think about values in how they pursued them. 

“Be very clear that you are working within pro-social norms, and that you’ve really got the buy-in from a democratic society to go ahead with these kinds of things. It doesn’t necessarily have to be about active transparency, but thinking about what is a pro-social decision,” she said. 

Mr Davies agreed that the ethics around BI were very context dependent. 

“Policies like encouraging seat belt use in cars and organ donation are effectively about saving lives, but they’re very different types of behaviours. There’s lots of different ethical guardrails that you’d want to think about in the organ donation case you wouldn’t necessarily think about in the seat belt example,” he said. 

Staying curious, testing for outcomes and getting rid of the sludge

While the use of BI often focuses on consumers or citizens, its lens can be applied to government as a way of reducing the cumulative barriers that stop people interacting with governments. 

Mr Trudinger spoke about ‘sludge’, the combined frictions that get in the way of people getting what they want from government, and the steps that could be taken to address it.  

“Human beings tend to add rather than subtract. So, in government if you’re managing a program you think about risk, and it’s human to keep adding little bits of friction because we’re worried that someone will do the wrong thing or they’re not eligible for this or that,” he said. 

“But we forget to put the person and their behaviour at the centre of it. When public servants sit down with people actually trying to use services, they can have this amazing ‘aha moment’ about the frictions all the way through, and maybe start thinking about trading some of this off.” 

He said that anyone working in BI needed to be curious, and to focus on understanding the context, people and challenges of a situation before deciding on a solution. 

“It is incredibly important to let the data and the wisdom of your front line and the voice of people be driving you towards exploring what might or might not work. 

“In the end we’re only ever as good as the value we keep adding to the public, to citizens and to public policy. 

“As a team that’s focused on experimentation, innovation and unlocking different solutions, one benchmark it’s important that we get judged on is that we fail sometimes, that we do tests and we find that things don’t work.  

“For example, we did some testing on retirement income planning. We thought that if we offered free financial advice to people, that would encourage people to take up things like voluntary super. Actually, it backfired for women. Financial advisors appeared to be people that weren’t going to be very helpful.” 

Where does AI take BI?

It wouldn’t be a panel in 2025 without some discussion of AI, a technology which has huge potential but raises major ethical questions about how far we would let AI go in manipulating human behaviour.  

Mr Davies said that public policy practitioners needed to be continually thinking about the guardrails around transparency, intent and autonomy that could be put in place for AI. 

“There was a really interesting paper that came out last month in Nature looking at the impact of delegating decisions to AI and the impact specifically on dishonesty. It found that people who delegated their work to AI were much more likely to cheat in a number of different scenarios. And I think that raises a number of concerns for people using AI, not least practitioners that are thinking about AI and what they’re doing,” he said. 

“AI doesn’t necessarily induce the same sense of moral shame or stigma that you might have if you’re talking to another human about potentially technically challenging practice.” 

Dr Ball said the growth of AI made it even more important for practitioners to question the assumptions that they were making about things like cost/benefits, efficiency and making sure that their choices were ethical and in the public interest. 

“This is where I’d like to see it go. We take that curiosity and humble approach we’ve been talking about and really internalise it in the policy space.” 

To watch the full recording of the Bridge Live Event, ‘Who decides? The role of ethics in Behavioural Insights’, visit The Bridge homepage.  

If you’re not a Bridge subscriber, you can sign up for free here to receive the fortnightly Bridge email connecting you to the latest research that’s relevant to the public sector.

If you are interested in reading further, the panel discussed two books as underpinning the psychology behind BI: What We Value: The Neuroscience of Choice and Change by Emily Falk, and How Emotions Are Made: The Secret Life of the Brain by Lisa Feldman Barrett.