
Who decides what works? The use of behavioural insights in Robodebt

29 January 2025

Research


Behavioural insights and the use of nudges have become part and parcel of the policy design toolbox. A key issue is whether nudges could be misused to manipulate citizens. An article in the Australian Journal of Public Administration explores the ethical concerns surrounding behavioural policy design using the example of Robodebt. The paper also discusses the critical steps the Australian public sector can take to ensure future accountability and transparency in policy design and the use of nudges.

Robodebt and behavioural insights

Robodebt used Australian Taxation Office (ATO) data to identify discrepancies between the income a person declared when receiving social services and the income they declared to the ATO. The process was automated, hence the name. It was a deeply flawed program. Two main concerns were the shift of the burden of proof from the Department of Human Services onto the person with the debt, and the use of income averaging. The Robodebt program sent out 800,000 automatically generated debt letters in its first year.

Robodebt was the subject of multiple Senate Inquiries, court cases, and eventually a Royal Commission. One area that received attention was the use of behavioural insights in the design of letters to those with debt. 

The right way to do behavioural public policy?  

The primary defence for the use of behavioural public policy (BPP) is that policymakers should use the ‘publicity principle’ as a guiding premise for its design. This principle bars government from selecting a policy that it would not be able or willing to defend publicly. Transparency is a common feature of ethical frameworks that explore the use of BPP in government policymaking.

Unfortunately, transparency is not only challenging to achieve in practice; a lack of transparency may also be an intrinsic feature of the instruments used in BPP. The article distinguishes three degrees of nudging.

1. First-degree interventions are those that do not interfere with autonomous decision-making and may even increase reflection, such as simplifying information and providing reminders.

2. Second-degree interventions are those that take advantage of a subject’s inattention and inertia, such as using default rules to achieve policy goals.

3. Third-degree interventions step even further away from conscious decision-making and involve the use of framing devices to highlight specific aspects of the intervention.

In the case of Robodebt, both the decision to remove the phone number from the letters, and the decision to frame the ATO data as authoritative, operate as third-degree interventions. Even on reflection, a user may not notice that they have been influenced. This was a failure of transparency. 

Preventing harm in behavioural policy

There are multiple ways supporters of BPP claim a behavioural intervention can be determined as suitable. The first, and one of the most popular in Australia, is the use of randomised controlled trials. Trials are purported to minimise the risk of a harmful policy being rolled out more widely and allow for less effective or potentially harmful policy designs to be rejected.  

While there was a pilot phase in Robodebt, there was no rigorous trial, nor even an evaluation of the pilot program, before rollout. The design of the letters at later stages of the program was tested for comprehension. However, when comprehension was found to be low, this did not appear to result in any challenge to the policy design.

In some cases, BPP advocates will recommend the use of qualitative research. Unfortunately, in the case of Robodebt, no qualitative research was done beyond some basic user testing. 

Implications for policymaking

The conflict between responsibility and responsiveness has become a major policymaking challenge. How can the public be confident that nudges will be designed ‘for good’? And who is responsible for this issue? What is the role of the public sector?  

The case of Robodebt is an example of instrumental rationality prevailing over a debate about values. A decision was made to implement a policy that risked harm. Based on the Royal Commission’s extensive hearings, the actors involved appear to have focused predominantly on their responsibility for delivering the policy outcome of budget repair and welfare compliance, as promoted by the Government of the day. 

The Royal Commission laid bare the fact that policy, behavioural or otherwise, can cause great harm if the public sector focuses on the ends rather than the means. The article argues that the public sector has sought safety by focusing on a narrow band of evidence about ‘what works’ rather than providing unwelcome advice, particularly when this advice might point to ethical or moral issues with the preferred policy approach. Robodebt is a striking example of this.

The bottom line

None of the proposed tools offered by either practitioners or academic research on the ethical use of BPP can prevent the potential, significant harm caused by its unethical use. Focusing on the question of ‘what works’ can lead policymakers astray if the underlying moral positions are not explicit. Ethical and moral issues can be a part of evidence-based and impartial advice. 

BPP can be used to facilitate better policy design and implementation. The issue is that it can also produce negative and harmful effects on the most vulnerable while operating covertly—particularly when questions about social desirability, acceptability, equity, and human rights are not open for discussion. The focus on evidence-based policy which is built on value neutrality can minimise or even completely obscure these questions. 

Want to read more?

Who decides what works? Ethical considerations arising from the Australian Government’s use of behavioural insights and Robodebt – Sarah Ball, Australian Journal of Public Administration, November 2024

Each fortnight The Bridge summarises a piece of academic research relevant to public sector managers.  

Sign up to The Bridge 
