
ANZSOG Case Library: Governing by Algorithm? Child Protection in Aotearoa New Zealand

27 February 2022




We might trust algorithms to select our next video on YouTube, but what happens when governments plan to use them to make decisions that can change lives?

A new case in ANZSOG’s John L. Alford Case Library looks at the political fallout that occurred in Aotearoa New Zealand when an algorithm left the confines of academia to be applied to the complicated realities of child protection.

In 2014, the new Minister for Social Development in Aotearoa New Zealand, Anne Tolley, was presented with a briefing paper from the National Children’s Director at the Ministry of Social Development (MSD) titled ‘Vulnerable Children Predictive Modelling: Design for Testing and Trialling.’

The briefing paper included a proposal for a two-year observational study ‘to assess whether children identified by the Predictive Modelling as ‘at high risk of an adverse outcome/s’ did in fact suffer that outcome’. A further comment noted that ‘the PM score would be calculated at birth for a known cohort of children and then these children’s outcomes and service contacts observed’. Though this was not the MSD’s intent, Tolley read the proposal as suggesting that child protective services be suspended as a means to test the accuracy of the Predictive Risk Model (PRM).

Consequently, Tolley called for the PRM’s implementation to stop immediately, and then released the proposal, along with her annotations, to the media. The resulting coverage sparked debate about the ethics of using the tool and whether its implementation had social licence.

This case outlines the controversy and the public sector management issues that emerged, including agencies’ capacities, change management requirements, and the timing and form of the consultative processes required both within and outside of government.

The tool’s development began with the pioneering work of Emily Putnam-Hornstein in California, who developed a predictive model of children at risk using a handful of variables. The Aotearoa New Zealand version began in 2012, when a team led by Rhema Vaithianathan, Associate Professor of Economics at the University of Auckland, published a paper on developing a predictive tool drawing upon historical data sets from multiple government databases of New Zealand children.

The use of data and actuarial methods by social workers to augment their professional judgements was well established and accepted. The PRM, however, was different. It sought to rate children across the whole cohort of families accessing social welfare benefits, with the aim of identifying those at greatest risk of abuse and neglect. This included rating children in families with no prior history of child maltreatment and no indication that harm might occur, based on 224 variables covering the demographics and histories of the caregiver and the caregiver’s partner, such as the proportion of time the caregiver had spent on an unemployment benefit in the last two years; court-issued reports for the caregiver’s other children; the criminal records of the caregiver and the caregiver’s partner; and youth justice reports for the partner.

To test its predictive capabilities, the PRM was trialled in 2012 to determine how accurately it could identify which children under age two would suffer substantiated maltreatment before they turned five. The study reported ‘fair, approaching good, strength in predicting maltreatment by age five’.
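The general shape of such a model can be illustrated with a toy sketch. Everything below is hypothetical: the real PRM drew on 224 administrative variables whose coefficients are not reproduced here, so the features and weights are invented solely to show how a logistic risk score and a rank-based accuracy measure (the kind of statistic behind a verdict like ‘fair, approaching good’) fit together.

```python
import math

# Entirely hypothetical features and coefficients, invented for
# illustration; they are not taken from the actual PRM.
WEIGHTS = {
    "months_on_benefit_last_2y": 0.08,   # assumed coefficient
    "prior_care_reports": 0.9,           # assumed coefficient
    "caregiver_criminal_record": 0.7,    # assumed coefficient
}
BIAS = -3.0  # assumed intercept

def risk_score(record):
    """Map a feature record to a probability-like score in (0, 1)."""
    z = BIAS + sum(w * record[name] for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def rank_accuracy(scores, labels):
    """AUC by pairwise comparison: how often a case that did suffer the
    outcome (label 1) scores above one that did not (label 0)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Two invented records, one low-risk and one high-risk on these toy features.
low = {"months_on_benefit_last_2y": 0, "prior_care_reports": 0,
       "caregiver_criminal_record": 0}
high = {"months_on_benefit_last_2y": 24, "prior_care_reports": 2,
        "caregiver_criminal_record": 1}
```

In an observational trial of the kind the briefing paper proposed, scores computed at birth would later be compared against observed outcomes using exactly this sort of rank statistic.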

As well as her concerns about using children as ‘lab rats’, Tolley saw the tool as a top-down bureaucratic approach that avoided the need to work with communities to identify and support families, at odds with her own preference for consultation and local solutions.

Brendan Boyle, chief executive of the MSD, also commented that a cutting-edge tool like the algorithm could not simply be imposed over the top of existing practice without being integrated into other work and bringing professionals in the area on board.

Ultimately, the political reaction sparked by Tolley’s intervention spelt the end of the algorithm, at least in Aotearoa New Zealand. It was replaced by multi-agency Social Service Teams made up of officials from the Ministries of Social Development, Education and Justice. The Government thus sought to improve the efficacy and efficiency of intervention by drawing together information from across government agencies, the very same objective that underpinned the design of the algorithm.

Tim Dare, the ethicist who prepared the original ethical review of the tool, did not accept the Minister’s view that the children would have been left unprotected. He condemned what he saw as Tolley’s use of ‘inflammatory rhetoric’, concluding that ‘science collided with politics, and politics won’.

The case explores the debate around the algorithm, the broader issues of child protection, and the unease around delegating vital decisions to technology.
