Better use of evidence is central to the government’s aim of reforming the Australian Public Service (APS) to deliver better outcomes for people. A recent ANZSOG panel has explored how the APS reform agenda needs to change the culture of the APS to understand, interrogate and analyse more carefully what is known, and what is not known, in policy development, program implementation and service delivery. Public servants need to be innovators in evaluation and evidence-based policy, while navigating the conventions and responsibilities of working in government.
The panel, which was hosted in association with the APS Reform Office, brought together experts in the use of evidence for government. The lead speaker Professor David Halpern, CEO of the Behavioural Insights Team and former What Works National Advisor to the UK Government, was joined by Dr Rachel Bacon, Deputy Secretary for Public Sector Reform, Department of the Prime Minister and Cabinet, and Dr Mark Cully, First Assistant Secretary, Macroeconomic and Policy Division, Treasury. The session was chaired by Dr Subho Banerjee, Deputy CEO Research and Advisory, ANZSOG.
The use of evidence is critical to efficient, effective and equitable government decision-making. To achieve this requires an evidence ecosystem which supplies public servants with research evidence, develops the research capabilities to ask and answer the kinds of questions that are relevant to policymaking, and puts in place the mechanisms to implement findings.
With the APS Reform agenda seeking to build this ecosystem and ensure that the use of evidence is deeply embedded within APS Craft, Professor Halpern reflected on his experiences with What Works and how lessons might be applied to the Australian context.
Since 2013, the What Works network has aimed to improve the way the public sector creates, shares, and uses evidence in frontline practice to support the design and delivery of government policies, programs, and services.
Central to the What Works approach, Professor Halpern explained, is experimentation and evaluation to answer the question ‘what works’ in a given public policy area. Experimentation and evaluation generate better evidence, which is translated and distributed to government and other practitioners, and then applied in practice.
For example, in schools policy, the network was able to compare the benefits of different policy interventions with their costs, and to be open about how confident it was in that judgement. This meant it could rapidly identify some things that worked and some that did not – such as having students repeat a year.
He said that the UK’s National Audit Office analysed £400 billion of new expenditure across government to find what proportion had something akin to even the most basic evaluation.
“The answer was only eight per cent. This is a really powerful stat and a glimpse into the depths of our ignorance,” he said.
Part of the goal is to create evidence-based professionals, such as police, doctors and teachers, on the frontline of the public service, because “even if you produce the evidence, if your audience of professionals can’t tell the difference between good and bad evidence then it is all for nothing”.
This approach necessitates a public service that acknowledges what is not known and is empowered to innovate, test ideas, and ask what works and why. Drawing on an analogy of a map with blank spaces signifying areas yet to be explored, Professor Halpern called for a public service that was “bold enough to produce maps … which have got the blanks on them which we then want to fill up.”
The panellists agreed that the APS reform agenda provides the operating environment for this boldness, with ministers and public service leaders who are calling for more evaluations and are keenly interested in seeing how the system coalesces around results. “This kind of authorising environment,” Dr Bacon observed, “doesn’t come along often.”
A range of initiatives support the reform agenda and the better use of evidence, including the work of the Office of the National Data Commissioner to publish and share data.
As the APS develops an evidence ecosystem, Dr Cully observed that it will have to contend with building core evaluation and experimentation skills, establishing an evidence pipeline, navigating the different timeframes required for different kinds of analysis, and managing risk aversion and disappointment in the face of evidence.
Despite their complexity, these challenges must be addressed. The use of evidence to inform better services, policies and programs is fundamentally related to what governments exist to do. As Dr Bacon observed, “unless we can understand cause and effect, unless we have the data and evidence to hone the levers that we use in government to have a better effect with the investments that we’re making, we’re not actually doing the right thing by the communities that we serve.”
Dr Cully said that evidence was not being used properly in Australia to support good policy.
“My sense is that, contrary to what you might hope, the further up people get in the public sector the more they will back themselves on priors and on frameworks and come to a policy position very quickly and then look for evidence that supports it, rather than trying to interrogate evidence that informs a policy position.”
Professor Halpern said policy designers and public servants in general needed to make evaluation part of their work, and design their policies or programs so they could be evaluated, because “evaluation cannot always be done retrospectively without a monumental effort”.
He said that part of public sector reform was creating an environment where public servants built evidence-based innovation into their daily work.
“One of the cool things about this stuff is that it is not just what we do in Treasury about ‘is this program effective?’, but that every single public servant has in their hands the ability to do something slightly differently, and they can be these innovators and entrepreneurs we can back and support. That’s an amazingly empowering thing to see. It’s not an elite activity; we should be building a public service where we can all be innovators.”
Dr Banerjee concluded the session by inviting attendees to consider how, in their everyday work, they might build early consideration of evaluation into policy development and design, and bring more robust analysis of evidence to the advice they provide to government.
You can listen to an audio recording of the session here: