Big data should be used in context as a complement to other forms of evidence, rather than a replacement, New Zealand’s specialist policy evidence unit wants to remind public servants.
Despite improvements in data collection and analysis in recent years — with the investment approach and the Better Public Services results programme key drivers of this on the eastern side of the Tasman — simply “sucking data in and pushing it out” is not enough to inform good decision making.
“Data won’t tell you what to do. The challenge is knowing the right time to use the right data, and having the analytical skills and professional judgement to make sense of it,” public servants were told at a roundtable on the use of evidence in policy, held last month by the NZ Department of the Prime Minister and Cabinet’s Social Policy Evaluation and Research Unit (Superu) and the Victoria University of Wellington School of Government.
Although having data can confer a sense of certainty about where things sit, it should be used in tandem with community engagement. “By itself, big data can sometimes be disempowering for those on the ground, if it is not seen to reflect or resonate with the experience of local groups or agencies,” notes the outcome document from the discussion between senior bureaucrats and experts, which included Dr Sarah Morton and NZ’s top public servant, Andrew Kibblewhite.
Big data isn’t necessarily about centralised decision making, either.
“It’s also about getting data out to decision makers on the ground and about front line operational staff using ICT tools (e.g. smartphones/tablets) to use data modelling to support their decisions on the ground. This also saves on paperwork and administrative costs by making data more accessible and useable. They can also help generate data,” says the report.
The roundtable is part of a broader push to improve the use of evidence in policy making.
Superu has already produced an evidence rating scale against which evidence for the effectiveness of social sector policies, programmes, services or practices can be assessed, and maintains a government outcomes catalogue tool to keep track of priority programmes. There is also The Hub, a one-stop-shop for New Zealand government social science research related to education, health and wellbeing, crime and justice, families, children and young people.
“The Public Service needs to invest in the capability to look beyond current government priorities. We need to help ministers see the value of stewardship and looking beyond ‘in the moment’ decisions to think about the longer-term,” said Kibblewhite.
“For this we need to keep a learning mind-set, moving beyond enthusiastically launching new initiatives to also critiquing the results of past initiatives and thoughtfully debating these with ministers, advisors and the public.”
Evidence to support decision makers
A shift in mindset is needed from just synthesising, spreading and using evidence to ensuring the best evidence is used for each decision, argued Morton.
To enable the best evidence to be used at the right time, staff need to ensure that decision makers:
- have more time to reflect on evidence, value it, and demand it;
- understand current practice (evaluation and research);
- understand community and citizen needs.
This requires practitioners to:
- match synthesis to the specific decisions needed (and the reasons why);
- map the evidence landscape and identify the gaps (e.g. in which domains do we know a lot, or a little?);
- ensure the landscape includes terrain that officials and citizens are interested in;
- consider what each deliberative decision making process should look like;
- link to relevant research entities and programmes.
Building the knowledge base
To be able to use evidence effectively, policy makers need to understand the different sources of evidence and how they come together.
“What people consider to be ‘knowledge’ depends on their relationships and contexts. Politicians have good access to stories from constituents, which form part of their knowledge base,” says the report.
“Officials also hear from citizens, and have access to administrative data, evidence and insights. The challenge is to synthesise and make sense of the diverse sources of information, and ensure that it reflects and resonates with those diverse perspectives.”
Importantly, government should engage both those who are satisfied and those who are dissatisfied, as well as communities “far from the norm”.
New Zealand DPMC’s policy quality framework emphasises the need for advice to include evidence and insights from diverse perspectives, and its Cabinet Manual has just been revised to strengthen expectations for officials to consult diverse sources in policy development.
But if the knowledge base just isn’t there, research or testing may need to be done — again with input from different people.
“We know innovation is not well embedded so just start doing it, while building capability. Both those commissioning for outcomes and being commissioned need to collectively ask: What is the problem? What works or might work? How might we know?”
Building the authorising environment
Trying new things of course means they might not work out: this requires a certain amount of permission to fail, as well as an ability to share lessons learned when that happens.
A hostile media environment, excessive risk aversion and inflexible performance management systems can all make this difficult.
Building in and funding evaluation of new ventures as a normal part of business ensures evidence is built up as a matter of course.
Government staff should be thoughtful about transparency around innovation in the public domain, and seek to gain small permissions while preparing the ground for an experimental approach, the document recommends. “Think about different types of risks and how they can be managed.”
Building human and relationship capital
Knowing which information to use and where to find it means investing in skills and understanding — and this doesn’t just mean staff.
The collection and analysis of evidence about policy is ongoing, not a one-off event. This requires strong relationships with a range of actors.
“Relationships can help spread insights to different contexts. To scale up or spread what works, in the end, we can only scale up processes, and these need to be adapted to specific contexts through skilful engagement,” the paper notes.
“It is powerful to build ‘evidence generating capability’ with those who are normally ‘subjects’ or users of research or evaluation.”