Andrew Kibblewhite: the E-I-E-I-O of people-centred policy

By Andrew Kibblewhite

February 22, 2017

As a recent World Bank report noted, the strength of standard economics is that it intentionally simplifies the motivations and actions of human beings. That is also its biggest weakness. By simplifying things, traditional economics assumes that human beings make rational decisions based on a consideration of all the costs and benefits, from a position of long-term self-interest.

New economic thinking increasingly views economies as ‘complex adaptive systems’ and draws on psychology, behavioural science and observation to better understand why individuals behave as they do. This is a significant change from the way we approached policy when I started my policy career in the late 80s. For a junior Treasury analyst, policy was pretty easy. Have framework; will travel. Assume away the noise.

There were, of course, more thoughtful people than me providing excellent advice, but looking back it is remarkable how far you could go with a pretty simplistic, reductionist approach to the world!

Innovating the policy practice

Like economists, policy advisors also need to challenge their traditional practice – including by tapping into methods for generating evidence and insights about human behaviour and experience. We need to reinvent the art of policy making. That doesn’t mean throwing out our old policy toolkit. It means upgrading it by adding new tools and methods that give us richer information about people, their lives, and how they experience and are influenced by government interventions.

As Head of the Policy Profession I want us to be world-leading in the evidence and advice that supports government decision making.

Many of you will be familiar with the policy improvement frameworks recently launched by the New Zealand Prime Minister. The Policy Project facilitated the co-production of these frameworks to help government agencies and policy practitioners improve their craft. I encourage you to check out the tools on our website. The Policy Project has collaborated with the GEN committee in the organisation of this discussion because our work aligns perfectly with the theme of their conference – evidence and insights to innovate the policy practice.

This discussion gives us — whether we are economists, policy advisors or both — a taste of the value of big data, design thinking, and behavioural insights.

I don’t expect we will all become experts. We probably don’t all need to be. But we all need to know enough about what the methods offer, what policy challenges they lend themselves to, when to use them, in what combinations and at what stage of the policy cycle. And we need to know enough to be able to commission work from the deep experts, so we know what good looks like. But most of all we need to be able to translate the results into our policy advice – to turn evidence and insights into actionable advice to government decision makers.

Implications for capability

Policy managers and leaders also need to think about what that implies for capability. What sort of capability do we need in our organisations – broad or deep? And how do we create an authorising environment – permission and culture – to enable staff to innovate and use new methods?

At the system level, leaders like Liz MacPherson [Government Statistician and chief executive of Statistics NZ] and me, in my Head of the Policy Profession role, need to think about where we need some system centres of expertise – pockets of deep expertise that others can draw on.

Avoiding ‘the new black’

What we want to avoid is everyone reinventing the wheel. And more than that, we want to avoid the fetishisation of one method over another – we don’t want to hold out any one approach as ‘the new black’. I’ve seen enough to know that there is space and a need for all of these things in our policy repertoire.

The E-I-E-I-O of Policy

The Policy Project team and I tried to think of a shortcut to describe where we need to expand the policy practice. We need better use of Evidence, Insights, and Evaluation, helping us to make better Investment decisions, and all focused on delivering better Outcomes. It is a credit to the team’s sense of humour, if nothing else, that we came up with E-I-E-I-O. So like an eclectic old farmer of policy practice I’d like to touch on each of those aspects and how they fit into people-centred policy.

Better evidence

E for Evidence. Rodney Hide recently wrote in National Business Review that the “latest fad to infest Wellington is ‘evidence-based’ policymaking”. His essential thesis, as I recall it, was that we are surrendering our responsibility to think analytically about policy problems and instead setting off in pursuit of ad hoc responses to observed problems, without understanding the underlying drivers, or bringing a coherent set of principles to the debate.

Where I agree with Rodney is that data and evidence don’t equal policy. Data doesn’t speak for itself. Data won’t tell us what to do – no matter how much we have of it. We need to be able to interrogate the data, and incorporate the evidence into policy advice – advice that gives ministers options backed up by the best available evidence and enables them to make informed choices, including the choice to do nothing.

Having said that, I am a big fan of evidence-based policy. We are increasingly able to harness the power of different data sets, particularly through the Integrated Data Infrastructure (IDI). That gives us much more information on population groups and patterns – an important policy advantage.

Statistics NZ has a key role to play as a centre of excellence in data and analytics, and is increasingly taking a leadership role in building data and analytics capability across government.

To provide the full range of options we need to be asking the right questions. We need to be conscious of our own biases, and we need to test our assumptions against observations from the real world. The World Bank report I mentioned earlier tells of how World Bank economists – bravely, in my view – conducted an experiment to test their own assumptions about people’s incentives and drivers of behaviour. They admitted publicly to a significant blind spot in the assumptions underpinning their analytical frameworks and operational models. We all need to be open to those ‘inconvenient truths’.

Better use of customer and citizen insights

I for Insights. There will also be times where the evidence is limited. We need to think more broadly about how to generate evidence where we don’t know enough — especially where we don’t know enough about the people we’re designing services for, the people and businesses we’re regulating, the people we want to influence or nudge.

In my policy youth, as a young analyst at the Treasury, there was enormous ambivalence about talking directly to people – to citizens, taxpayers, businesses – at least at my level. We were long on worrying about the risk of testing policies, even in relatively informal ways, and frankly short on understanding of the value it would bring. Policy was fundamentally a desk-based activity.

By contrast, in today’s policy world, understanding the ‘customer’ (however defined) is considered essential. But we’re only just learning how to do it in a disciplined and meaningful way. And we need to make sure we don’t mistake anecdote for insight. That would be lazy policy.

Design-thinking methodologies and tools – like journey mapping, personas, customer co-creation and rapid prototyping – offer some disciplined ways to bring the voices and needs of stakeholders into the policy equation.

At a recent roundtable I hosted on design thinking – Professor Jeanne Liedtka said that design-thinking helps to change our mindset, to challenge our assumptions, to open ourselves to possibilities and opportunities before bringing in the constraints that narrow our ‘solutions’. Someone asked if that needed to be an end-to-end design process or if you could just use bits of the design process. The answer was both.

At a Policy Project session for the Tier 2 Policy Leaders in 2015, we tried out personas as a way to ‘walk in the customer’s shoes’. It was really powerful – and quite something to see senior policy colleagues expressing themselves forcefully as a grandmother living in a camper van and struggling to access benefits, or as a budding entrepreneur struggling with government-imposed red tape!

I recently visited the Auckland co-design lab and saw the insights generated as part of the design challenges facilitated by Jane Strange and Gael Surgenor – for example on the conflicting attitudes to employment held by young job seekers and employers, an ‘attitude gap’ that acts as a barrier to matching up job supply and demand.

I can see how the design process and tools help us to think more empathetically and openly, to bring diverse people and perspectives into the policy process, testing and iterating options to arrive at mutually desirable and viable solutions. I’m keen to see how we might involve decision makers in that process as well – so that we encompass the whole value chain and all stakeholders in designing lasting solutions to human-centred problems that don’t neatly fit our agency or portfolio boundaries.

With behavioural insights we can design policies that are more likely to land well. By testing various interventions to find the ones that work best for influencing behaviour, we can nudge people to make better decisions. We’ve seen this used successfully in the tax space – the way letters to taxpayers are worded has a differential impact on late payers and their relative compliance. MSD has applied the method to finding the best way to encourage job seekers to be successful at seeking and applying for work. It turns out an encouraging text message is more motivating than one that sounds like it comes from the class prefect.

Evaluation and evidence of what works

E is also for evaluation. We need to get better at building knowledge and evidence of what works. That doesn’t mean always having a long and detailed ex-post evaluation of everything. It does mean being clear about our intervention logic and documenting success and failure along the path to implementation. It means being prepared to fail fast and learn from that failure, being agile enough to correct and change tack along the way. That requires good monitoring and an evaluation mindset – knowing what you are trying to achieve, what success would look like, and building the knowledge base. That knowledge base is essential if we want to scale our successes. Building that knowledge base is also part of our stewardship responsibilities – investing in a body of evidence so that we can advise not just the government of the day, but successive governments who may have different policy priorities. It is part of future proofing our advice.

One of the enduring shortcomings of the way we do policy in New Zealand is our tendency to launch initiatives with enthusiasm, then leave them untested. As if existence equals efficacy. The policy profession of the future needs to be much stronger at decisive and impactful evaluation.

The investment approach

Which brings me to I for Investment.

The ‘Investment approach’ uses big data and actuarial calculations to derive evidence and insights about where to target interventions, including with a longer-term picture of the potential impact. The investment approach has huge potential and we are building up experience and knowledge about how to apply an investment approach to different policy areas. Again, we need deep expertise in the system – such as the Social Investment Unit – but also enough knowledge and capability throughout the policy community to commission the analytics and be able to translate the results to policy proposals.

The Ministry of Justice will be sharing their social investment approach at the workshops. And I know that the SIU is developing some guidance to help the rest of us get to grips with the approach and application. In this ‘storming’ phase it’s important that we leverage and share capability and lessons across the system, including confronting some of the challenges like privacy, the risk that segmentation could lead to stigmatisation, and the difference between investing in future assets and reducing future liabilities.

The Treasury’s Living Standards Framework encourages us to think broadly about investment – about the capital and assets we’re investing in: economic, social, physical and human capital and their inter-dependencies. All of these dimensions are relevant for improving the living standards and wellbeing of current and future New Zealanders.

Bringing it all together – a sophisticated policy toolkit

We need to make the most of methods and approaches you’ve heard about here today. We need to ensure our advice is grounded in as much evidence as possible — both quantitative and qualitative — about real people, the lives they lead, what motivates them and how they experience government.

I’m hoping that this will motivate you to learn more about when and how we can apply a variety of tools and methods to policy challenges. And instead of thinking that one or other of them will be the silver bullet, think about how they form part of a more sophisticated policy toolkit and a multi-disciplinary approach to policy challenges. The proof will be in the pudding: unless we apply these new methods and use them to produce actionable policy advice, we will just be playing with methods. Let’s think about how these approaches can be used at each stage of the policy cycle, for example:

  • At problem definition – what story can the combination of approaches tell us?
  • At options analysis – how can the approaches help us to identify what we might do about the problem or opportunity, including some really innovative options?
  • At options testing – how can they help us test our assumptions, try out some options in small safe ways, iron out the kinks and improve them, shed light on what interventions to invest in and ensure those options can be delivered?
  • At framing the advice for decisions – how can they help us translate complex data and analysis into advice that is accessible and gives ministers clarity about what they’re signing up to – and helps them to set their sights beyond short term political cycles?
  • And for monitoring and evaluation – how can they help us inform future advice and decisions based on what works and what doesn’t?

We need to get better at generating information about users – the people who will be affected by policy – and incorporating it into policy advice, design and delivery: from big data about populations and segments of populations to deep insights about people’s lives and behaviours.

Focus on outcomes

And all the while we need to be focused on the O of my E-I-E-I-O – on Outcomes.

O is for Outcomes. We need to be constantly focused on what we are trying to achieve. Wherever we fit in the policy game – from researchers to deep analysts to advisors and influencers – we are all here to make a difference for New Zealand and New Zealanders. People are at the heart of that – which brings us back to the title of this conference – People and Policy. We need to build our toolkit for designing and delivering people-centred policy.

This article is based on the address by Andrew Kibblewhite to the Government Economic Network annual conference in December 2016.

Learn more about people-centred policy from the briefings and toolkits produced by the New Zealand Department of the Prime Minister and Cabinet’s Policy Project.
