Peter Harris: rules forcing data destruction 'akin to burning books'

By Harley Dennett

March 23, 2017

The Productivity Commission is in the final week of its inquiry into data availability and use. Commission chair Peter Harris says that in public policy, while there’s lots of opportunity there’s not really much action. And what action there is remains seriously uncoordinated.

Yesterday, Harris addressed CEDA on the progress of his inquiry. It was an edifying and illuminating speech, reflecting on the myths and contradictions he discovered both in the public sector and the wider community, and I encourage you to read it all.

Data policy “sounds pretty dull to most of us”, Harris says, but can induce high passions. The PC chair hasn’t held back his own judgements, with proclamations — to be followed by more detail when the report comes out — about how sorry a state current data policy is in the Commonwealth, especially with respect to its own data:

“Failure to use data in setting policy, failure to understand what data is telling you, and most ugly of all — the failure to be allowed to even glimpse a data set we know exists.”

Harris also raises concern about how badly policymakers have let down the “unheralded public service community” who pull together the wisdom behind essential evaluation like RoGS — the Review of Government Services — only to have their linkages destroyed immediately after they’re created.

Having now consulted with companies that want to turn public data into their own proprietary assets, and consumers who want, want, want, but are unwilling to share in return, Harris says it’s time to face up to a truth: that essential services would collapse without access to data that many think should be, or already is, private and untouchable.

Nothing in data use is risk-free, Harris says, but it’s clear he’s thinking a lot about how to incentivise the building of trust, not just formal compliance.

The final report of the PC’s data availability and use inquiry is due this month.

Below is an excerpt from yesterday’s address:

Harris: disgraceful results stem from rules out of time

In reports throughout the 2000s, the Productivity Commission lamented the restrictions on access to data that the public sector held, but which was off limits to researchers and analysts – those outside the organisation and as often as not from inside as well.

Even when we could find a data source to unlock a policy conundrum, those sources that could answer vital questions often had not been linked. At times, this was due to indifference. At times, privacy rules were asserted. And at times, laws prevented it.

Where we could, we became quite skilled at doing that linkage ourselves, or had others to help. But then we had to destroy the information generated, in support of confidentiality or privacy requirements. This is a rule, set by Commonwealth Government policy.

It is akin to burning books.

In 2013, my first year at the Commission, frustration at the breadth of lost opportunities led us to write a chapter in our Annual Report dedicated to describing the poor performance of Australia in using its administrative data: the stuff collected by way of compliance or payment of benefits across State and Commonwealth governments.

This is one area where, although no country claims to be on top of the data game, we were and remain clearly behind better practice amongst our peers.

From our own work, we know how use of public sector data can be done much better.

With the active co-operation of an unheralded public service community, we have produced for more than twenty years one of the best examples of the use of data for performance review across governments – the Review of Government Services, covering roughly 12,000 data points drawn from data sets around the nation.

The sets behind these data points vary in quality and reliability but they often share a common heritage: you can’t link them to do program evaluation.

Instead, and to their credit, the Commonwealth, State and Territory team that maintains the commitment even today – many years after COAG stopped producing agreements like this – relies on the media publishing the results.

And hopefully in the case of underperformance, nature then takes its course.

We know too that in health care, despite earnest efforts by an array of individuals, a combination of intellectual property restrictions; duplication and risk aversion by ethics committees; and legislation devised for a different purpose in a long-past era, locks some of our most valuable data up.

In our report, you can read how hospitals are required to sign up to intellectual property restrictions that prevent data transfer between wards. Or how cancer researchers use foreign data sets because our local ones are more restricted. Or how a nationally-funded research project into vaccination is nearly 7 years into a saga to be allowed access to Commonwealth and States’ data sets. It expects to be finally allowed full access in another year or so.

These are pretty disgraceful events. They are the tip of the iceberg.

Thus while it is obvious to anyone who searches via Google, has used Uber, or understands the ability of IBM’s Watson to find forgotten clinical evidence of disease response in medical publications going back decades, that big data has been driving rapid private sector adaptation and investment over the last decade or so, we in the public sector remain at best reliant on small, localised and most often personally-driven efforts at data sharing and deeper analytics.

As innovators, we in the public sector have been poorly served by our current regulation and practice impeding data analytics.

