In the United States in 2006, two former Google employees formed a company, Weatherbill, to provide weather insurance to ski resorts, large event venues and farmers. In 2010, Weatherbill decided to focus exclusively on agriculture, launching the Total Weather Insurance Product.
The move prompted Greentech Media to ask: “Is it the Riskiest Startup Ever?” It explained:
“The company’s algorithms continually examine streams of incoming weather data, past history, local conditions and other factors to devise a personalised insurance policy for a farmer.”
What the article didn’t say was that Weatherbill, by then renamed The Climate Corporation, was basing its premiums on a huge trove of data supplied by the US government: 60 years of crop yield data, 14 terabytes of soil data and information from a government network of 159 weather radar stations. The result was that the company did very well: it was bought by global behemoth Monsanto for $1.49 billion in 2013.
Such can be the value of government data. In New Zealand, the Ministry for Social Development generated a $954 million benefit by linking different datasets and analysing these to discover important, and previously hidden, patterns and linkages.
In 2010 the agency was spending $21 billion a year providing services to more than a million New Zealanders, 13% of the working population was on welfare, and many had been so for more than a decade. By matching and analysing data across several government agencies, the ministry discovered that more than 70% of welfare expenditure was going to people who had entered the welfare system before they turned 20.
It was able to predict the probability of these people becoming benefit recipients as adults and offer targeted services to reduce their long-term benefit dependency. As a result employment rose 9.3% in 2013, benefit payments fell to a five-year low and the department expects to save $954 million over four years.
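The kind of cross-agency matching described above can be sketched in a few lines. The example below is purely illustrative — the client IDs, payment figures and field names are invented for this sketch, not the ministry's actual data — but it shows the basic mechanic: link records from two agencies on a shared identifier, then split expenditure by age of entry into the welfare system.

```python
from collections import defaultdict

# Hypothetical data from two agencies, linked on a shared client ID.
# All figures below are invented for illustration.
payments = [            # welfare agency: (client_id, annual payment in $)
    ("A1", 18_000), ("A2", 12_000), ("A3", 9_000), ("A4", 15_000),
]
entry_age = {           # second agency: client_id -> age at first benefit
    "A1": 17, "A2": 19, "A3": 34, "A4": 18,
}

# Link the datasets and split expenditure by age of entry into the system.
spend = defaultdict(int)
for client, amount in payments:
    bucket = "entered_before_20" if entry_age[client] < 20 else "entered_20_plus"
    spend[bucket] += amount

total = sum(spend.values())
share_young = spend["entered_before_20"] / total
print(f"Share of expenditure on early entrants: {share_young:.0%}")
```

In practice this linkage step is the hard part — identifiers rarely match cleanly across agencies — but once records are joined, the analysis itself can be as simple as the grouping shown here.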
Both these examples of the power of data analytics in government were cited in the December 2015 report Public Sector Data Management. It had been initiated in April 2015 by the secretary of the Department of Prime Minister and Cabinet to examine how public sector data could be better used to achieve efficiencies for government, enable better service delivery and inform policy development, while innovating and creating new products and business models.
The ‘disruptive monetisation’ of data
According to Marek Rucinski, managing director of Accenture Digital – Analytics, the New Zealand case of combining data from disparate sources to create value epitomises the highest level of data monetisation: disruptive monetisation. But bringing together data from different sources in government can be challenging.
“What we saw from that paper is that internal barriers to sharing data between agencies are common. There was no strong culture among agencies to publish and share data, and there is not enough incentive to move agencies up the learning curve,” he said.
“These for me are the clear pressure points that require additional thinking in regard to where is the value and how the use of data and insights can change the game for the agencies and the way they relate to the industries they operate in and potentially how they create value for the citizen.”
Rucinski identifies four levels of data monetisation: internal monetisation, allied monetisation, demand-driven monetisation and disruptive monetisation.
Internal monetisation he defines as “the creation of value from data within the four walls of the agency or the organisation”. Allied monetisation is “where you create relationships with partners that are in the same value chain, that potentially can derive value from the data or that can provide data to the agency that can add value”.
Demand-driven monetisation takes this a stage further: government invites external parties to bring use cases for government data, then enters into a dialogue to create a mutually beneficial alliance.
What Australia can learn from others
He says there are clear lessons for Australian governments in overseas examples where a dialogue is created with private sector organisations “to allow purposeful innovation with clear value”. And, he added: “It can be done in very respectful manner with regards to the privacy and sensitivity of data.”
The Australian Tax Office already makes extensive use of data analytics and is planning to significantly increase this. The ATO has power to obtain data from insurance agencies; this helps it to more accurately estimate wealth and target its investigations by identifying taxpayers’ valuable assets.
It can also access the details of every international funds transfer involving Australians. In 2014 it staged Project Do It, giving Australians the opportunity to declare undisclosed or incorrectly reported offshore financial activities.
In 2013 the ATO was chosen to head a new Data Analytics Centre of Excellence, a move designed to improve the sharing of information between government agencies. It has expressed great confidence in the power of analytics to catch tax cheats, saying in its 2014-2018 corporate plan:
“We will better use technology and data analytics to identify and deal with those seeking not to comply. For most non-compliant taxpayers, it will be a matter of ‘when’ we catch up with them, not ‘if’.”
The Department of Immigration and Border Protection says it will use real-time data fusion and analytics to enable it to “concentrate our efforts and judgement where we can make the most difference”.