The volume of data government generates, collects, exchanges and analyses is going through the roof. So what does it take to create a sound governance model that not only meets public expectations but can still glean useful context, intelligence and solid evidence to power effective policy?
On a chilly August day in Canberra in 2015, Jane King, the Australian Taxation Office’s then chief information officer (now head of design and change management), delivered a vision bombshell to a room filled with technologists, analysts and senior public servants.
After decades showcasing leaps in data matching, behavioural pattern analysis and advanced risk modelling, King calmly revealed the revenue agency was looking at how to spare everyday people with simple tax affairs the bother of filing an annual tax return – all thanks to data.
It took just seconds for a flurry of tweets and emails to snowball as some checked whether what they had heard was right. It was.
“The more we can see of your tax and superannuation business, the lower the touch will be. The less we can see, naturally the higher the touch has to be,” King said at the time, succinctly articulating how a new bargain of trust, transparency and convenience would be forged on the back of sound analytic practice and permissioned, well-governed access to data.
The next day, taxpayers across Australia quietly punched the air. Tax return agents not so much. Neither contested that the analytics economy had arrived and was here to stay.
Automation and the new rules of the road
Three years on, the pace of data-driven change — often through automation — has accelerated rapidly across the economy, government and society. What’s also come sharply into focus is a pressing need for effective governance of data: how it’s gathered, used and retained.
Nowhere has that pressure mounted more than in the public sector, where the consequences of poorly executed automation, underdeveloped data discipline or weaknesses in modelling or projection have burned on talkback radio.
The core of the challenge is the need for excellence in analytics, especially now so many transactions and interactions have migrated online or are digitally enabled.
Done well, analytics gives agencies big and small a powerful view of how to optimise operations, excel at service delivery and anticipate changes and trends in areas ranging from population health, education and the environment to revenue, economics and defence. Forecasts are bread and butter for central agencies.
Analytics quality counts. For everything.
What’s less appreciated is that it is the underlying quality of analytics applications – how they’re built, and how well their capabilities are understood and applied – that makes the real difference. It’s the difference between a straight-through transaction and a bounce to the call centre, a failed payment or renewal.
When strong mandates like the federal and state governments’ transformational push to go ‘digital first’ are factored into the equation, it becomes clear that analytic proficiency and quality are critical not just to future success but survival.
As the analytic imperative gathers pace, the really good news for agencies is that high-grade solutions are more accessible and easier to use than ever thanks to the dedicated and long-term commitment to research and product development by industry partners like SAS.
User driven development
Far from your typical enterprise software provider, SAS has since 1976 specialised in developing best-in-class products and solutions for statistics and data modelling, management and exploitation, with a commitment and clarity of purpose rarely seen among technology providers.
Initially founded to address the lack of statistical computer programs available for academic, government and industrial research, the company has remained strongly user-led in its research and product development goals rather than chasing short-term cycles or rapid expansion. Today, SAS still unwaveringly ploughs 24% of revenue back into R&D.
That commitment and philosophy have meant agencies ranging from the Australian Bureau of Statistics to the Ministry of Social Development (NZ) and Western Australia Police have relied for decades on SAS’ capability to create knowledge from data and information resources.
But what’s now changing quickly, according to SAS’ Senior Director, Practices for Asia Pacific, Deepak Ramanathan, is the shift to a digital-first footing in business and government which, combined with the onslaught of the Internet of Things, is producing volumes of information that will make analytics not only essential but embedded and mainstream in organisations.
Roboglitch: what they didn’t tell you about digital…
Ramanathan is refreshingly candid about what issues public sector leaders need to keep in mind as digitisation takes hold. It’s a much more pragmatic and no-nonsense take than just acknowledging the need for ‘transformation’ too.
As processes, interactions and transactions all digitise, he notes, careful attention needs to be paid to how automation is used – especially for automated decisions that were previously manual or executed by a person.
“Automated decisions are now coming under scrutiny for good or bad,” Ramanathan says, noting the models which can underpin these often start off in quite rigid environments … even if they are scientific.
What really counts in automated decisions is how well they are modelled and then monitored to pick up shifting parameters and behaviours among users. Change needs to be baked in.
“Behaviours and intervals tend to change. So it’s incredibly important the model is clever enough to detect [variations in vital points],” Ramanathan says, adding automated models need to be able to flag shifts and “the need to retrain or relearn.”
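The kind of monitoring Ramanathan describes can be sketched in a few lines. This is a minimal, hypothetical illustration – not a SAS product feature – that compares the recent behaviour of a model input against the baseline it was trained on and flags when the shift is large enough to warrant retraining. The class name, window size and threshold are all illustrative assumptions.

```python
from collections import deque
import math

class DriftMonitor:
    """Toy drift detector: flags when a model input's recent values
    shift away from the training-time baseline, signalling that the
    model may need to be retrained or relearned."""

    def __init__(self, baseline, window=100, z_threshold=3.0):
        # baseline: the values observed when the model was trained
        self.mean = sum(baseline) / len(baseline)
        var = sum((x - self.mean) ** 2 for x in baseline) / len(baseline)
        self.std = math.sqrt(var) or 1.0  # guard against a zero std
        self.recent = deque(maxlen=window)
        self.z_threshold = z_threshold

    def observe(self, value):
        """Record one new observation; return True once drift is detected."""
        self.recent.append(value)
        if len(self.recent) < self.recent.maxlen:
            return False  # not enough data yet to judge
        recent_mean = sum(self.recent) / len(self.recent)
        # z-score of the recent window's mean against the baseline
        z = abs(recent_mean - self.mean) / (self.std / math.sqrt(len(self.recent)))
        return z > self.z_threshold

# Inputs matching the baseline raise no flag; a sustained shift does.
monitor = DriftMonitor(baseline=[9.5, 10.5] * 100)
stable = any(monitor.observe(10.0) for _ in range(100))
drifted = any(monitor.observe(12.0) for _ in range(100))
```

In practice agencies would monitor many features and model outputs at once, and a flag would trigger review and retraining rather than an automatic change, but the principle – baked-in change detection – is the same.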
Getting the real-world picture
Examples of the need for nuance in automated decisioning aren’t hard to imagine. Traffic or transit management systems. Changes in property values that affect welfare eligibility. Or patterns suggesting fraud or abuse.
Ramanathan contends governments in particular now need greater proficiency and discipline around sound data governance principles, akin to the rigour expected in science – especially when the public sector itself is creating so much new information and often digitally exposing previously non-public data sets, or entirely new ones.
“I think everybody is going back to the drawing board, right,” he says, noting regulatory shifts like the GDPR (General Data Protection Regulation) have effects far beyond Europe because of how the regulation treats custody of EU citizens’ data by entities outside it.
Knowing if and how Australia’s data governance interoperates with mandates like the GDPR is just part of the puzzle analytics now needs to figure out and embed.
Government, governance and leadership
While the private sector may be constantly buffeted by digital and data driven disruption, Ramanathan sees a positive leadership role for the public sector as an exemplar of good data governance.
Regulation is one thing, but the intelligent design of platforms for good – and the use of analytics to create new opportunities with solid guard rails – shouldn’t be overlooked as an opportunity.
Rather, agencies and jurisdictions must consider how data is accumulated, created and stored … because there will be more than ever.
“I think what people are really looking for is around the curation, value and validity of data [across] a duration of time,” Ramanathan says, adding that the “shelf life” of a lot of data – especially data emanating from devices and applications, think IoT – has become “really low”.
Collection for collection’s sake can sometimes result in agencies being swamped with data, he cautions.
National consistency: outcomes should drive data
“It’s a question of figuring out, from a government perspective, what is the outcome a department is looking for – and then looking at what the data acquisition strategy should be,” Ramanathan says.
That sometimes means collaboration on taxonomies and standards across both agencies and jurisdictions, no mean feat for a country like Australia with six states, two territories and three tiers of government.
When you consider many of the ‘big ticket’ projects of Australian government today, many are seeking to harmonise, standardise or get a ‘single view’ of a policy area or citizen: NAPLAN for education, digital health and medical records, standard business reporting, property registries and everything else digital government is meant to touch.
This is where there could soon be swift and sweeping changes and a step shift in speed and efficiency.
Lock, stock and Blockchain
Ramanathan sees Blockchain, especially permissioned Blockchains, as a potential cut through technology because of how distributed ledgers permission, transfer and guarantee the provenance of the data they carry.
“For government there’s a massive impact when it comes to Blockchain because it really reduces the cost of data capture,” he says, noting the efficiency comes from getting pertinent data with known provenance and context.
This means linkages can not only be created between agencies, businesses and citizens but that the information on an event or exchange remains consistent.
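The provenance property Ramanathan describes can be illustrated with a toy hash chain – a deliberately simplified sketch, not any particular Blockchain platform or SAS offering. Each block records a hash of its predecessor, so any later alteration of an event breaks the chain and is immediately detectable by every participant; the agency names and event fields below are invented for illustration.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block, including its link to the
    previous block -- this linkage is what makes tampering evident."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_event(chain, event):
    """Append an event; each new block stores the hash of the block before it."""
    prev = block_hash(chain[-1]) if chain else "genesis"
    chain.append({"event": event, "prev_hash": prev})
    return chain

def verify(chain):
    """True only if every block still links correctly to its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

# Two agencies record linked events about the same exchange.
chain = []
append_event(chain, {"agency": "registry", "action": "title transfer"})
append_event(chain, {"agency": "revenue", "action": "duty assessed"})
intact = verify(chain)            # the shared record is consistent
chain[0]["event"]["action"] = "altered"
tampered = not verify(chain)      # any change breaks the linkage
```

Real permissioned ledgers add access control, consensus among participants and digital signatures on top of this linkage, but the consistency guarantee – one tamper-evident record shared by agencies, businesses and citizens – rests on the same idea.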
Australia becoming a leader, not a laggard
Ramanathan says Australia is at the forefront of global Blockchain adoption, citing the Australian Securities Exchange’s announcement to use distributed ledgers for settlements as a case in point.
Like the multiple participants and permissions needed in a market transaction, Ramanathan puts a case forward that a similar unifying technology would benefit government and citizens.
The question is whether agencies are yet prepared for the behavioural change that would come with such a shift after years of doing things a particular way. The shedding of annual tax returns strongly suggests some may be there already.
Either way, harnessing analytics within Australian government is rapidly becoming an essential agency capability, and one SAS is well prepared for when it comes.