Across Australia, we’re seeing significant digital transformation projects underway… and none more under the microscope than the activities of the Digital Transformation Agency.
But there’s no doubt that its newly appointed chief executive, Gavin Slater, has a massive undertaking ahead of him.
Keeping pace with innovation, delivering a 360-degree customer view and ensuring data scientists can do a meaningful job within a reasonable timeframe should be the key determinants of any transformation project.
Consider though, that most agencies are still running off 30-year-old systems that aren’t set up for big data manoeuvring. This leaves many in a predicament: driven to take advantage of new innovations but with the impediment of an existing infrastructure that simply prohibits progress.
Underlying the issue is the overwhelming — almost suffocating — amount of historical, uncategorised and unruly data scattered across the organisation in data silos.
Data can be the fuel that powers your innovation-driven growth and digital transformation journey. It can improve an agency’s agility, speed and performance, and deliver cost savings.
But as we’ve seen, harnessing data and bringing different data silos together is also the biggest hurdle in large-scale digital transformation projects, especially in the public sector.
We know a thing or two about this. Over 50% of our revenue comes from picking up and completing failed projects that began on traditional relational database technology — including the massive U.S. healthcare.gov or ‘Obamacare’ website.
From these experiences, we have gained deep insight into the key drawbacks of existing data management tools, and how you can get around them.
First, how do you know if your data is working against you? Here are the major warning signs:
1. Your data is hard to search and lacks good governance
The diverse data — structured and unstructured — you need to support “next gen” consumer engagement, administrative, financial and clinical solutions is highly fragmented across various systems. You are unsure about the data lineage and who has access to the data.
Consider that 80% of today’s data is unstructured or semi-structured — for example PDFs, online data, audio files and video clips. How can any agency deliver robust services to citizens without the ability to recall this information through a simple search?
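To make the idea concrete, here is a minimal sketch of keyword recall across mixed documents using a toy inverted index. The document names and contents are hypothetical, and a real system would first extract text from PDFs, audio transcripts and the like before indexing; this only illustrates the principle of simple search over previously siloed content.

```python
# Toy inverted index: map each word to the set of documents containing it,
# then answer a query by intersecting the sets for each query word.
import re
from collections import defaultdict

def build_index(docs):
    """Map each lower-cased word to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in re.findall(r"[a-z0-9]+", text.lower()):
            index[word].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every word in the query."""
    words = re.findall(r"[a-z0-9]+", query.lower())
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())
    return results

# Hypothetical documents from three different formats/silos.
docs = {
    "form-1023.pdf": "citizen application for housing assistance",
    "call-0412.txt": "transcript: citizen asked about housing payment dates",
    "notice-77.html": "public notice on road maintenance schedule",
}
index = build_index(docs)
print(search(index, "citizen housing"))  # matches both housing documents
```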
Meanwhile we see huge agencies that want to leverage these data assets. They gather the data and build massive, expensive data lakes — but then can’t use them because the data isn’t governable.
Data governance is more important now than ever because agencies are simultaneously moving to the cloud. Understandably, they want the elasticity and flexibility of the cloud to better leverage their data and to realise potential cost savings. But to do that, they need to trust that the data can be safely shared within the organisation. If the data isn’t properly governed, departments won’t be able to share it, and they’ll lose one of the key benefits of the cloud.
2. It takes too long to get solutions to market
Uncertain, imperfect and evolving application requirements lead to heavy re-modelling work to ensure that new or changed data hasn’t broken your data pipeline upstream or downstream. There are many “point-to-point” interfaces to create and maintain, and your IT team struggles to keep up.
If this all sounds familiar, that’s because it is one of the most common complaints about large-scale digital transformation projects. Marrying the disparate systems requires painstaking work and can lead to a domino effect, where a shift here requires ten shifts elsewhere.
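The arithmetic behind the domino effect is worth spelling out. With N systems wired point-to-point, the number of one-way interfaces grows roughly quadratically, while a hub needs only one adapter per system. The system counts below are illustrative:

```python
# Why a hub beats point-to-point integration: with N systems,
# point-to-point needs up to N*(N-1) one-way interfaces, while a
# hub architecture needs only one adapter per system.
def point_to_point_interfaces(n):
    return n * (n - 1)  # every system sends to every other system

def hub_interfaces(n):
    return n  # each system has a single adapter to the hub

for n in (3, 5, 10):
    print(n, point_to_point_interfaces(n), hub_interfaces(n))
# At 10 systems: 90 point-to-point interfaces vs 10 hub adapters.
```

Each new system added to a point-to-point landscape multiplies the maintenance burden; added to a hub, it costs one adapter.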
3. Compliance activities strain your resources
Aggregating data for reporting is time-consuming and labour intensive. There is little to no proactive monitoring or alerting of data tied to key performance metrics. Your stakeholders frequently don’t “trust” the data and discussions are more about data accuracy than improving operations and deriving insights.
This goes back to the importance of good data governance. Without it, agencies are scared to share data, because they might run afoul of compliance requirements or unwittingly expose internal information. They are hesitant to share data with data scientists for analytical purposes, because they’re not sure all the personally identifiable information has been scrubbed or redacted. Moreover, if they don’t know the lineage of the data, they can’t be sure of the validity of their analytical results.
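As a simple illustration of the scrubbing step, here is a minimal rule-based redaction sketch. The two patterns (a generic 10-digit reference number and an email address) are simplified assumptions for illustration only; production redaction needs a vetted, audited rule set and human review.

```python
# Minimal rule-based PII redaction before data is shared with analysts.
# Patterns are deliberately simplified and hypothetical.
import re

PII_PATTERNS = [
    (re.compile(r"\b\d{10}\b"), "[REDACTED-ID]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[REDACTED-EMAIL]"),
]

def redact(text):
    """Replace anything matching a known PII pattern with a placeholder."""
    for pattern, replacement in PII_PATTERNS:
        text = pattern.sub(replacement, text)
    return text

record = "Contact jane.doe@example.gov, reference number 3201456789."
print(redact(record))
# -> Contact [REDACTED-EMAIL], reference number [REDACTED-ID].
```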
The key to all these issues lies in making your data work for you.
Successful digital transformation requires an operational data hub that acts as a virtual filing cabinet to provide a single, unified 360-degree view of all data. This type of hub removes the complex data wrangling and integration challenges caused by traditional relational technology and makes it manageable to discover new relationships, patterns and trends.
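The “virtual filing cabinet” idea can be sketched in a few lines: records from separate silos are loaded as-is and grouped by a shared identifier to give a single unified view, with each field labelled by its source so lineage is preserved. The silo names, fields and citizen id below are hypothetical, and a real hub would do far more (security, indexing, harmonisation):

```python
# Sketch: merge as-is records from multiple silos into one view per
# citizen, prefixing each field with its source silo for provenance.
from collections import defaultdict

def build_citizen_view(*silos):
    """Each silo is a (name, records) pair; records share a citizen_id key."""
    view = defaultdict(dict)
    for silo_name, records in silos:
        for record in records:
            citizen = view[record["citizen_id"]]
            for field, value in record.items():
                if field != "citizen_id":
                    citizen[f"{silo_name}.{field}"] = value
    return dict(view)

# Hypothetical silos with one shared citizen.
payments = ("payments", [{"citizen_id": "C1", "last_payment": "2017-06-01"}])
cases = ("cases", [{"citizen_id": "C1", "open_cases": 2}])
print(build_citizen_view(payments, cases)["C1"])
```

Because records are ingested as-is and keyed on a shared identifier, adding a new silo doesn’t require re-modelling the existing ones — which is the hub’s answer to the re-work problem described earlier.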
So my advice to Gavin is: assess where your data might be working against you, help your teams reframe how they look at data, and ensure you have flexible technology to support them on this challenging journey. This will not only save you from transformation headaches but will put you in a better position to evolve with the changing needs of government services.
Tim Macdermid has over 20 years of experience in IT and is currently the Area Vice President, APAC at MarkLogic.