Government agencies have a long history of collecting the data they need to plan and deliver services. The inevitable result is that multiple data stores have proliferated across agencies, and even within them, each describing different attributes of the same thing. Everyone can see parts of the picture, but no one can see the whole.
Data fragmentation and the problems it causes have led many agency leaders to seek a single view of data, but achieving one has often proven elusive.
How an agency might achieve a single view of data was the topic of a recent online roundtable hosted by The Mandarin and sponsored by Capgemini and Informatica, where industry and agency leaders discussed their data management challenges and the strategies that could deliver a single view of data.
Capgemini’s Senior Director for Public Services, Andrew Slapp, opened the conversation by describing the negative outcomes individuals face when a single view of data is not available.
“We’ve all had a really bad service experience, and they are quite memorable,” Slapp said. “It often seems that organisations don’t know you – even though you have told them.”
This problem is often expressed as a ‘tell me once’ requirement. Slapp said that although it appeared from the outside to be a basic concept, it was actually a highly complex challenge to solve, requiring the ability to unlock data tied up in legacy systems and to create processes that allow data to traverse different parts of the organisation.
“If we really want to live up to that service experience, it is critical that we bring insight and data to all the conversations we have with our customers, and deliver on that ‘know-me’ principle,” Slapp said.
“Data is key for helping to optimise and tailor the service experience. That experience expectation is now ingrained into expectations, not only within the private sector, but also within government and education sectors.”
Attendees agreed that the problems of data fragmentation were commonplace, and were made worse by processes that still record data with pen and paper, introducing further opportunities for error.
According to Informatica’s Head of Master Data Management for APJ, Joseph Sullivan, the problem of data fragmentation could not be solved by the approaches agencies typically reached for, because these addressed neither the realities of agency structures nor the underlying data quality issues.
“You need to create a trusted view that reflects what the business and the users of those systems typically know, and we want to bring that together to know we have this best view of the information,” Sullivan said. “We need to be able to cleanse data, we need to be able to enrich it sometimes with third-party sources, and then we can match it together.
“And if you absolutely can’t have a mistake ever, then you have to build that tolerance into the way you bring that data together.”
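The cleanse-then-match process Sullivan describes, with a tolerance threshold for when mistakes are unacceptable, can be sketched in a few lines of Python. This is a minimal illustration, not any vendor’s implementation; the record fields and the threshold value are assumptions for the example.

```python
import difflib

def normalise(record):
    """Cleanse a record: trim whitespace, lower-case, drop empty fields."""
    return {k: v.strip().lower() for k, v in record.items() if v and v.strip()}

def match_score(a, b):
    """Score two cleansed records on the fields they share (0.0 to 1.0)."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    total = sum(difflib.SequenceMatcher(None, a[f], b[f]).ratio() for f in shared)
    return total / len(shared)

def is_match(a, b, threshold=0.9):
    """Match only above a confidence threshold -- raise the threshold
    where a wrong merge can never be tolerated."""
    return match_score(normalise(a), normalise(b)) >= threshold

# Hypothetical records of the same person, captured differently by two systems.
rec_a = {"name": "Jane  Citizen", "postcode": "2600"}
rec_b = {"name": "jane citizen", "postcode": "2600"}
print(is_match(rec_a, rec_b))  # True: the records survive cleansing and match
```

Raising `threshold` trades coverage for certainty, which is the “tolerance” Sullivan refers to: a high threshold leaves more records unmatched but makes a wrong merge far less likely.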
Capgemini’s Senior Director Insights & Data, Troy Wuttke, said agencies whose data matching programs succeeded usually delivered better experiences and were rewarded for it.
“It’s about making sure that once a record has been received, that the recipient actually has a mutual customer, that you can use that data with high confidence with traceability,” Wuttke said.
Inevitably, technology played a vital role in translating this vision into reality. However, Capgemini’s Senior Manager for Insights and Data, Anindya Jena, said the successful projects tended to be the ones that treated the single view of data as a business problem rather than a technology or tooling problem.
“But having said that, there are still some critical tools or technologies that are needed, because they are the enablers,” Jena said. “The importance of master data management (MDM) in the context of a digital transformation cannot be overemphasised. And we should consider the MDM as the enabler of new government initiatives which facilitate greater agility, cost savings, and definitely better services.”
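At the heart of the MDM tooling Jena mentions is the construction of a “golden record”: one best view assembled from the attributes held by different systems. The sketch below shows one simple survivorship rule (take the most recently updated non-empty value per attribute); the source systems and records are hypothetical, and real MDM platforms apply far richer rules.

```python
from datetime import date

# Hypothetical source records for one citizen, held by different systems.
sources = [
    {"system": "licensing", "updated": date(2021, 3, 1),
     "name": "J. Citizen", "phone": "0400 000 000", "email": None},
    {"system": "revenue", "updated": date(2023, 6, 12),
     "name": "Jane Citizen", "phone": None, "email": "jane@example.com"},
]

def golden_record(records):
    """Build a single best view: for each attribute, survive the most
    recently updated non-empty value (a simple survivorship rule)."""
    ordered = sorted(records, key=lambda r: r["updated"], reverse=True)
    fields = {k for r in records for k in r} - {"system", "updated"}
    master = {}
    for field in sorted(fields):
        for r in ordered:
            if r.get(field):  # skip None/empty values from stale systems
                master[field] = r[field]
                break
    return master

print(golden_record(sources))
```

Note that the golden record here takes the newer name and email from the revenue system, but falls back to the older licensing record for the phone number, since the newer record has none; this fallback is what makes the merged view more complete than any single source.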