The elephant at the IT industry lunch: DTA chief silent on privacy, security


In his first speech, the new DTA chief Gavin Slater talked up the value of digital technologies that cause the biggest privacy fears — digital identity verification, facial recognition and electronic health records — but ignored the p-word entirely.

There was one rather large elephant in the room when Gavin Slater gave his first speech as CEO of the Digital Transformation Agency to an IT industry audience last week: privacy.

Slater did not touch the subject, even when talking up the value of the government digital initiatives that generate the biggest privacy fears: the use of facial recognition, electronic health records, and the DTA’s nascent national digital identity verification system, GovPass.

Slater lauded e-health and the use of facial recognition to speed up immigration processing at airports, along with myGov and myTax, as examples showing the federal government had already created significant value in recent years through digital transformation. “I don’t think we do a good enough job in government of talking about the progress that is actually being made,” he added.

Government agencies could also do a better job of showing — not just telling — the public that they can be trusted to keep sensitive personal information safe. They could be more open about how their systems work and what the risks are and be clearer about exactly what information is held where, how it is kept safe, who it is shared with, what it will be used for and what it won’t be used for.

On the same day as Slater’s speech, the Council of Australian Governments approved the Australian Digital Health Agency’s new digital health strategy, which describes the rebooted plan to create e-health records for every citizen, switching them off only for those who opt out. The original plan, which created records only for those who opted in, was flipped around because it resulted in very low take-up.

Slater said the myHealth record was “a really good and important initiative” that would improve healthcare outcomes, and noted that 10,000 health practitioners were already “signed up, actively using” digital records. The new DTA chief had heard that “around 2-3%” of accidental deaths through misdiagnosis or inappropriate treatment were the result of poor record keeping.

“So, a really outcome-orientated initiative,” he said. “At the moment there’s about five million people who are signed up and are registered to use myHealth, and every 38 seconds someone signs up. So, a great initiative I think, in terms of government demonstrating value to us as citizens, and the economy more generally.”

(The “every 38 seconds” claim comes from the former health minister, who was referring to a four-week period around July last year.)

Slater also enthused that “there will be a time when you won’t have to hand your passport over to go through immigration” as he praised the Department of Immigration and Border Protection’s investment in facial recognition technology. And of course he sees the DTA’s work on a digital identity framework as its most valuable project.

He’s right that all three offer real benefits, but Slater is probably also aware that facial recognition, digital identity and e-health are all quite scary to a lot of people, so it was noticeable that he didn’t mention the importance of robust privacy and security measures at all when speaking primarily to an IT industry audience.

Asked later about the Productivity Commission’s recommendations on data availability and use — stronger consumer rights at the same time as policies to encourage greater usage and sharing of big data for the economic benefits — he confessed he had not read the report but offered his enthusiastic view.

Slater again expressed this entirely in terms of the exciting potential for “creating value” with no reference to privacy risks, consumer rights or the fact that clever new systems based on data analytics are often a bit creepy. He said the way tax forms are now pre-filled with data was a good example of data sharing and added that tax commissioner Chris Jordan planned to enhance compliance operations using digital profiles of taxpayers.

“How [Chris Jordan] wants to use data analytics, is you build a profile on individuals and businesses,” he said, offering an example where the agency goes easy on someone with a squeaky-clean record.

“We all have a bad day, and if someone has religiously submitted their tax returns and paid their fair due and done the right thing [in the past] and they miss one, why go after them with a sledgehammer?”

On the flipside, some people and organisations are marked for greater scrutiny. This is a fine example of using data to target resources at areas of high risk, but Slater did not even briefly touch on the need to show people how it all works.

Harsh critics can become powerful allies

People don’t simply trust their government as a matter of course. That trust needs to be earned in each instance, and winning over some of the loudest and most persistent critics is one of the best ways to be sure of success.

Independent endorsement from this growing section of the commentariat is far more valuable than any public consultation or privacy impact assessment. These stakeholders are the key influencers who need to be convinced, not just with words and commitments, but with hard technical specifications.

The DTA has written generally about the importance of both privacy and security, and in May staffer Jacob Suidgeest blogged about how GovPass incorporates “privacy by design”. He explains the proposal for a “double-blind” system well, but his message has not cut through to the right people in the public arena.

The former head of the project, Rachel Dixon, understood the uphill battle she faced in selling the project to ordinary consumers and took great pains to explain the considerable privacy features of her prototype at a conference last year. Around the same time, the Australian Privacy Foundation penned a long and critical open letter questioning its digital identity plans.

A high-level source told The Mandarin that the DTA was ready and willing to respond to all of these concerns, but the government refused to allow it to do so publicly, and instead made the odd decision to abruptly postpone the launch of the DTA’s first digital identity framework prototype at the last minute.

The story goes that the APF could well have been convinced to soften its position considerably, or even to support the project, had the government responded publicly; understandably, it did not accept private assurances.

The ADHA has also made serious efforts to address the issue but still has a lot of work to do if it wants to convince its harshest critics.

It notes the importance of privacy several times in the digital health strategy, lists “ensuring privacy and security” as one of seven core principles, and even promises in the document’s title to make myHealth records “safe, seamless and secure”.

Most importantly, the ADHA has established its own e-health Cyber Security Centre to “ensure Australian healthcare is at the cutting edge of international data security”, but there will be no shortage of people lining up to test this bold claim.

Quite a few independent observers with relevant expertise will remain very sceptical of myHealth records, and the only way for the ADHA to win them over is with concrete actions and detailed information. Its chief executive probably learned valuable lessons from the failure of care.data, a similar opt-out digital health system he led in the UK.

The same goes for the GovPass framework and any use of biometric systems such as facial recognition. If the government wants public support for these and other digital transformation projects, its best bet is for people like Gavin Slater to engage directly with their strongest critics from the worlds of consumer advocacy and cyber security.

It may not have been deliberate, but by focusing on value creation without mentioning consumer rights and privacy, Slater’s speech reflected the public sector’s key problem: there is a lot of evangelism about the public value of open data, analytics, machine learning, biometrics and so on, but the risks that come with them are generally treated as a side-issue.

The ability of agencies to keep intimate personal information safe is a practical consideration that can be addressed with detailed information about security measures. The goal should be to get independent approval from top academics, high-profile technologists and privacy advocates early on because trying to counter criticism in the media after a decision has been made is likely to be unsuccessful.

Much more difficult to understand and work with are the longstanding fears that exist everywhere about government putting citizens under surveillance, tracking them and storing information about them in big databases.

Both concerns are generally countered with simple assertions that there is nothing to worry about, and the fear of surveillance is often dismissed as the stuff of silly conspiracy theories, especially by public servants.

But these are completely rational fears; people aren’t suffering mass hysteria, they are reacting to what they see and hear. The long series of legislative changes in the post-September 11 era has stoked anxieties about how much information is being hoovered up by government, while related fears have grown simultaneously about what commercially and criminally minded actors are doing with data.

There is no shortage of privacy advocates, journalists and technologists sounding the alarm about various government initiatives — the Census boycott campaign is just one of many — and recent reports of data breaches have seriously damaged trust in the ability of agencies in particular to protect data, especially at federal level. Accepting this and trying to get these same influential people on side seems like the only way to turn the situation around.
