Emergency measures: Vic audit questions use of rapid response targets

Victorian auditor-general John Doyle has criticised the approaches to measuring, monitoring and reporting response times used by the state’s emergency services, in an investigation whose findings are likely to apply to other jurisdictions.

In an audit report tabled last week, Doyle wrote:

“I found that emergency response time measures reported in State Budget Papers and agency annual reports do not provide useful performance information. Targets for the number of minutes to arrive are outdated or not based on evidence. Measures are often narrowly defined and exclude significant proportions of emergency response activity. Data quality is not assured in a number of instances, and Victoria Police does not measure its response times at all.”

The auditor-general adds that public reporting of performance is “limited”, with only the firefighters telling Victorians the “actual number of minutes it takes to respond to urgent calls”, and says a focus on providing data for the entire state obscures the variance across its many diverse regions. He goes on to explain that:

“…the limitations of emergency response time measures, targets, data and reporting make it impossible for the public and Parliament to fully and clearly understand response time performance.”

Noting that a holistic view of emergency service performance involves things like the outcomes achieved, cost-effectiveness and community satisfaction levels, not just fast responses, Doyle recommends “urgent review” of both the response time measures that are used and the way they are reported. Agencies were unable to satisfactorily explain the reasoning behind their response time targets and, in the case of fire and rescue, were found to rely on “outdated scientific research” from 1987.

The audit did not consider the performance of the Emergency Services Telecommunications Authority, which receives triple-0 calls and directs them to the appropriate agency, and was the subject of a previous audit. Doyle says that, in accordance with Department of Treasury and Finance guidelines, performance measures should clearly separate the parts of the process not under the control of the emergency service from those that are.

Doyle also addressed the major variance between regions across a large and diverse state, which is glossed over in statewide figures that by the nature of averages mostly reflect what is happening in the metropolitan areas, where there are far more call-outs.
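
As a back-of-the-envelope illustration (the figures below are invented for the example, not taken from the audit), a call-weighted statewide average sits close to the metropolitan number whenever metropolitan areas generate most of the call-outs:

```python
# Hypothetical call-out volumes and average response times (minutes).
# All figures are invented for illustration only.
regions = {
    "metropolitan": {"callouts": 90_000, "avg_minutes": 10.0},
    "regional":     {"callouts": 10_000, "avg_minutes": 25.0},
}

total_callouts = sum(r["callouts"] for r in regions.values())

# Call-weighted statewide average: dominated by the metropolitan figure.
statewide_avg = sum(
    r["callouts"] * r["avg_minutes"] for r in regions.values()
) / total_callouts

print(f"Statewide average: {statewide_avg:.1f} minutes")
# Prints 11.5: the headline figure sits close to the metro average (10.0)
# and says little about the regional experience (25.0).
```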

Statewide averages are reported annually in the Productivity Commission’s Report on Government Services, but provide little useful insight and some critics say the figures are only useful to politicians. Given the unique features of each state — let alone differences in how the figures are measured — there is little use in directly comparing them to each other, either.

“If you really want to penetrate response time standards in Victoria, you need to separate all the different parts of Victoria out and report on each of them,” former Tasmanian Ambulance Service chief executive Grant Lennox told The Mandarin.

The United Kingdom, he points out, has tried to deal with the issue by creating factors to account for different levels of urbanisation and population density.

“One of the things I suggested about 12 years ago now was we didn’t want to know how Hobart was performing against Sydney, because it’s nothing like Sydney; we wanted to know how we were comparing to Canberra and Geelong,” says Lennox. “So we went to the [Australian] Bureau of Statistics and we got them to give us every population centre in Australia ranked from highest to lowest, cutting off at a thousand, and then we got them to also give us the population density of the city.”

Using the table, the Tasmanian service was able to decide which cities were most similar to Hobart, although each still has its own geographical challenges.
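
A minimal sketch of the kind of comparison Lennox describes might look like the following. The population and density figures are placeholders rather than real ABS data, and the similarity score (relative differences in population and density) is one plausible choice, not necessarily the method the Tasmanian service used:

```python
# Hypothetical ABS-style table: (population centre, population, density per km^2).
# All figures are placeholders for illustration, not real ABS data.
centres = [
    ("Sydney",     5_000_000, 430),
    ("Canberra",     430_000, 180),
    ("Geelong",      270_000, 210),
    ("Hobart",       240_000, 125),
    ("Launceston",   110_000, 100),
]

def similarity_key(target, candidate):
    """Smaller is more similar: relative differences in population and density."""
    _, t_pop, t_dens = target
    _, c_pop, c_dens = candidate
    return abs(c_pop - t_pop) / t_pop + abs(c_dens - t_dens) / t_dens

hobart = next(c for c in centres if c[0] == "Hobart")
peers = sorted(
    (c for c in centres if c[0] != "Hobart"),
    key=lambda c: similarity_key(hobart, c),
)

# The most comparable centres come first; large, dense Sydney comes last.
for name, pop, dens in peers:
    print(name, pop, dens)
```

Ranking on relative rather than absolute differences stops the largest cities from dominating the comparison.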

Lennox agrees that one set of response times does not give the full picture of performance, and explains that internally, most emergency services (fire and ambulance at least) time every step along the way. They also run detailed customer surveys with large sample sizes, in some cases running to about 10 pages, alongside call monitoring and other forms of qualitative evaluation. It’s a complex, ongoing process of continuous improvement in all emergency services that aims to provide management with measures of not only how fast they are performing, but how well.
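
To illustrate what timing every step along the way can look like in practice (the stage names and timestamps below are invented, not drawn from any agency’s systems), per-incident records let analysts break a single response into its component intervals:

```python
from datetime import datetime

# Hypothetical timestamps for one incident; stage names are invented.
stages = [
    ("call_answered",    "2014-03-01T10:00:05"),
    ("crew_dispatched",  "2014-03-01T10:01:20"),
    ("crew_en_route",    "2014-03-01T10:02:10"),
    ("arrived_on_scene", "2014-03-01T10:11:45"),
]

times = [(name, datetime.fromisoformat(ts)) for name, ts in stages]

# Duration of each stage, in seconds, between consecutive timestamps.
for (prev_name, prev_t), (name, t) in zip(times, times[1:]):
    print(f"{prev_name} -> {name}: {(t - prev_t).total_seconds():.0f} s")

# Total response time: first timestamp to arrival on scene.
total = times[-1][1] - times[0][1]
print(f"total: {total.total_seconds() / 60:.1f} minutes")
```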

There are also other reasons why faster does not simply equal better. In the case of ambulances, time spent at the scene can also be measured, but depending on the situation and the skill level of available paramedics, increasingly complicated procedures are being performed at the roadside. There is also ramping, when ambulances are stuck waiting at hospitals, and other factors in the interface between the two separate but intrinsically linked operations that can affect overall performance.

This creates a wealth of data for internal use, but it’s a far more complicated picture than the public or politicians want, and settling on a simple set of performance indicators that are both meaningful and easy for the layman to grasp is no easy task. The Victorian audit report offers some independent guidance on how to approach it.

Author Bio

Stephen Easton

Stephen Easton is a journalist at The Mandarin based in Canberra. He's previously reported for Canberra CityNews and worked on industry titles for The Intermedia Group.