What if the crowd forecast the economy for Treasury?


We shoot the breeze about who’ll win the next election or footy match. Virtually none of it helps predict the future. But we’re driven on … as if somehow it will.

We do it with the economy. People ask economists how they see the future and most role-play the expert when the honest, indeed expert, answer would usually be the answer Treasurer John Kerin gave to journalists when they asked when economic recovery would take hold: “Your guess is as good as mine”. Still, refusing to play the role is no way to hang onto it. Kerin was relieved of his duties as Treasurer shortly after this outbreak of candour.

Even so, myriad decisions are predicated on specific views of the likely future. And skilful effort can improve economic foresight. If only a little. That’s why Treasury asked Warren Tease, veteran econocrat fresh from a lengthy mid-career in the private sector, and now principal adviser in Treasury, to report on its forecasting.

His report found Treasury’s econometric models were “rigorous, theoretically sound and fit history reasonably well” and suggested some minor improvements. But it was strangely oblivious to the new possibilities of the internet age — the prospect of using real-time data and the slew of new opportunities to crowdsource critique and further improvements.

“… there’d be no gotcha moments once in the hands of independent agencies.”

Treasury was once a leader in open government, being the first Treasury globally to release its budget under a Creative Commons licence. It has also released documentation behind several of its models. But here, as in so many other parts of the open government agenda, progress has stalled. Treasury pointedly declined to release its revenue forecasting models more widely – even on receiving a request from the Parliamentary Budget Office, in violation of the OECD principles it had recently participated in drafting.

Except in the rare cases where there are strong policy reasons against it, Treasury and other agencies should release their forecasting models, fully specified and ready to operate. How would we benefit — let’s count the ways:

  1. The structure and parameterisation of those models would be up for auditing and critique.
  2. As with open source production in Linux or on Wikipedia, some critics would develop and improve the models.
  3. To quote Australia’s 2010 Freedom of Information legislation, those models are a “national resource” that should be “managed for public purposes”.
  4. It’s fair. Why should publicly funded assets not be available to the public or even its representatives?
  5. It’s more efficient. As well as improving government agencies’ modelling and accountability, the released “national resource” of those models would be available to others for their own forecasting or educational purposes.
  6. After ignoring it for years, Australia now wants to join the international Open Government Partnership, yet it carries the millstone of its predecessor’s defunding of the Australian Information Commissioner and has little to demonstrate that it’s serious.

Take pride in robust evolution

What we’ve proposed thus far is really only the first step towards modern, internet-enabled public sector forecasting. Taking Treasury as an example, over time it could more tightly focus its role around the process of funding and releasing important new models or improvements to them, convening the community of practice around them and offering its own official best-estimated specification and parameterisation of the models when generating official forecasts.

It could make ad hoc changes to the final forecasts reflecting its own judgment, just as Tease suggested, though this would be publicly documented and so publicly scrutinised.

“… award modest prizes for the best contributions to the model — it would improve the whole intellectual environment …”

Ultimately, as Shadow Treasurer Chris Bowen has argued, this process should be handed over to an independent body like the Parliamentary Budget Office, as already occurs in the UK, which on this, as in many other areas, has surged ahead of us on governance and innovation in the internet age.

The government’s official forecasters could also help promote our collective interest in better modelling and better economic education by awarding modest prizes for the most accurate modelling, the best contributions to the modelling debate and the best contributions to the model. This wouldn’t just improve our forecasts (modestly). It would improve the whole intellectual environment in which they operate, not least by accelerating the rate at which we discover, motivate and make better use of the best talent.

The Tease report was also surprisingly silent about bringing forecasting into the age of big data. Half a paragraph of text and a single-line recommendation supported catching up with other agencies like the Bank of England, which has pioneered “nowcasting” — using specific modelling techniques and data to better estimate the starting point for forecasts while forecasters wait for official data that is often months out of date.
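
To make the idea concrete, here is a minimal sketch of one of the simpler devices used in nowcasting, a so-called bridge equation, written in Python. The indicator and all the numbers are entirely hypothetical; this illustrates the technique, not any agency’s actual model.

```python
# A toy "bridge equation" nowcast with made-up numbers.
# Quarterly GDP growth is regressed on the quarterly average of a monthly
# indicator that arrives much sooner than the national accounts; the fitted
# relationship then "bridges" the current quarter's partial monthly readings
# into an early estimate of GDP growth.
import numpy as np

# Historical data (hypothetical): quarterly averages of a monthly activity
# indicator, and the GDP growth later published for the same quarters.
indicator_avg = np.array([50.1, 51.3, 49.2, 52.0, 50.8, 48.9, 51.5, 52.4])
gdp_growth = np.array([0.6, 0.8, 0.3, 0.9, 0.7, 0.2, 0.8, 1.0])

# Fit the bridge equation: gdp_growth = a + b * indicator_avg
X = np.column_stack([np.ones_like(indicator_avg), indicator_avg])
coeffs, *_ = np.linalg.lstsq(X, gdp_growth, rcond=None)
a, b = coeffs

# Nowcast the current quarter from the monthly readings available so far
# (here only two of the three months have been published).
current_months = [51.8, 52.2]
nowcast = a + b * np.mean(current_months)
print(f"Nowcast of current-quarter GDP growth: {nowcast:.2f}%")
```

Real nowcasting systems draw on many monthly and weekly series and on richer statistical models, but the logic is the same: exploit whatever timely data exists rather than waiting for the national accounts.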

By contrast our own Reserve Bank mistimes its deliberations on its cash rate with exquisite precision, holding its board meetings the very day before the national accounts are released. To do so once a year would seem like a misfortune. To do so on every occasion it gets — year in year out — suggests carelessness of an unusually dogged kind.

Tease made no mention of accessing the increasingly powerful real-time accounting data from large firms and from online accounting providers like Xero and MYOB, with which we could nowcast much of the economy. Such data ought to enable us to spot turning points several months sooner — which Lateral Economics has elsewhere suggested could be worth many billions to the economy in better-timed macroeconomic decision making. Certainly hedge funds are obsessed with accessing such data. How hard have we tried to get hold of some of it — if necessary in anonymised form? And how vigorously is the ABS trying to move towards harvesting real-time data to supplement and, where appropriate, displace its 20th-century model of surveys?

Model wars and gotcha moments

Finally, wouldn’t all this openness just spur endless model wars and gotcha moments, with the noise drowning out the signal? In fact, gotcha moments abound today whenever forecasts are released and whenever they go wrong. And yes, there would be some model wars to begin with, but the public at large would tune out fairly quickly, while the community of practice around the relevant forecasts would come to more balanced conclusions on where the merits of the arguments lay. And there’d be no gotcha moments once the models were in the hands of independent agencies. (Playing gotcha with Treasury forecasts is usually playing gotcha with the Treasurer. Who plays gotcha with the Reserve Bank’s forecasts?)

When he came to power, our new Prime Minister told us he’d preside over a government that walked the talk of innovation. There could be no better way, and no better time, for the government to show us just what that meant.

Postscript: At the time this article was due for publication, the RBA published a review of its own forecasting by Adrian Pagan and David Wilcox. Without commenting on the nature of the forecasting advice, we note the report’s reference to the “case for a greater involvement with the wider economic community”, its reference to the optimally mistimed RBA board meetings, and its call to “aggressively explore” the use of real-time big data to improve the accuracy and timeliness of the economic data used.
