AI and big data will reshape your job, but not as fast as you think

We’re constantly told artificial intelligence and big data will change our jobs, but history shows we shouldn’t be surprised if the benefits are still many years off.

Prime Minister Malcolm Turnbull is famously enthusiastic about innovation.

“The pace of change, supercharged by new and emerging technologies, has never been so great, nor so disruptive,” Turnbull said back in 2015.

Of course, he’s not the only one who thinks so.

“Innovation is moving at a scarily fast pace”, says Microsoft co-founder Bill Gates.

It’s common to hear — especially from Silicon Valley and tech consulting types — that new technologies like AI and big data will fundamentally alter our workplaces and societies, doing for the 21st century what electricity or the personal computer did for earlier periods.

AI’s broad range of potential uses and ability to improve itself suggest it will indeed have far-reaching impacts across many industries. Algorithms are now as good as humans at recognising faces. Automated vehicles may one day be good enough at identifying hazards that they will be safer drivers than people. A Google DeepMind algorithm worked out how to reduce the amount of energy needed for cooling the company’s data centres by 40%, potentially saving hundreds of millions of dollars a year. And these are just a handful of the potential applications of AI.

The federal government’s recent announcement of a five-year, $1 billion deal with IBM to increase public sector use of blockchain, AI and quantum computing suggests the government thinks there’s plenty of potential there too.

But for most employees, the impact of these technologies in the workplace so far has probably been marginal, or even imperceptible, despite the enthusiasm of the techno-optimists. For many, this is reason to think they’re more toys for nerds than forces that will reshape the economy.

Indeed, it may be some years yet before the significant potential of such ideas is embedded in the work of those outside a narrow set of specialists. And for all the talk of innovation touching off an age of increased prosperity, multi-factor productivity growth has been relatively low and even negative across most developed countries in recent years, including Australia.

This shouldn’t be surprising, however. As scientist and futurist Roy Amara once phrased it, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.”

On the one hand, the boosterism around AI sometimes makes it sound like we will soon be living in a sci-fi film. On the other, predictions often misidentify where the biggest impacts will be in the long term — and frequently amount to little more than linear projections of recent history (which is why old drawings of the future look so quaint).

Either way, both kinds of prediction are usually wrong.

The productivity paradox

Economists noticed back in the 1980s that although American companies were making huge investments in computers, it didn’t seem to be improving productivity levels — and may have even been harming them.

As Nobel laureate Robert Solow archly put it: “You can see the computer age everywhere but in the productivity statistics.”

While other problems, such as rising oil prices, didn’t help, many economists think it simply took years after computers were first introduced for organisations and individuals to adapt business processes, train staff and develop the many smaller innovations needed to make the new technology work in practice. In the meantime, all that adaptation distracted people from their day-to-day work, leading to a temporary decline in productivity.

These lags can take a very long time to resolve — it wasn’t until the 1990s that multi-factor productivity picked up. Indeed, while computers have become a common workplace tool and made most of us more productive, even today there are people working in professional jobs who can barely operate a PC.

The MIT economist who popularised the so-called “productivity paradox”, Erik Brynjolfsson, recently teamed up with MIT colleague Daniel Rock and the University of Chicago’s Chad Syverson to argue that the same thing appears to be happening with AI. They explain just how unexpectedly far-reaching the process of adaptation to major innovations can be:

“For instance, the steam engine not only helped to pump water from coal mines, its most important initial application, but also spurred the invention of more effective factory machinery and new forms of transportation like steamships and railroads. In turn, these co-inventions helped give rise to innovations in supply chains and mass marketing, to new organizations with hundreds of thousands of employees, and even to seemingly unrelated innovations like standard time, which was needed to manage railroad schedules.”

Electricity followed a similar pattern. The adoption of electricity in American companies took several decades to reach critical mass and was initially driven by cost savings, write Brynjolfsson and co:

“The biggest benefits came later, when complementary innovations were made. Managers began to fundamentally re-organize work by replacing factories’ centralized power source and giving every individual machine its own electric motor. This enabled much more flexibility in the location of equipment and made possible effective assembly lines of materials flow.

“This approach to organizing factories is obvious in retrospect, yet it took as many as 30 years for it to become widely adopted.”

The current way of doing things can often blind us to alternatives. While the exciting new piece of technology tends to get all the attention, it’s often the rethinking of processes enabled by the new technology that holds the promise of real productivity gains.

Those changes might not be obvious at first, but one day someone will realise the way it’s always been done is no longer the best way. Such adaptations typically end up costing several times as much as the initial investment in the technology, the authors add.

The time lags and planning required to implement technological change are easily overlooked in the excitement around a shiny new innovation, but they have significant effects. The winners from technological shifts will be those with the lowest adjustment costs — partly a function of luck — and those who put as many of the complementary measures in place as possible. Change will happen, but it’s hard to know how long it will take and how things will look at the end.

So perhaps big data and AI will reshape your job — it’s just that you might be working elsewhere by the time it happens.

Image source: 1922 drawing by Hugo Gernsback of a city of the future on Wikimedia Commons.

Author Bio

David Donaldson

David Donaldson is a journalist at The Mandarin based in Melbourne. He has previously written for The Guardian and Crikey and holds a master’s in international relations.