Tech companies might be more liable than they think: ACCC


Digital firms claim they are not responsible for how people choose to use their platforms, with Uber arguing its drivers are not employees and Facebook allowing fake news to spread. The ACCC is questioning whether this approach is right.

Big new digital businesses like Facebook and Uber have grown up in an environment of minimal regulation, taking advantage of governments’ inability to keep up with the pace of change.

But as these businesses become large enough to play a significant role in employment, tax revenue and even politics, questions are being asked about whether this is right.

Regulators too are struggling to figure out how to deal with emerging digital businesses, but are increasingly sceptical that the current self-policing model is working.

“I also just think it’s worth saying that a lot of corporate time is going into thinking about saying well is it actually right that those platforms have no liability,” revealed Australian Competition and Consumer Commission deputy chair Delia Rickard at a regulators’ conference on Monday.

“It’s certainly something we’re looking at and questioning quite clearly.”

There are big questions around how to regulate these companies. Uber argues it is only a platform and its drivers are independent contractors who can choose when to work, but others argue this is merely a smokescreen for offering poor employment conditions and low wages. A lax approach to customer safety was cited by London Mayor Sadiq Khan in backing Transport for London’s decision not to renew Uber’s licence to operate. This raises questions about whether Uber’s presence is really increasing competition in the taxi industry by using a more efficient business model, or whether it is simply undercutting its more heavily regulated competitors.

Plenty of companies try to game the system by not being straight with the regulators, but the complexity and opacity of tech companies’ algorithms makes it even harder for regulators to work out whether they are abiding by the rules. Less oversight raises the possibility of discrimination occurring, for example — the way Uber’s algorithm works means white neighbourhoods in Washington DC experience “consistently lower” wait times for services, researchers have found.

The ACCC is also considering whether the so-called “publishers’ defence” — a legal exemption from liability for publishing misleading or deceptive material to allow the media to report the news — applies to platforms like Facebook.

Facebook argues it is only a platform and generally should not be held accountable for what third parties do on it.

“I don’t think it should be a shield that they are free of liability, because if you want to have any kind of effective regulatory regime, you want to put the onus on the party that’s best placed to make a difference, and that party is the platform. So the message we’ve been trying to get out is get the self-policing right, get the systems right, and you’ll probably be fine,” Rickard said.

But if they don’t take adequate action to deal with the problem, the ACCC will “really need” to look at whether stronger intervention is required.

It’s hoped that such threats will prompt companies to find fixes to the problems themselves. Massive political pressure over the detrimental impact of fake news has led to Facebook rolling out a feature called “related articles”, which is intended to counter the echo chamber effect of social media by providing additional perspectives on the topic the user is reading about.

Glitches in the matrix

Regulators will increasingly need to look at how to deal with technologies that automate decision-making processes, argued Transport Safety Victoria’s Matt D’Abbs.

Three of the biggest areas of future growth (the internet of things, artificial intelligence and wearable technology) all remove human agency from elements of our everyday lives, and that agency is a key focus of regulators’ work.

“There’s obviously a lot of potential for this to be absolutely fantastic for regulation. It can effectively make a lot of our current approaches and interventions — like speed limits, permits, drivers’ licences, regulatory infrastructure — redundant, because the systems will take care of all of that,” he told the conference, hosted by the National Regulators Community of Practice and the Australia and New Zealand School of Government.

“But there is one thing that we can be sure of, which is that things will still go wrong. Sometimes people will be at the root of that.

“… But there is another part of that, which is system failure. … In my field of regulation we’re premised on concepts of human agency, personal responsibility and moral culpability.

“If a driverless train hits a driverless bus at an autonomous level crossing and kills 30 school children, who is to blame? How do we regulate glitches in the matrix?”

If governments decide to regulate tech industries more heavily, the work of regulators will become more complex, requiring additional skillsets. While agencies like the ACCC have traditionally focused on goods and services provided in the real world, usually within the domestic economy, they are increasingly overseeing informational or online services provided by companies located across multiple jurisdictions.

But at least they will likely be assisted by their own high-tech solutions.

Who knows — perhaps one day we’ll be regulated by machines.

  • Steven Waters

    “If a driverless train hits a driverless bus at an autonomous level crossing and kills 30 school children, who is to blame?” Answer: The one which accepted responsibility for the children.

    — If the children are on the bus, the driverless bus company has accepted responsibility for their safe transport against reasonably preventable accidents (e.g. a crossing at a train line would qualify, as would any scenario in which there is likely to be conflict in the vehicle’s decision-making).

    — If they’re on the train, then the driverless train company has accepted responsibility for their safe transport, including appropriate safety belts or other mechanisms that would mitigate injury in an accident.

    — If they’re at a crossing that’s operating incorrectly, whichever adult is supervising them has liability for the children. Both the automated bus and train companies have public liability for third parties outside their vehicles anyway, and the manufacturers of the crossing technology would have professional indemnity insurance or would be subject to criminal negligence claims (“You had one job”). So would whichever government department had authorised the crossing’s installation as appropriate for use (the department responsible for public transport, the department responsible for roads, and/or the local government).

    Even if the barriers were raised incorrectly, a human bus driver would check both ways before crossing the train line and wouldn’t trust either the barrier to lower correctly or the train to stop. Similarly, a human train driver would apply the brakes should they observe a barrier raised incorrectly or a bus crossing the tracks. That systems are automated doesn’t excuse liability issues — it emphasises them.