Data and discrimination: opaque tech preventing effective regulation

By David Donaldson

June 3, 2019

As more of our lives move online, the risk of automated discrimination and customer manipulation using personal data is growing, argues a consumer protection group.

But policymakers will find it difficult to combat such problems without improved transparency.

“While targeted advertising may be annoying, the real risks are discrimination, exclusion and manipulation,” claims a new report from the Consumer Policy Research Centre.

And it’s not just theoretical — discriminatory and exclusionary harms from online profiles “are already happening”, says the think tank.

“Consumer profiles are used to support automated decision-making in finance, insurance, employment and other industries. For example, some organisations are using data including location, purchase histories, web search, and social networks information to build creditworthiness profiles of individuals.”

Insurance companies have reportedly started using social media profiles to inform insurance access and cost. In one case, insurer MLC excluded a consumer from mental health coverage in her life insurance because she had accessed mental health services to deal with sexual abuse she suffered as a child in the mid-1980s.

Last year gay dating app Grindr was found to be sharing users’ HIV status with third parties — a practice now ceased — while fertility app Ovia sells de-identified health information to users’ employers.

Consumers “feel overwhelmed by privacy policies and have limited understanding or control over their personal data”, says the centre.

Unfortunately, it’s currently hard to know how big the problem is, and thus how to fix it.

“The data collection landscape is largely opaque. Consumers don’t understand what they are handing over or the value of that data,” says the CPRC.

“Policymakers will find it difficult to design effective remedies unless the supply chain governing the data collection, sharing and use market is more transparent.

“… Transparency will be the building block for any policy solution to address harms from data collection, sharing and use and to support the beneficial outcomes of these practices.”
