The Australian privacy commissioner will investigate whether a facial recognition app deployed by more than 600 US police departments is also being used by police in Australia.
Facial recognition software Clearview AI allows law enforcement to use a photo of a suspect to search through a database of 3 billion images from across the internet. The technology has triggered human rights concerns from academics and privacy experts.
Australian founder Hoan Ton-That told The New York Times his app is being used by more than 600 police departments in the US.
He recently told the ABC it is also being used in Australia.
“We have a few customers in Australia who are piloting the tool, especially around child exploitation cases,” he said.
The Office of the Australian Information Commissioner has confirmed it has launched an inquiry into whether the software is being employed in Australia, or if its database contains information on Australians.
“The OAIC has commenced making inquiries with Clearview AI about whether Australians’ personal information is being used in its database for the purposes of facial recognition and if it is being used in Australia,” an OAIC spokesperson told InnovationAus.
“Once those inquiries are completed, the OAIC will determine whether further action is required.”
Information and privacy commissioner Angelene Falk previously told the ABC she was making inquiries into “whether or not Australians’ data is implicated”.
She noted Australian police departments hoping to use Clearview AI’s technology would have to go through the required processes, and she would have oversight of its use.
“The Australian privacy principles would require them to conduct a privacy impact assessment, that means looking at what are the privacy risks, how would the information be handled and how those risks would be mitigated,” she said.
Victorian and NSW police departments have confirmed their use of facial recognition technology, but have not revealed what software they employ.
In October, proposed laws to allow the sharing of biometric information between federal and state governments were rejected after backbench MPs called for a rewrite over privacy and transparency concerns.
The bill was first introduced in 2018 in a bid to curb identity crime through a national facial biometrics matching database. It would have allowed agencies to use the government’s facial verification service and facial identification service to share and match facial images held in existing databases.
The laws were met with concern from human rights groups, including Digital Rights Watch and the Human Rights Law Centre.