Australian Federal Police officers have made more than 100 searches using Clearview AI, a controversial facial recognition company that identifies members of the public by matching them with more than 3 billion photos scraped from social media.
Documents released under freedom of information law indicate the AFP used the artificial intelligence technology on behalf of foreign law enforcement agencies.
Last week the Office of the Australian Information Commissioner and the UK’s Information Commissioner’s Office launched a joint investigation “into the personal information handling practices of Clearview AI”.
The FOI documents reveal that nine members of the AFP’s Australian Centre to Counter Child Exploitation (ACCCE) registered for a free trial of the software between December 2, 2019, and January 22, 2020, when they were directed to cease.
Of the 113 documents relevant to Clearview AI, 109 were released. One was excluded and 35 redacted because they “contained information communicated by a foreign government to the AFP for law enforcement purposes” or “relate to information provided by a foreign government”.
This would suggest the AFP may have used Clearview to conduct investigations on behalf of foreign law enforcement agencies.
The New York company, co-founded by Australian Hoan Ton-That, has been used by more than 2200 agencies, companies and individuals, including ICE and government-related agencies in Saudi Arabia and the UAE.
A report produced for the Office of the Australian Information Commissioner says: “The number of searches of Clearview’s database of images undertaken by staff … are unavailable due to the restriction on access to the Clearview platform.”
However, internal emails acknowledge the AFP “has run more than 100 searches”. Those searches included persons of interest both in and outside Australia.
The ACCCE and child protection triage unit officers used Clearview in investigations into child abuse materials and “to attempt to identify and locate a suspected victim of imminent sexual assault”, an executive briefing said.
Individual police officers first received emails from Ton-That in November 2019. “Search a lot,” they read.
“Your Clearview account has unlimited searches. Don’t stop at one. It’s a numbers game. Our database is always expanding and you never know when a photo will turn up a lead.
“Refer your colleagues … If you think your colleagues might want to try Clearview out for themselves, just send their names and email addresses to help@clearview.ai and we’ll sign them up to.”
In 2009 Ton-That was sought by San Francisco police for creating ViddyHo, an alleged phishing scam accused of stealing Gmail users’ login credentials and spamming their contacts. “We had a bug in our code,” his website claimed after it was shut down and re-registered under a new domain.
An officer referring their colleague to Clearview wrote: “Hey, have you heard about this app? … The user uploads an image to the app which then runs across a database of images scraped from the internet to find a match … Bit creepy but very cool.”
Another officer told a colleague on December 9, 2019: “Just out of interest I ran [a suspect’s] mugshot through the trail Clearview FR system and got a hit on his Instagram account. The FR system looks very cool.”
There is no evidence a security or privacy assessment was conducted before police began using Clearview AI.
The AFP acknowledged using the software in April in response to a question on notice from shadow attorney-general Mark Dreyfus. It had previously denied using the software in responses to several FOI requests, suggesting AFP leadership may not have been appropriately informed about the ACCCE’s activities.
“Maybe someone should tell the media we are using it!” an officer wrote on January 21 in emails discussing reporting on Clearview AI.
Dreyfus described Clearview AI as a “deeply problematic service”.
“The Home Affairs minister must explain whether the use of Clearview without legal authorisation has jeopardised AFP investigations into child exploitation,” he said.
“The use by AFP officers of private services to conduct official AFP investigations in the absence of any formal agreement or assessment as to the system’s integrity or security is concerning.”
An early investor in Clearview AI was Peter Thiel, the Silicon Valley billionaire and member of Trump’s transition team. Thiel co-founded Palantir, a data-mining and analysis company used by law enforcement and immigration agencies around the world — including the Australian Criminal Intelligence Commission, the Australian Defence Force and AUSTRAC.