Australian police want to improve online child abuse investigations; cue Clearview fears
A media furor has broken out in Australia over the potential use of Clearview AI's facial recognition by law enforcement investigating child exploitation in the country.
Jon Rouse, former operations manager for the Australian Federal Police's (AFP's) Australian Centre to Counter Child Exploitation (ACCCE), is now the interim victims' commissioner for Queensland. He told the Herald Sun that Clearview's biometrics could give police a major advantage in investigating online child abuse images. Rouse has been making the rounds in Australian media following a joint investigation with American police, who are able to use Clearview. He told 7News Australia that the operation was successful, but that Australian law enforcement is "fighting with our hands tied behind our backs."
Rouse also arranged for a meeting between Clearview and Operation Griffin, a joint initiative including the heads of child protection units in Australia and New Zealand, last July, Crikey reports.
Clearview had already been censured by Australia’s Information Commissioner for breaching citizens’ privacy in 2021.
The AFP, meanwhile, has renewed its call for people to submit photos of themselves as children, so that it can train an algorithm to differentiate between images of children in safe and unsafe situations, or abusive and non-abusive ones. The agency is seeking about 100,000 images of people up to age 17, from all ethnicities.
Sydney Criminal Lawyers, a law firm with offices throughout the country, connects the My Picture Matters program to the engagement between the AFP and Clearview. That article says Clearview has sold its technology to businesses and governments around the world, and appears to refer to a meeting earlier in 2023.