The Information Commissioner’s Office has announced its intent to fine Clearview AI over £17 million for harvesting UK citizens’ images and conducting facial recognition searches using biometrics.
The decision to impose a major fine on facial recognition company Clearview AI results from a joint investigation carried out by the ICO and the Office of the Australian Information Commissioner (OAIC). The two authorities concluded that the manner in which the company obtained facial images of people and processed them for profit on behalf of hundreds of clients was clearly in violation of the GDPR and UK data protection law.
“I have significant concerns that personal data was processed in a way that nobody in the UK will have expected. It is therefore only right that the ICO alerts people to the scale of this potential breach and the proposed action we’re taking,” said Information Commissioner Elizabeth Denham.
“Clearview AI Inc’s services are no longer being offered in the UK. However, the evidence we’ve gathered and analysed suggests Clearview AI Inc were and may be continuing to process significant volumes of UK people’s information without their knowledge. We therefore want to assure the UK public that we are considering these alleged breaches and taking them very seriously.”
Headquartered in New York, Clearview AI says its role is to leverage facial recognition technology to support law enforcement and government agencies “to identify victims and perpetrators in order to safeguard their communities and secure industry and commerce.” The company says it runs a database of over 10 billion facial images sourced from public-only web sources, including news media, mugshot websites, public social media, and other open sources.
Clearview AI’s facial recognition services are used by a large number of law enforcement agencies in both the US and the UK. In February 2020, the company suffered a major security incident that exposed data belonging to almost 2,200 organisations on its client list, ranging from law enforcement authorities to private companies.
That client list included US Immigration and Customs Enforcement, the Department of Justice, the FBI, and thousands of other government agencies that use the company’s facial recognition software. According to BuzzFeed, retailers such as Walmart, Best Buy, and Macy’s also appeared on the compromised client list, having signed paid contracts with Clearview AI to create a global biometric identification system.
In the notification announcing its intent to fine the company, the ICO also acknowledged that several law enforcement agencies in the UK had used Clearview AI’s services in the past. “The ICO understands that the service provided by Clearview AI Inc was used on a free trial basis by a number of UK law enforcement agencies, but that this trial was discontinued and Clearview AI Inc’s services are no longer being offered in the UK,” it said.
The data protection watchdog noted that Clearview AI collected images of UK citizens from the Internet despite having no lawful reason to do so, failed to inform people about what it planned to do with their images, had no process in place to stop the data being retained indefinitely, and did not meet the higher data protection standards required for biometric data.
“The covert collection of this kind of sensitive information is unreasonably intrusive and unfair. It carries significant risk of harm to individuals, including vulnerable groups such as children and victims of crime, whose images can be searched on Clearview AI’s database,” said Australian Information Commissioner and Privacy Commissioner Angelene Falk.
“When Australians use social media or professional networking sites, they don’t expect their facial images to be collected without their consent by a commercial entity to create biometric templates for completely unrelated identification purposes.
“The indiscriminate scraping of people’s facial images, only a fraction of whom would ever be connected with law enforcement investigations, may adversely impact the personal freedoms of all Australians who perceive themselves to be under surveillance,” she added.
Commenting on the fine issued against Clearview AI, Lucie Audibert, Legal Officer at Privacy International, said: “Clearview AI’s business enables unprecedented surveillance of our online and offline lives. We have laws against this kind of interference with our fundamental rights, and regulators are finally starting to right these wrongs.
“To investors who recently committed $30 million of funding to Clearview’s perilous business, this decision should be a wake-up call.”