Clearview AI, the controversial facial recognition company, has disclosed that an intruder gained “unauthorized access” to sensitive information, including its entire client list. The organisation has scraped billions of ‘open source’ images from social media and the wider internet, and has partnered with more than 600 law enforcement agencies, including the FBI. Privacy campaigners are likely experiencing some schadenfreude.
Tor Ekeland, attorney for Clearview, said in a statement: “Security is Clearview’s top priority. Unfortunately, data breaches are part of life in the 21st century. Our servers were never accessed. We patched the flaw, and continue to work to strengthen our security.”
In addition to the company’s client list, the leaked data includes how many searches each client made and the accounts they had set up. Clearview said its vast database of images was not accessed.
“It’s unsurprising that they have been the target of attempts to compromise their security, as the data that appears to have been accessed would be both commercially valuable and also valuable in the ongoing PR battle over the legitimacy of the use of facial recognition technology,” Andrew Charlesworth, professor of law, innovation and society at Bristol University, told NS Tech.
“There are thus various possibilities as to attackers: someone who wants to hold the information to ransom, someone who wants to market the information to potential competitors of Clearview AI, or someone who wants to release the information to focus public attention on Clearview AI’s activities and clients. If the motive is one of the first two options then it may be that the data never sees the light of day. If the third option, then the data is likely to be released at some point either via social media, via selected press or civil society groups, or via an entity such as Wikileaks.”
At present, there’s no indication of who might be behind the attack, but the client list is likely of interest to privacy organisations and the general public, should it surface somewhere on the dark web in the coming weeks.
“It was interesting that Clearview AI have been at pains to deny that there was a hack of their systems, and that the compromise is described in terms of an unauthorised access resulting from a flaw. That might suggest an opportunistic insider compromise,” added Charlesworth.
A New York Times (NYT) report rocketed Clearview AI to infamy last month, detailing how it scraped the likes of Facebook and Twitter to accrue a vast dataset of the world’s faces. Law enforcement agencies use the startup’s facial-recognition software to scour this collection and identify suspects.
Earlier in February, Facebook “demanded” that Clearview AI stop using data scraped from its platforms because the practice violated its policies, CBS News reported. LinkedIn, Twitter and YouTube went further, sending cease-and-desist letters to the company.
Clearview AI claimed that the Indiana State Police were able to crack a case within 20 minutes by using the app. But in the wake of the NYT piece, some police departments have sought to distance themselves from the firm. The NYPD, for example, confirmed that it has no formal relationship with Clearview AI.
A class action lawsuit has been brought against the firm in Illinois, alleging that Clearview’s actions are a threat to civil liberties and that the company broke the Illinois Biometric Information Privacy Act (BIPA), which safeguards state residents from having their biometric data used without consent.
A document obtained by BuzzFeed revealed that Clearview has been boasting about “rapid international expansion” to prospective clients, using a map highlighting how it has expanded, or aims to expand, to at least 22 more countries, some of which have poor records on human rights.
CEO Hoan Ton-That has claimed his startup hasn’t done anything wrong. Since all the images were freely available on the web, Clearview has a “First Amendment right to public information,” he said in a television interview earlier this month. This, however, is untrue.