This strategy goes some way to explain Clearview’s recent efforts to gain public acceptance by stressing the crime-busting nature of its technology and arguing that it can be a force for good. Last month, a Clearview spokeswoman confirmed that the company had provided facial-recognition technology to the Ukrainian armed forces to help identify soldiers killed in action, to vet people at checkpoints and for other defence-related uses. The offer to Kyiv was made personally by Ton-That.
Law-enforcement pivot
Despite the legal and regulatory challenges it is facing in Europe, Australia and the US, Clearview's pivot to providing services exclusively to law-enforcement and national-security agencies may yet give it enough legal cover to secure a lucrative future. Or that, at least, is the scenario envisioned by Ton-That, who said the company had cancelled database access granted to US retailers Macy's and Home Depot, as well as to Ashton Kutcher and other individuals with a penchant for facial recognition.
Clearview's legal strategy against the three lawsuits — one of which is playing out in Chicago, Illinois, on the back of one of the strongest privacy laws in the US — centres on the claim that work done for law-enforcement agencies is exempt from the rules. The Illinois suits allege violations of the state's Biometric Information Privacy Act (BIPA), the same law whose hefty statutory damages forced Facebook to pay $US650 million and TikTok $US92 million to settle.
Falling foul of BIPA would expose Clearview to the risk of substantial damages, as well as changes to its business practices in Illinois. Combined with the lawsuit in the north-eastern US state of Vermont, that exposure could leave the image-scraping company facing damages claims worth hundreds of millions of dollars — well in excess of the 20 million euro ($29 million) fine imposed by Italy's privacy regulator.
But Clearview’s recent filing in a US District Court in Chicago noted that BIPA exempts entities working as the contractor or agent of a government agency. And given that Clearview no longer provides services to retailers, casinos and sports associations, the tech company argues that the exemption is all it needs.
“All of the facial vectors that are currently used by the Clearview app were created at a time when Clearview acted solely as an agent of governmental entities,” the company said in that court filing. “Clearview’s licensed users/customers can use Clearview’s product only for legitimate law-enforcement and investigative purposes.”
What’s more, Ton-That believes that the undeniable success of his database of more than 10 billion images in helping US law enforcement catch insurrectionists and paedophiles is dampening privacy concerns, despite litigation and regulatory challenges.
It’s unclear whether the US judge hearing the case in Chicago will accept Clearview’s arguments. But that defence now appears unlikely to gain traction in the Vermont case, where the state’s attorney-general is suing Clearview over its database of images scraped from social media. The Vermont judge said Clearview wasn’t covered by Section 230 of the federal Communications Decency Act, which protects interactive online platforms from liability for third-party content.
However, even if Clearview overcomes the US-based legal challenges, the prospect of further fines like those announced in Italy and the UK may force the company to limit its operations to the confines of the US market — something Ton-That appears ready to accept. "We do no business in countries where fines have been proposed," he said by email, adding that the penalties "lack jurisdiction and lack due process."
“Almost every privacy law worldwide supports exemptions for government, law enforcement and national security,” Ton-That said.
‘Solving heinous crimes’
The divide between the regulatory challenges facing Clearview's US operations and those it faces in other jurisdictions is highlighted by the company's setbacks in Australia and Canada, where there is no parallel to the US government-entity exemptions.
In a June 2021 decision, the Office of the Privacy Commissioner of Canada concluded that both the country’s federal police force and Clearview itself had violated privacy law when officers used the tool to conduct searches. The ruling was followed by legally binding orders from the provinces of Alberta, British Columbia and Québec forcing Clearview to stop collecting, using and disclosing images of people in those provinces and to delete images and biometric facial arrays collected without consent. The federal Privacy Commissioner also ordered Clearview to stop providing its services in the country — a ruling that, by then, had become academic because the tech company had already withdrawn from Canada.
Meanwhile, a joint probe by Australia's privacy regulator and the UK Information Commissioner's Office led to two almost identical rulings: that Clearview had breached privacy laws. In a decision echoing that of the Canadian privacy watchdog, the Office of the Australian Information Commissioner (OAIC) concluded that the country's federal police force had also violated privacy legislation.
The Australian Federal Police accepted the ruling but noted that the fight against child exploitation involved offenders using “sophisticated and continuously evolving operation methods to avoid detection” and, therefore, online tools needed to be part of the force’s response.
Clearview has since appealed the OAIC’s decision in the Administrative Appeals Tribunal, with Ton-That, a dual citizen of Australia and the US, saying that his company had acted in the best interests of these two countries and their people by “assisting law enforcement in solving heinous crimes against children, seniors and other victims of unscrupulous acts.” In a statement, Ton-That said he respected “the time and effort that the Australian officials spent evaluating aspects of the technology I built” but that he was “disheartened by the misinterpretation of its value to society”.
Similar concerns have been raised in New Zealand, where the national police force also undertook a trial of Clearview technology — a decision that eventually prompted an apology from police over the force’s failure to consult then-Privacy Commissioner John Edwards. Three months before Australia and the UK announced their joint investigation into Clearview, Edwards said that “the extent to which any such technology would be fit for purpose in New Zealand [was] unknown” but he would have expected to have been informed of the trial.
New Zealand Police discontinued the trial and ordered a "stock-take" of police use of surveillance technology, with the six-month review commencing in April 2021. A report published in December last year made 10 recommendations, which the force immediately adopted. At the top of its response was a pledge not to deploy live facial-recognition technology.
‘Overly invasive’
In Europe, Clearview's failure to comply with both national and EU privacy requirements looks set to saddle the company with significant penalties.
In the UK, the joint investigation with Australia culminated in the November 2021 announcement that the Information Commissioner would seek a fine of more than £17 million ($30 million) and ban Clearview from processing UK citizens' data, as part of a provisional enforcement action. This followed a warning from then-UK Information Commissioner Elizabeth Denham that the rapid spread of live facial recognition, which can be "overly invasive" in people's "lawful daily lives", could damage trust both in the technology and in the "policing by consent" model.
In the EU, the regulatory obstacles facing any company attempting to profit from scraping biometric data from the internet are even starker under the General Data Protection Regulation (GDPR), whose provisions have placed facial-recognition tools under scrutiny.
Biometric data, including data generated by facial-recognition software, are considered a special category of personal data because they make it possible to uniquely identify a person. The GDPR prohibits the processing of such data unless there is explicit consent, a legal obligation or a public benefit. A dedicated framework on artificial intelligence is currently being negotiated; the European Commission's proposed AI Act would restrict the use of biometric tools across the bloc.
However, data-protection authorities across the bloc aren't waiting for that law to pass. Clearview is already facing probes in Greece, Austria and France following complaints filed in those countries in 2021 by a coalition of NGOs including Privacy International and NOYB. The Greek privacy watchdog began looking into potential breaches of the EU's privacy rules last May but cannot yet say when the probe will be finalised.
New developments in France are imminent too, as the country’s data protection authority in February gave Clearview two months to respond to questions about its use of biometric data without a legal basis. The regulator also ordered the company to stop collecting and using photographs and videos of people in France and told Clearview that it must help people exercise their right to have their data erased.
Meanwhile, in 2019 the Swedish data-protection authority fined a school that had used facial recognition to track the attendance of a small group of students, concluding the institution had violated GDPR provisions — a decision suggesting a tough stance on the misuse of biometric data. And last May, the data watchdog of the German state of Baden-Württemberg began investigating PimEyes, a facial-recognition search engine, over its processing of biometric data.
Italy, however, is the first EU country to have probed Clearview’s practices and hit the company with a fine. The probe that culminated in the penalty began with a handful of complaints, lodged with Italy’s privacy watchdog between February and July 2021. Although names were scrubbed from the Italian-language documents published last week, the Garante per la protezione dei dati personali (GPDP) revealed that four individuals and two data-privacy advocacy groups had been behind the complaints.
In March 2021, Clearview responded to the GPDP's initial inquiries, saying Italian and European Union privacy rules didn't apply to the complainants' concerns and that, as a result, the GPDP had no role to play in the matter. Clearview said it was certain it had no case to answer because it had put technical measures in place to ensure that no Italian IP addresses could log on to its platform — a policy it applies throughout the European Union.
The technology company also argued that it couldn't be said to be tracing or monitoring the Italian complainants because it simply offered a snapshot in time, as would be the case with a Google search. What's more, Clearview had no Italian clients and no business interests in the country. "Clearview's only goal is to offer a search engine to allow for the search of internet images on the part of its clients," Ton-That said, adding that the facial vectors contained in its database can't be used to link an image to other personal data.
The San Francisco-based founder of the company also said he was prepared to accept regulation, provided it was firmly based on Clearview's role as a search engine of facial images. What's important is that the regulation "makes sense … as this new technology finds its place in the crime-fighting universe," Ton-That said.
Laurel Henning, Cynthia Kroet, James Panichi and Mike Swift report on regulatory affairs for LexisNexis’ MLex.