By: Jordan McNea and Apsara Rodriguez
A new company threatens privacy in a way never seen before. Clearview AI is a technology company offering a facial recognition app backed by three billion images scraped from social media and across the internet. Law enforcement agencies in the United States and Canada, including the FBI, have become enthusiastic clients, enlisting the company’s help in identifying suspects. This use has prompted outcries that the app is an invasion of privacy and that policing should not be based on facial recognition technology. While current uses of this technology offer benefits to society, facial recognition has the power to erode privacy as we know it today. In this blog post, we will start by explaining Clearview AI and the uses of facial recognition technology, then move into the pros and cons of its use both in policing and for public consumption.
What is Clearview AI?
Clearview AI collects pictures from social media channels and other platforms, nearly always in violation of the Terms of Service of the sites it scrapes. Clearview’s clients can upload a photo of an unknown person, and the app can identify that person in seconds: it matches the unknown face to its online photos and links back to the sites where the images originally appeared. The results are about 75% accurate. Law enforcement is the company’s largest market, with about 600 agencies using the app. Clearview claims to have scraped images from Facebook, YouTube, and Venmo, amassing a database of three billion images. According to CEO Hoan Ton-That, Clearview’s facial recognition technology is currently available only to law enforcement and is to be used to identify potential criminals. But in February, Clearview AI’s security was breached, exposing the company’s client list, which included Walmart, Macy’s, Kohl’s, and the NBA. By selling people’s personal information to these institutions, Clearview is already violating privacy, and the deeper problem is that people never opted into such sharing. If a person has posted images on the internet, she is probably in the Clearview database, and her privacy has already been violated.
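Clearview’s internals are not public, but modern face search typically works by converting each photo into a numeric “embedding” and then finding the most similar embedding in the database. As a rough, hypothetical illustration of that matching step (the names, vectors, and threshold below are made up for the sketch, not drawn from Clearview):

```python
import math

# Illustrative sketch only: real systems derive embeddings from a deep
# neural network; these 3-dimensional vectors are made-up stand-ins.

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def identify(query, database, threshold=0.9):
    """Return the identity whose embedding is most similar to the query,
    or None if no candidate clears the similarity threshold."""
    best_label, best_score = None, -1.0
    for label, embedding in database.items():
        score = cosine_similarity(query, embedding)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

# Toy "scraped" database mapping identities to face embeddings.
database = {
    "alice": (0.9, 0.1, 0.3),
    "bob": (0.2, 0.8, 0.5),
}

query = (0.88, 0.12, 0.31)  # embedding of an uploaded photo
print(identify(query, database))  # prints "alice"
```

Where the threshold is set matters: lowering it finds more matches but also produces more false identifications, which is the failure mode behind the misidentification rates discussed in this post.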
The state of the art in criminal investigations is to take fingerprints, scroll through surveillance footage, and extract DNA from biological samples. These techniques are harder to use and far slower than Clearview’s image search. But the same technology could also be used to identify every person who attended a political rally or protest. As the New York Times points out, it could also enable blackmail: anyone could be recorded in public, identified from their face, and then blackmailed unless they pay up.
Facial Recognition for Policing
Facial recognition technologies such as Clearview AI have become popular among police departments around the country as a tool for solving crimes quickly, sometimes within seconds of a search. From a policing standpoint, the use cases range from catching shoplifters to identifying John Does to solving murder cases. Perhaps the most notable example is the Anne Arundel County police using facial recognition to catch the gunman who entered the Capital Gazette newsroom, murdered five people, and injured three others. Because fingerprint identification was taking too long, the police department turned to facial recognition in the newsroom murder and found a match within minutes. Since Clearview AI runs searches mainly against social media pictures, it can find individuals with no prior criminal record who would not appear in traditional police photo databases. Hence, this technology is very enticing to those tasked with solving crimes.
Proponents of the software point to these successes in criminal apprehension and claim that getting suspected criminals off the street faster means safer communities. The technology is completely legal at the federal level and only sparsely legislated at the state and local levels. To drive home the app’s legality, Clearview hired Paul Clement, the former United States solicitor general under George W. Bush. When making the company’s pitch to the Atlanta Police Department, Clement told the agency that the technology “[does] not violate the federal Constitution or relevant existing state biometric and privacy laws when using Clearview for its intended purpose.” Law enforcement agencies generally do not publicize their use of this already secretive technology; as one agent put it, “it’s difficult for us to be transparent, because the more transparent we are, the more questions are raised.” So for law enforcement agencies, and for anyone with a keen interest in keeping their community safe, this technology looks like an obvious solution with the potential to revolutionize crime fighting.
Those against sacrificing privacy to facial recognition as a policing tactic are quick to point out the role that bias plays in these tools. The technology industry has long been plagued by a lack of diversity, with jobs disproportionately going to white men rather than racial minorities and women. With diverse voices and life experiences missing from product teams, software companies often fail to consider the needs of, and more importantly the dangers posed to, people affected by biases introduced during development. A 2018 study showcased this bias when three leading facial recognition tools were found to misidentify the gender of darker-skinned women around a third of the time, compared to only one percent for white males. The ACLU ran a similar test and found that Amazon’s Rekognition tool, a competitor to Clearview AI, misidentified 28 members of Congress as criminals, with a disproportionate share of those false matches belonging to Black and Latinx lawmakers. Given the racial discrimination all too often found in the police departments meant to serve areas with large minority populations, this software can exacerbate existing problems. Opponents also point out that when an individual officer discriminates against a group of people, the behavior is much easier to identify and correct than when it is locked inside a discriminatory machine learning algorithm.
Facial Recognition for Public Consumption
Clearview’s technology frightens many because of its potential for abuse. With the taboo around facial recognition slowly wearing off, it is easy to imagine facial recognition built into a pair of glasses that lets the wearer identify the name, occupation, phone number, and address of strangers walking past on the street in real time. Privacy advocates believe that stripping away public anonymity could lead to an increase in violence against women by making it easier for men to stalk and physically harm them.
It is nearly impossible to find people willing to state publicly that facial recognition tools should be available to the public. Despite the concerns, Clearview’s backers offer two arguments in support. David Scalzo, founder of Kirenaga Partners, an early investor in Clearview AI, states,
“I’ve come to the conclusion that because information constantly increases, there’s never going to be privacy. Laws have to determine what’s legal, but you can’t ban technology. Sure, that might lead to a dystopian future or something, but you can’t ban it.”
While laws can ban, and have banned, technology from reaching consumers’ hands in the past, such as Huawei’s 5G equipment over fears of China’s ability to use it to spy on American citizens, this idea resonates with those who do not stand to lose much from the use of this technology.
Another popular idea among those advocating for the advancement of facial recognition is that bad actors are bound to weaponize this technology one way or another, so the “good guys” should aggressively enter the space as well. That is Clearview CEO Hoan Ton-That’s viewpoint. When asked about making the app available to the public and the dangers that poses, Ton-That played coy, saying, “there’s always going to be a community of bad people who will misuse it,” before going on to say, “our belief is that this is the best use of the technology.”
Technological innovations often outpace public discourse and legislation, but when those innovations threaten the privacy of millions of people, it is imperative for the public to consider the ramifications. By scraping three billion images from social media sites, Clearview AI has positioned itself as the leader in facial recognition, and as a disruptor of privacy. While the company currently serves only law enforcement agencies and security teams at a handful of large companies, the American people need to reckon with what a future without public anonymity could look like before the full power of facial recognition software is unleashed. Some believe it is a worthwhile tool for police departments and agencies like the FBI and Homeland Security to have in order to protect citizens; others view it as a way to further deepen the racial inequalities already seen in policing. The future of this technology comes down to how much outcry there is over the loss of privacy. Some have already accepted a dystopian future without privacy, so it is up to privacy advocates to sound the alarm about the reach this technology has into all our daily lives.
(n.d.). Retrieved May 04, 2020, from https://www.facebook.com/apps/site_scraping_tos_terms.php
Cava, M., & Weise, E. (2018, June 29). Capital Gazette gunman was identified using facial recognition technology that’s been controversial. Retrieved May 04, 2020, from http://www.usatoday.com/story/tech/talkingtech/2018/06/29/capital-gazette-gunman-identified-using-facial-recognition-technology/744344002/
Ghaffary, S. (2019, December 10). How to avoid a dystopian future of facial recognition in law enforcement. Retrieved May 04, 2020, from https://www.vox.com/recode/2019/12/10/20996085/ai-facial-recognition-police-law-enforcement-regulation
Hill, K. (2020, January 18). The Secretive Company That Might End Privacy as We Know It. Retrieved May 04, 2020, from https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html
Huawei has been cut off from American technology. (n.d.). Retrieved May 04, 2020, from https://www.economist.com/business/2019/05/25/huawei-has-been-cut-off-from-american-technology
I Got My File From Clearview AI, and It Freaked Me Out. (n.d.). Retrieved May 4, 2020, from https://onezero.medium.com/i-got-my-file-from-clearview-ai-and-it-freaked-me-out-33ca28b5d6d4
Prison Policy Initiative. (n.d.). Police stops are still marred by racial discrimination, new data shows. Retrieved May 04, 2020, from https://www.prisonpolicy.org/blog/2018/10/12/policing/
Ng, A. (2020, February 26). Clearview AI’s entire client list stolen in data breach. Retrieved May 04, 2020, from https://www.cnet.com/news/clearview-ai-had-entire-client-list-stolen-in-data-breach/
O’Flaherty, K. (2020, February 29). Clearview AI’s Nightmare Just Got Worse: Here’s Why It Matters. Retrieved May 04, 2020, from https://www.forbes.com/sites/kateoflahertyuk/2020/02/28/the-clearview-ai-nightmare-just-got-worse-heres-why-it-matters-and-what-must-come-next/
Police: ‘Person of interest’ in custody after shooting of 2 Anne Arundel Co. detectives. (2020, February 07). Retrieved May 04, 2020, from https://wtop.com/anne-arundel-county/2020/02/2-anne-arundel-co-detectives-shot-during-chase-near-baltimore/
Romm, T. (2018, July 26). Amazon’s facial-recognition tool misidentified 28 lawmakers as people arrested for a crime, study finds. Retrieved May 04, 2020, from http://www.washingtonpost.com/technology/2018/07/26/amazons-facial-recognition-tool-misidentified-lawmakers-people-arrested-crime-study-finds/
Sargent, J. (2019, June 10). There’s a diversity problem in the tech industry and it’s not getting any better. Retrieved May 04, 2020, from https://sdtimes.com/softwaredev/theres-a-diversity-problem-in-the-tech-industry-and-its-not-getting-any-better/
Shwayder, M. (2020, January 27). Clearview AI Facial-Recognition App Is a Nightmare For Stalking Victims. Retrieved May 04, 2020, from https://www.digitaltrends.com/news/clearview-ai-facial-recognition-domestic-violence-stalking/
Twitter Terms of Service. (n.d.). Retrieved May 04, 2020, from https://twitter.com/en/tos