Taking a Clearview at Ethics

by Jude Tomas

Clearview has recently emerged as one of the most successful, albeit controversial, artificial intelligence (AI) startups. Its revolutionary software, dubbed Clearview AI, draws on a database of over 3 billion photos, almost eight times larger than the FBI's, and allows a user to find all public photos of a particular person simply by uploading a picture of them. Controversy has arisen over this potential invasion of privacy: the software scrapes every available photo of a given person from Facebook, YouTube, Venmo and other websites. The company nonetheless continues to sell its software, claiming that it does not violate any privacy laws and that law enforcement officers and security personnel constantly monitor the software for suspicious activity. This is not the case. On Feb. 28, Apple disabled Clearview AI's iOS application after finding that it violated Apple's rules on app distribution. The implications are significant, as customers can no longer use the software on their iPhones. Nevertheless, the software has already been downloaded and used by many, and the future of individuals' anonymity remains unclear.

Ever since the capabilities of Clearview AI's facial recognition system became known, more than 600 law enforcement departments have begun using the software, according to data from Clearview AI. Law enforcement officers have stated that Clearview AI has helped them solve shoplifting, identity theft, credit card fraud, murder and child sexual exploitation cases.

Clearview AI has not been sold only to law enforcement agencies, however. Clearview itself recently stated that it has also sold its software to companies for their “security personnel.” A recent leak of Clearview's client database revealed that over 2,200 private entities, including Walmart, Macy's and the National Basketball Association, have possession of the software. None of these are involved with law enforcement whatsoever. It is also important to note that Clearview was recently exposed for violating Apple's terms and conditions: it had been sidestepping the Apple App Store by encouraging users outside law enforcement to download a version of the app that is supposed to be reserved exclusively for developers. In short, Clearview has largely been dealing with private entities despite its claim that the software's primary use is law enforcement, and it has been doing so under the table, in violation of Apple's policies. As a result, no one knows what its clients are doing with the software. This only heightens the already prominent suspicions surrounding Clearview.

The current consensus is that Clearview will most likely have its license revoked permanently, although according to Hoan Ton-That, the founder of Clearview, the company is “working on complying to Apple’s terms of service.” Regardless, it is clear that, through whatever third-party distribution Clearview is using, private companies are employing this seemingly limitless software for unknown, potentially malicious purposes. There are no federal laws regulating the use of facial recognition, and Clearview has taken advantage of that vacuum by giving out free trials of its software. Given this, the future of individual privacy remains hazy; it may soon become impossible to walk down the street without having your face, name and other information taken by strangers.