Clearview AI

Summary

  1. Stakeholders & impact

  2. Relationships between the problem, the people, society, and the planet

Themes

  • Excessive surveillance

  • Government regulation & oversight

  • Transparency

  • Consent

  • Algorithmic bias

The wild west of facial recognition & the end of public anonymity

Imagine a world where you could readily identify a person with nothing more than a photo. With only a few taps, you could discover where they live, the businesses they frequent, and everything they've ever posted on the internet. Seems pretty creepy, right? The reality is, this experience already exists. Tech giants like Google, Amazon, and Microsoft possess the capability to bring it to life but have refrained from doing so because of how radically it would erode privacy.

Big tech's restraint in the absence of systematic federal regulation is admirable. Still, their virtuous actions haven't deterred outlaws from blazing a trail into the wild, wild west of facial recognition-powered surveillance.

The most infamous of the bunch is Clearview AI, a self-proclaimed "mission-driven" startup that offers a facial recognition tool to law enforcement agencies.

Now, the use of facial recognition by law enforcement isn't revolutionary -- agencies have been using the technology for over 20 years -- but they've been limited to searching government databases. Clearview AI broke the mold by offering law enforcement agencies their first-ever public database: one consisting of over 3 billion images scraped from every nook and cranny of the internet without their owners' consent.
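
To make the mechanism concrete, the sketch below (in Python) illustrates how a facial recognition search over such a database works in general terms: each face is reduced to a numeric embedding, and a probe photo is matched against indexed faces by similarity. This is a minimal, hypothetical illustration; the embeddings, labels, and threshold are invented for the example and do not reflect Clearview AI's actual system.

    # Hypothetical sketch: faces are stored as fixed-length embedding vectors and a
    # probe image is matched by cosine similarity against the indexed database.
    # This illustrates the general technique only, not Clearview AI's implementation.
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in database: 1,000 indexed faces, each a 128-dimensional embedding
    # (a real system would index billions of scraped images).
    db_embeddings = rng.normal(size=(1000, 128))
    db_labels = [f"person_{i}" for i in range(1000)]

    def identify(probe, threshold=0.6):
        """Return the best-matching identity for a probe embedding, or None."""
        # Normalize rows so the dot product equals cosine similarity.
        db = db_embeddings / np.linalg.norm(db_embeddings, axis=1, keepdims=True)
        p = probe / np.linalg.norm(probe)
        scores = db @ p
        best = int(np.argmax(scores))
        return db_labels[best] if scores[best] >= threshold else None

    # A probe close to entry 42 should resolve to "person_42".
    probe = db_embeddings[42] + rng.normal(scale=0.05, size=128)
    print(identify(probe))

In a real deployment, the embeddings would come from a face-detection and encoding model rather than random vectors, and the match step would use an approximate nearest-neighbor index to stay fast at the scale of billions of images.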

Today, Clearview AI is used by 2,400 law enforcement agencies, with virtually no oversight and little public knowledge of how it is used on a day-to-day basis.

In this case study, we'll examine Clearview AI's actions through the lens of responsible design to shed light on the potential positive and negative societal implications.

Stakeholders & Impact

Cohorts affected by the unregulated use of Clearview AI's facial recognition app by law enforcement agencies

Citizens

Positive impact:

  • The promise of safer communities

Negative impact:

  • Risk of being passively spied on without any reasonable suspicion or consent

  • Discourages nonconformist behavior, free association, and the exercise of the rights to assembly and expression without fear of being tracked

Marginalized Minority Groups

Positive impact:

  • The promise of safer communities

Negative impact:

  • At greater risk of unlawful arrest and discrimination due to algorithmic bias

  • At greater risk of enhanced racial profiling by law enforcement

Law Enforcement

Positive impact:

  • Ability to search a nationwide database of images to identify suspects of heinous crimes

Negative impact:

  • Risk of being sued for wrongful arrests based on algorithmic misidentification

  • Lack of transparency amplifies the public's mistrust

Clearview AI

Positive impact:

  • Lack of federal facial recognition regulations and oversight permits Clearview AI to use their product in any way they see fit without consequences

  • Positioning themselves as a mission-driven company and targeting law enforcement increases the likelihood of securing lucrative government contracts

Negative impact:

  • State privacy laws hinder potential growth

Relationships between the problem, the people, society, and the planet

The tensions that exist

Technology advancement <> lack of regulation

Legislation and government oversight have not kept pace with the speed of facial recognition innovation. As a result, we've relied on the industry's efforts to self-police, but the money on the table is too enticing for startups like Clearview AI to ignore -- the "move fast and break things" mentality is the status quo.

Lack of regulation <> law enforcement autonomy

The absence of any kind of systematic federal regulation or permitting process has left individual police departments to decide how to use facial recognition and what to share with the public.

As a result, federal and state law enforcement have limited knowledge of how facial recognition works and its potential pitfalls -- leading to misuse, abuse and racial profiling.

Law enforcement autonomy <> societal trust

There may be benefits to law enforcement's use of facial recognition, but federal and state departments have chosen to keep their use of the technology in an information vacuum.

Their lack of transparency and accountability has exacerbated society's existing mistrust of law enforcement institutions. The backlash against the use of facial recognition in law enforcement will likely grow stronger, no matter the potential upside.

Privacy concerns <> big tech's response to Clearview AI's image scraping

Data privacy and consent are touchy subjects for big tech: many of their business models rely on the monetization of user data.

Although they collectively sent cease-and-desist letters to Clearview AI, no legal action has been taken. It appears that they do not want to provoke the "user privacy" monster that looms in the shadows.