Every human has unique fingerprints that are easy to access, so fingerprints are an obvious way of performing biometric identification. Fingerprint biometrics date back to the late 19th century, when French and British law enforcement agencies began using fingerprints to identify suspects at crime scenes (Jain). Since then, the use of fingerprint biometrics for forensic purposes has expanded dramatically--in 2013, over 60 million fingerprints were processed and used for identification ("Next Generation Identification (NGI) Monthly Fact Sheet"). Fingerprints can also be used for authentication. In 2013, Apple introduced fingerprint biometrics as a method of authentication on its iPhone 5S: instead of typing a password to unlock their phones, users could place a finger on a capacitance-based sensor that identifies their fingerprint (Bonnington). With the advent of this technology, fingerprint biometrics became mainstream, raising ethical questions in their wake.
From a technical perspective, fingerprint recognition systems are fairly straightforward. A reading of the fingerprint is captured, and features are extracted and then either stored (for enrollment) or compared against stored features (for recognition). Fingerprint recognition systems accept several different types of input: some take images of the fingerprint, while others use optical or capacitive sensors to record it (Maltoni). Other systems combine these technologies with additional sensors, such as temperature sensors, in order to differentiate between a real and a fake fingerprint (Jain).
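The enroll-then-compare loop described above can be sketched in a few lines of code. The following is a minimal, purely illustrative sketch, not a real recognition algorithm: it assumes features have already been extracted as minutiae points, each represented as a hypothetical (x, y, angle) tuple, and the tolerances and threshold are invented for the example.

```python
import math

# Hypothetical minutia representation: (x, y, angle_in_degrees).
# A real system would extract these ridge features from the sensor reading.

def match_score(stored, candidate, dist_tol=10.0, angle_tol=15.0):
    """Fraction of the stored template's minutiae that pair with a
    candidate minutia within distance and angle tolerances (toy rule)."""
    matched = 0
    used = set()  # each stored minutia may be matched at most once
    for (cx, cy, ca) in candidate:
        for i, (sx, sy, sa) in enumerate(stored):
            if i in used:
                continue
            dist = math.hypot(cx - sx, cy - sy)
            # Smallest difference between two angles on a 360-degree circle.
            angle_diff = abs((ca - sa + 180) % 360 - 180)
            if dist <= dist_tol and angle_diff <= angle_tol:
                matched += 1
                used.add(i)
                break
    return matched / len(stored) if stored else 0.0

# Enrollment stores a template; recognition compares a fresh reading to it.
template = [(10, 20, 45), (50, 60, 90), (80, 30, 120)]
reading  = [(12, 21, 50), (49, 62, 85), (200, 200, 0)]  # slightly noisy capture
score = match_score(template, reading)
is_match = score >= 0.6  # acceptance threshold is a tunable assumption
```

Here two of the three template minutiae find a close partner in the reading, so the score is 2/3 and the reading is accepted; the threshold choice controls the trade-off between false accepts and false rejects.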
There are two main applications of fingerprint biometrics: identification and authentication. Law enforcement agencies have used fingerprints as an identification mechanism for over a century (Jain). Agencies such as the FBI can capture fingerprints found at crime scenes and compare them to a database of known fingerprints in order to identify and convict suspects ("Fingerprint Identification"). Fingerprints are also taken as part of arrest procedures and are included in Identity History Summaries, commonly known as criminal records. They can then be used to run background checks on employees by comparing an employee's fingerprints to his or her Identity History Summary ("Identity History Summary Checks"). Several states, including California, Colorado, Georgia, Hawaii, and Utah, require fingerprinting prior to issuing driver's licenses (Klimas), and the United States Department of Homeland Security requires fingerprints of international visitors to the United States ("Fact Sheet: Upgrade to 10-Fingerprint Collection"). In addition, the European Union is currently enacting fingerprint identification at its borders (Waterfield, Davies).
Currently, the most visible application of fingerprint recognition systems for authentication is Apple’s Touch ID system, which, as detailed above, allows users to unlock their phones using their fingerprints (Bonnington). Fingerprints are also used for authentication at ATMs throughout the world (Spence). There are also fingerprint sensors that can be used with computers as authentication systems, but these are much rarer (Alpert).
There are several characteristics of fingerprint biometrics that merit ethical discussion.
Firstly, fingerprints are incredibly easy to use as a method of authentication: all a user has to do is touch a finger to a sensor. This ease encourages people who would otherwise not secure their devices at all to use some form of authentication. From a utilitarian perspective, this is a good thing, because more users end up protecting their data. On the other hand, it can create a false sense of security: users might choose fingerprints over a more secure type of authentication, or may choose not to use other types of authentication in addition to fingerprints (Wright). From a deontological perspective, this is a bad thing: it is the duty of an authentication system to provide actual protection for a user’s data, but fingerprint systems may provide only a false sense of security.
Secondly, fingerprints can be faked. The barrier to faking fingerprints is high, but it is not insurmountable (Wright). For example, when Apple’s Touch ID was released, a group of hackers called the Chaos Computer Club published details of a working spoof within days of the iPhone 5S’s launch (Frank). From a deontological perspective, it is the duty of the creators of technology to provide authentication systems that cannot be broken, so it is wrong for creators to offer users authentication systems that can be. From a utilitarian perspective, if the end goal is to protect users’ data, this is similarly bad, because the system fails to achieve that goal.
Thirdly, fingerprints are unique to the individual. This is a good thing for authentication and identification purposes: the data used to identify or authenticate the user is complex, unique, and unpredictable, making it harder to attack (Jain). From a utilitarian perspective, this is good, because any user relying on fingerprints is, by default, using a secure identifier, increasing the security of the system as a whole. It also allows law enforcement agencies to identify perpetrators at a crime scene, which can ensure conviction of criminals and a reduction in crime overall ("Fingerprint Identification"). However, from a deontological perspective, this uniqueness can be dangerous. Because fingerprints are uniquely identifiable, a person can, in theory, be identified from anything that they touch (“Sen. Franken Questions Apple on Privacy Implications of New iPhone Fingerprint Technology”). People could be tracked and identified using their fingerprints, and hiding from this kind of privacy invasion would be nearly impossible (“Sen. Franken Questions Apple on Privacy Implications of New iPhone Fingerprint Technology”). Therefore, according to deontology, fingerprint biometrics fail in their duty to protect the privacy of users.
Lastly, fingerprints are immutable. Once a user's fingerprints have been discovered by a malicious actor, or posted publicly on the internet, they cannot be changed (Alterman). As with uniqueness, from a utilitarian perspective this is a good thing for the identification and prosecution of criminals ("Fingerprint Identification"). From the user’s perspective, however, we once again have a privacy problem. If a user relies on fingerprints to protect a bank account or governmental records, and those fingerprints are somehow released to the world, that protection is no longer secure. More troublingly, the user has no way to change his or her authentication method--meaning that the user’s data can never be re-secured if fingerprint protection is the only method available (Alterman).
Aside from these main issues, fingerprint biometrics raise other ethical problems. Fingerprints cannot be lost or forgotten (Jain), but it is far easier to force someone to provide a fingerprint than to force them to reveal a 15-digit password (Frank). This is bad from a deontological perspective, because it makes coercion easier, going against a deontological duty to deter theft and criminal behavior. Because fingerprints have been studied since the late 19th century, they can be matched with very high accuracy (Jain), which is a good thing from both deontological and utilitarian perspectives, because it means that fingerprint recognition is reliable. On the other hand, if a fingerprint sensor is damaged or scratched, its readings can be malformed (Bonnington), and fingerprints themselves can change with conditions: if the user’s hands are dry, oily, or sweaty, their fingerprints can appear different (Bonnington). These failure modes are bad from a deontological perspective, because the sensor may not always work correctly, which creates a frustrating user experience at best and can lock someone out of their device at worst.
All of these issues show that, from both deontological and utilitarian perspectives, some characteristics of fingerprint biometrics are clearly positive, and some are clearly negative. Fingerprint biometrics can serve as a unique and accurate method of identification, which is positive from a utilitarian perspective because it allows both secure authentication for users and identification of criminals. However, also from a utilitarian perspective, the fact that fingerprints are left everywhere and can be duplicated with enough effort means that the security and uniqueness of fingerprint biometrics can be broken, rendering them useless. From a deontological perspective, fingerprints create opportunities for privacy violations, both because a user can be tracked and identified by their fingerprints and because, once a fingerprint is discovered by a malicious attacker, it is compromised forever as a method of identification and authentication.
We believe that fingerprints should be used for identification but not as a sole means of authentication. It should be legal to use fingerprints to identify and prosecute criminals, and to use them as one form of identification. For authentication, fingerprints should never be used as a single-factor authentication system. More broadly, fingerprints should not be used in isolation for identification or authentication, especially in sensitive use cases. At government-controlled locations where identification is necessary, such as border control and the DMV, people should be required to present photo identification along with their fingerprints. When unlocking devices or accounts containing sensitive information, fingerprints should never be used alone; they should serve as a second factor in conjunction with a password or PIN.
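The two-factor policy recommended here is simple to express in code. The following is a minimal illustrative sketch, not a production authentication system: the function names, score threshold, and PIN handling are all assumptions invented for the example. Its one essential property is that a fingerprint match alone can never unlock anything; both factors must pass.

```python
import hmac

def fingerprint_matches(score, threshold=0.6):
    # Hypothetical: 'score' is the similarity reported by a fingerprint
    # matcher; the threshold is a tunable assumption.
    return score >= threshold

def pin_matches(entered_pin, stored_pin):
    # Constant-time comparison to avoid leaking information via timing.
    return hmac.compare_digest(entered_pin, stored_pin)

def authenticate(fp_score, entered_pin, stored_pin):
    # Both factors are required: something you are AND something you know.
    return fingerprint_matches(fp_score) and pin_matches(entered_pin, stored_pin)

print(authenticate(0.9, "1234", "1234"))  # prints True: both factors pass
print(authenticate(0.9, "0000", "1234"))  # prints False: fingerprint alone is not enough
```

Because the fingerprint check is combined with a knowledge factor, a stolen or spoofed fingerprint by itself is insufficient, which directly addresses the immutability and fakeability concerns discussed above.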