A fingerprint is an impression left by the friction ridges of a human finger. The recovery of partial fingerprints from a crime scene is an important method of forensic science. Moisture and grease on a finger result in fingerprints on surfaces such as glass or metal. Deliberate impressions of entire fingerprints can be obtained by ink or other substances transferred from the peaks of friction ridges on the skin to a smooth surface such as paper. Fingerprint records normally contain impressions from the pad on the last joint of fingers and thumbs, though fingerprint cards also typically record portions of lower joint areas of the fingers.
Human fingerprints are detailed, unique, difficult to alter, and durable over the life of an individual, making them suitable as long-term markers of human identity. They may be employed by police or other authorities to identify individuals who wish to conceal their identity, or to identify people who are incapacitated or deceased and thus unable to identify themselves, as in the aftermath of a natural disaster.
Their use as evidence has been challenged by academics, judges and the media. There are no uniform standards for point-counting methods, and academics have argued that the error rate in matching fingerprints has not been adequately studied and that fingerprint evidence has no secure statistical foundation.[1] Research has been conducted into whether experts can objectively focus on feature information in fingerprints without being misled by extraneous information, such as context.[2]
Fingerprints are impressions left on surfaces by the friction ridges on the finger of a human.[3] The matching of two fingerprints is among the most widely used and most reliable biometric techniques. Fingerprint matching considers only the obvious features of a fingerprint.[4]
Fingerprints consist mostly of water (95–99%), together with organic and inorganic constituents.[5] The organic component is made up of amino acids, proteins, glucose, lactate, urea, pyruvate, fatty acids and sterols. Inorganic ions such as chloride, sodium, potassium and iron are also present. Other contaminants, such as oils found in cosmetics, drugs and their metabolites, and food residues, may be found in fingerprint residues.[6]
A friction ridge is a raised portion of the epidermis on the digits (fingers and toes), the palm of the hand or the sole of the foot, consisting of one or more connected ridge units of friction ridge skin. These are sometimes known as "epidermal ridges", which are caused by the underlying interface between the dermal papillae of the dermis and the interpapillary (rete) pegs of the epidermis. These unique features form at around the 15th week of fetal development and remain until after death, when decomposition begins.[7] During the development of the fetus, around the 13th week of pregnancy, a ledge-like formation forms at the bottom of the epidermis beside the dermis. The cells along these ledges begin to proliferate rapidly. This rapid proliferation forms primary and secondary ridges. Both the primary and secondary ridges act as a template for the outer layer of the skin to form the friction ridges seen on the surface of the skin.
These epidermal ridges serve to amplify vibrations triggered, for example, when fingertips brush across an uneven surface, better transmitting the signals to sensory nerves involved in fine texture perception.[8] These ridges may also assist in gripping rough surfaces and may improve surface contact in wet conditions.[9]
Consensus within the scientific community suggests that the dermatoglyphic patterns on fingertips are hereditary.[10] The fingerprint patterns between monozygotic twins have been shown to be very similar (though not identical), whereas dizygotic twins have considerably less similarity. Significant heritability has been identified for 12 dermatoglyphic characteristics.[11] Current models of dermatoglyphic trait inheritance suggest Mendelian transmission with additional effects from either additive or dominant major genes.[12]
Whereas genes determine the general characteristics of patterns and their type, environmental factors result in the slight differentiation of each fingerprint. However, the relative influences of genetic and environmental effects on fingerprint patterns are generally unclear. One study has suggested that roughly 5% of the total variability is due to small environmental effects, although this estimate was based only on total ridge count as a metric. Several models of finger ridge formation mechanisms that lead to the vast diversity of fingerprints have been proposed. One model suggests that a buckling instability in the basal cell layer of the fetal epidermis is responsible for developing epidermal ridges.[13] Additionally, blood vessels and nerves may also serve a role in the formation of ridge configurations.[14] Another model indicates that changes in amniotic fluid surrounding each developing finger within the uterus cause corresponding cells on each fingerprint to grow in different microenvironments.[15] For a given individual, these various factors affect each finger differently, preventing two fingerprints from being identical while still retaining similar patterns.
The determination of fingerprint inheritance is made difficult by the vast diversity of phenotypes. Classification of a specific pattern is often subjective, as there is no consensus on the most appropriate characteristic to measure quantitatively, which complicates the analysis of dermatoglyphic patterns. Several modes of inheritance have been suggested and observed for various fingerprint patterns. Total fingerprint ridge count, a commonly used metric of fingerprint pattern size, has been suggested to have a polygenic mode of inheritance and is influenced by multiple additive genes. This hypothesis has been challenged by other research, however, which indicates that ridge counts on individual fingers are genetically independent and that there is a lack of evidence to support the existence of additive genes influencing pattern formation.[16] Another mode of fingerprint pattern inheritance suggests that the arch pattern on the thumb and on other fingers is inherited as an autosomal dominant trait.[17] Further research on the arch pattern has suggested that a major gene or multifactorial inheritance is responsible for arch pattern heritability.[18] A separate model for the development of the whorl pattern indicates that a single gene or group of linked genes contributes to its inheritance.[19] Furthermore, inheritance of the whorl pattern does not appear to be symmetric in that the pattern is seemingly randomly distributed among the ten fingers of a given individual. In general, comparison of fingerprint patterns between left and right hands suggests an asymmetry in the effects of genes on fingerprint patterns, although this observation requires further analysis.[20]
In addition to proposed models of inheritance, specific genes have been implicated as factors in fingertip pattern formation (their exact mechanism of influencing patterns is still under research). Multivariate linkage analysis of finger ridge counts on individual fingers revealed linkage to chromosome 5q14.1 specifically for the ring, index, and middle fingers.[21] In mice, variants in the gene EVI1 were correlated with dermatoglyphic patterns.[22] EVI1 expression in humans does not directly influence fingerprint patterns but does affect limb and digit formation which in turn may play a role in influencing fingerprint patterns. Genome-wide association studies found single nucleotide polymorphisms within the gene ADAMTS9-AS2 on 3p14.1, which appeared to have an influence on the whorl pattern on all digits.[23] This gene encodes antisense RNA which may inhibit ADAMTS9, which is expressed in the skin. A model of how genetic variants of ADAMTS9-AS2 directly influence whorl development has not yet been proposed.
In February 2023, a study identified WNT, BMP and EDAR as signaling pathways regulating the formation of primary ridges on fingerprints, with the first two having an opposite relationship established by a Turing reaction-diffusion system.[24] [25] [26]
Before computerization, manual filing systems were used in large fingerprint repositories.[27] A fingerprint classification system groups fingerprints according to their characteristics and therefore helps in the matching of a fingerprint against a large database of fingerprints. A query fingerprint that needs to be matched can therefore be compared with a subset of fingerprints in an existing database.[4] Early classification systems were based on the general ridge patterns, including the presence or absence of circular patterns, of several or all fingers. This allowed the filing and retrieval of paper records in large collections based on friction ridge patterns alone. The most popular systems used the pattern class of each finger to form a numeric key to assist lookup in a filing system. Fingerprint classification systems included the Roscher System, the Juan Vucetich System and the Henry Classification System. The Roscher System was developed in Germany and implemented in both Germany and Japan. The Vucetich System was developed in Argentina and implemented throughout South America. The Henry Classification System was developed in India and implemented in most English-speaking countries.[27]
In the Henry Classification System, there are three basic fingerprint patterns: loop, whorl, and arch,[28] which constitute 60–65 percent, 30–35 percent, and 5 percent of all fingerprints respectively.[29] There are also more complex classification systems that break down patterns even further, into plain arches or tented arches, and into loops that may be radial or ulnar, depending on the side of the hand toward which the tail points. Ulnar loops start on the pinky-side of the finger, the side closer to the ulna, the lower arm bone. Radial loops start on the thumb-side of the finger, the side closer to the radius. Whorls may also have sub-group classifications including plain whorls, accidental whorls, double loop whorls, peacock's eye, composite, and central pocket loop whorls.
The "primary classification number" in the Henry Classification System is a fraction whose numerator and denominator are whole numbers between 1 and 32 inclusive, thus classifying each set of ten fingerprints into one of 1024 groups. (To distinguish these groups, the fraction is not reduced by dividing out any common factors.) The fraction is determined by ten indicators, one for each finger, an indicator taking the value 1 when that finger has a whorl, and 0 otherwise. These indicators can be written
R_t, R_i, R_m, R_r, R_l for the thumb, index, middle, ring and little fingers of the right hand, and L_t, L_i, L_m, L_r, L_l for the left hand. The primary classification number is then

\frac{16R_i + 8R_r + 4L_t + 2L_m + 1L_l + 1}{16R_t + 8R_m + 4R_l + 2L_i + 1L_r + 1}.
For example, if only the right ring finger and the left index finger have whorls, then the set of fingerprints is classified into the "9/3" group:
\frac{16(0) + 8(1) + 4(0) + 2(0) + 1(0) + 1}{16(0) + 8(0) + 4(0) + 2(1) + 1(0) + 1} = \frac{9}{3}.
Note that although 9/3 = 3/1, the "9/3" group is different from the "3/1" group, as the latter corresponds to having whorls only on the left middle finger.
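The calculation can be illustrated with a short Python sketch; the function name and dictionary keys below are illustrative only and are not part of any standard implementation.

```python
# Minimal sketch of the Henry primary classification described above.
# Each indicator is 1 if the named finger shows a whorl, 0 otherwise.
# (Illustrative only; not an official or standard implementation.)

def henry_primary_classification(whorls):
    """whorls: dict mapping 'Rt','Ri','Rm','Rr','Rl','Lt','Li','Lm','Lr','Ll'
    (right/left thumb, index, middle, ring, little) to 1 or 0.
    Returns the unreduced (numerator, denominator) pair."""
    w = {k: whorls.get(k, 0)
         for k in ('Rt', 'Ri', 'Rm', 'Rr', 'Rl', 'Lt', 'Li', 'Lm', 'Lr', 'Ll')}
    numerator = 16*w['Ri'] + 8*w['Rr'] + 4*w['Lt'] + 2*w['Lm'] + 1*w['Ll'] + 1
    denominator = 16*w['Rt'] + 8*w['Rm'] + 4*w['Rl'] + 2*w['Li'] + 1*w['Lr'] + 1
    # The fraction is deliberately left unreduced: 9/3 and 3/1 are distinct groups.
    return numerator, denominator

# Worked example from the text: whorls only on the right ring and left index fingers.
print(henry_primary_classification({'Rr': 1, 'Li': 1}))  # -> (9, 3)
```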
Fingerprint identification, known as dactyloscopy,[30] ridgeology,[31] or hand print identification, is the process of comparing two instances of friction ridge skin impressions (see minutiae), from human fingers or toes, or even the palm of the hand or sole of the foot, to determine whether these impressions could have come from the same individual. The flexibility and the randomized formation of the friction ridges on skin means that no two finger or palm prints are ever exactly alike in every detail; even two impressions recorded immediately after each other from the same hand may be slightly different. Fingerprint identification, also referred to as individualization, involves an expert, or an expert computer system operating under threshold scoring rules, determining whether two friction ridge impressions are likely to have originated from the same finger or palm (or toe or sole).
In 2024, research using deep learning neural networks found, contrary to "prevailing assumptions", that fingerprints from different fingers of the same person could be identified as belonging to that individual with 99.99% confidence. Further, features used in traditional methods were not predictive for such identification, while ridge orientation, particularly near the center of the fingerprint, provided most of the information.[32]
An intentional recording of friction ridges is usually made with black printer's ink rolled across a contrasting white background, typically a white card. Friction ridges can also be recorded digitally, usually on a glass plate, using a technique called live scan. A "latent print" is the chance recording of friction ridges deposited on the surface of an object or a wall. Latent prints are invisible to the naked eye, whereas "patent prints" or "plastic prints" are viewable with the unaided eye. Latent prints are often fragmentary and require the use of chemical methods, powder, or alternative light sources in order to be made clear. Sometimes an ordinary bright flashlight will make a latent print visible.
When friction ridges come into contact with a surface that will take a print, material that is on the friction ridges such as perspiration, oil, grease, ink, or blood, will be transferred to the surface. Factors which affect the quality of friction ridge impressions are numerous. Pliability of the skin, deposition pressure, slippage, the material from which the surface is made, the roughness of the surface, and the substance deposited are just some of the various factors which can cause a latent print to appear differently from any known recording of the same friction ridges. Indeed, the conditions surrounding every instance of friction ridge deposition are unique and never duplicated. For these reasons, fingerprint examiners are required to undergo extensive training. The scientific study of fingerprints is called dermatoglyphics.
Exemplar prints, or known prints, are fingerprints deliberately collected from a subject, whether for purposes of enrollment in a system or when under arrest for a suspected criminal offense. During criminal arrests, a set of exemplar prints will normally include one print taken from each finger that has been rolled from one edge of the nail to the other, plain (or slap) impressions of each of the four fingers of each hand, and plain impressions of each thumb. Exemplar prints can be collected using live scan or by using ink on paper cards.
In forensic science, a partial fingerprint lifted from a surface is called a latent fingerprint. Moisture and grease on fingers result in latent fingerprints on surfaces such as glass. But because they are not clearly visible, their detection may require chemical development through powder dusting, the spraying of ninhydrin, iodine fuming, or soaking in silver nitrate.[33] Depending on the surface or the material on which a latent fingerprint has been found, different methods of chemical development must be used. Forensic scientists use different techniques for porous surfaces, such as paper, and nonporous surfaces, such as glass, metal or plastic.[34] Nonporous surfaces require the dusting process, where fine powder and a brush are used, followed by the application of transparent tape to lift the latent fingerprint off the surface.[34]
While the police often describe all partial fingerprints found at a crime scene as latent prints, forensic scientists call partial fingerprints that are readily visible patent prints. Chocolate, toner, paint or ink on fingers will result in patent fingerprints. Fingerprint impressions found in soft material, such as soap, cement or plaster, are called plastic prints by forensic scientists.[35]
Fingerprint image acquisition is considered to be the most critical step in an automated fingerprint authentication system, as it determines the final fingerprint image quality, which has a drastic effect on the overall system performance. There are different types of fingerprint readers on the market, but the basic idea behind each is to measure the physical difference between ridges and valleys.
All the proposed methods can be grouped into two major families: solid-state fingerprint readers and optical fingerprint readers. The procedure for capturing a fingerprint using a sensor consists of rolling or touching the finger onto a sensing area, which according to the physical principle in use (optical, ultrasonic, capacitive, or thermal) captures the difference between valleys and ridges. When a finger touches or rolls onto a surface, the elastic skin deforms. The quantity and direction of the pressure applied by the user, the skin conditions and the projection of an irregular 3D object (the finger) onto a 2D flat plane introduce distortions, noise, and inconsistencies in the captured fingerprint image. These problems result in inconsistent and non-uniform irregularities in the image.[36] During each acquisition, therefore, the results of the imaging are different and uncontrollable. The representation of the same fingerprint changes every time the finger is placed on the sensor plate, increasing the complexity of any attempt to match fingerprints, impairing the system performance and consequently limiting the widespread use of this biometric technology.
In order to overcome these problems, as of 2010, non-contact or touchless 3D fingerprint scanners have been developed. Acquiring detailed 3D information, 3D fingerprint scanners take a digital approach to the analog process of pressing or rolling the finger. By modelling the distance between neighboring points, the fingerprint can be imaged at a resolution high enough to record all the necessary detail.[37]
The human skin itself, which is a regenerating organ until death, and environmental factors such as lotions and cosmetics, pose challenges when fingerprinting a human. Following the death of a human, the skin dries and cools. Fingerprints of dead humans may be obtained during an autopsy.[38]
Fingerprints can be collected from a cadaver in several ways, depending on the condition of the skin. For a cadaver in the later stages of decomposition with dried skin, analysts may boil the skin to recondition and rehydrate it, allowing moisture to flow back into the skin and restoring the detail of the friction ridges.[39] Another method that has been used is brushing a powder, such as baby powder, over the tips of the fingers.[40] The powder embeds itself in the furrows between the friction ridges, allowing the ridges to be seen and lifted.
In the 1930s, criminal investigators in the United States first discovered the existence of latent fingerprints on the surfaces of fabrics, most notably on the insides of gloves discarded by perpetrators.[41]
Since the late nineteenth century, fingerprint identification methods have been used by police agencies around the world to identify suspected criminals as well as the victims of crime. The basis of the traditional fingerprinting technique is simple. The skin on the palmar surface of the hands and feet forms ridges, so-called papillary ridges, in patterns that are unique to each individual and which do not change over time. Even identical twins (who share their DNA) do not have identical fingerprints. The best way to render latent fingerprints visible, so that they can be photographed, can be complex and may depend, for example, on the type of surfaces on which they have been left. It is generally necessary to use a "developer", usually a powder or chemical reagent, to produce a high degree of visual contrast between the ridge patterns and the surface on which a fingerprint has been deposited.
Developing agents depend on the presence of organic materials or inorganic salts for their effectiveness, although the water deposited may also take a key role. Fingerprints are typically formed from the aqueous-based secretions of the eccrine glands of the fingers and palms with additional material from sebaceous glands primarily from the forehead. This latter contamination results from the common human behaviors of touching the face and hair. The resulting latent fingerprints consist usually of a substantial proportion of water with small traces of amino acids and chlorides mixed with a fatty, sebaceous component which contains a number of fatty acids and triglycerides. Detection of a small proportion of reactive organic substances such as urea and amino acids is far from easy.
Fingerprints at a crime scene may be detected by simple powders, or by chemicals applied in situ. More complex techniques, usually involving chemicals, can be applied in specialist laboratories to appropriate articles removed from a crime scene. With advances in these more sophisticated techniques, some of the more advanced crime scene investigation services from around the world were, as of 2010, reporting that 50% or more of the fingerprints recovered from a crime scene had been identified as a result of laboratory-based techniques.
Although there are hundreds of reported techniques for fingerprint detection, many of these are only of academic interest and there are only around 20 really effective methods which are currently in use in the more advanced fingerprint laboratories around the world.
Some of these techniques, such as ninhydrin, diazafluorenone and vacuum metal deposition, show great sensitivity and are used operationally. Some fingerprint reagents are specific, for example ninhydrin or diazafluorenone reacting with amino acids. Others, such as ethyl cyanoacrylate polymerisation, apparently work by water-based catalysis and polymer growth. Vacuum metal deposition using gold and zinc has been shown to be non-specific, but can detect fat layers as thin as one molecule.
More mundane methods, such as the application of fine powders, work by adhesion to sebaceous deposits and possibly aqueous deposits in the case of fresh fingerprints. The aqueous component of a fingerprint, while initially sometimes making up over 90% of the weight of the fingerprint, can evaporate quite quickly and may have mostly gone after 24 hours. Following work on the use of argon ion lasers for fingerprint detection,[42] a wide range of fluorescence techniques have been introduced, primarily for the enhancement of chemically developed fingerprints; the inherent fluorescence of some latent fingerprints may also be detected. Fingerprints can for example be visualized in 3D and without chemicals by the use of infrared lasers.[43]
A comprehensive manual of the operational methods of fingerprint enhancement was last published by the UK Home Office Scientific Development Branch in 2013 and is used widely around the world.[44]
A technique proposed in 2007 aims to identify an individual's ethnicity, sex, and dietary patterns.[45]
One of the main limitations of friction ridge impression evidence at the collection stage is the surface environment, specifically how porous the surface bearing the impression is.[46] On non-porous surfaces, the residues of the impression are not absorbed into the material but can be smudged by contact with another surface. On porous surfaces, the residues are absorbed into the surface. Either situation can result in an impression of no value to examiners or in the destruction of the friction ridge impression.
The ability of analysts to correctly and positively identify friction ridge patterns and their features depends heavily on the clarity of the impression.[47] [48] The analysis of friction ridges is therefore limited by clarity.
In a court context, many have argued that friction ridge identification and ridgeology should be classified as opinion evidence rather than fact, and should therefore be assessed as such.[49] Many have said that friction ridge identification is legally admissible today only because admissibility standards were quite low at the time it was added to the legal system.[50] Only a limited number of studies have been conducted to help confirm the science behind this identification process.
The application of the new scanning Kelvin probe (SKP) fingerprinting technique, which makes no physical contact with the fingerprint and does not require the use of developers, has the potential to allow fingerprints to be recorded while still leaving intact material that could subsequently be subjected to DNA analysis. A forensically usable prototype was under development at Swansea University during 2010, in research that was generating significant interest from the British Home Office and a number of different police forces across the UK, as well as internationally. The hope is that this instrument could eventually be manufactured in sufficiently large numbers to be widely used by forensic teams worldwide.[51] [52]
The secretions, skin oils and dead cells in a human fingerprint contain residues of various chemicals and their metabolites present in the body. These can be detected and used for forensic purposes. For example, the fingerprints of tobacco smokers contain traces of cotinine, a nicotine metabolite; they also contain traces of nicotine itself. Caution should be used, as its presence may be caused by mere contact of the finger with a tobacco product. By treating the fingerprint with gold nanoparticles with attached cotinine antibodies, and then subsequently with a fluorescent agent attached to cotinine antibodies, the fingerprint of a smoker becomes fluorescent; non-smokers' fingerprints stay dark. The same approach, as of 2010, is being tested for use in identifying heavy coffee drinkers, cannabis smokers, and users of various other drugs.[53] [54]
Most American law enforcement agencies use Wavelet Scalar Quantization (WSQ), a wavelet-based system for efficient storage of compressed fingerprint images at 500 pixels per inch (ppi). WSQ was developed by the FBI, the Los Alamos National Lab, and the National Institute of Standards and Technology (NIST). For fingerprints recorded at 1000 ppi spatial resolution, law enforcement (including the FBI) uses JPEG 2000 instead of WSQ.
Fingerprints collected at a crime scene, or on items of evidence from a crime, have been used in forensic science to identify suspects, victims and other persons who touched a surface. Fingerprint identification emerged as an important system within police agencies in the late 19th century, when it replaced anthropometric measurements as a more reliable method for identifying persons having a prior record, often under a false name, in a criminal record repository. Fingerprinting has served all governments worldwide during the past 100 years or so to provide identification of criminals. Fingerprints are the fundamental tool in every police agency for the identification of people with a criminal history.
The validity of forensic fingerprint evidence has been challenged by academics, judges and the media. In the United States fingerprint examiners have not developed uniform standards for the identification of an individual based on matching fingerprints. In some countries where fingerprints are also used in criminal investigations, fingerprint examiners are required to match a number of identification points before a match is accepted. In England 16 identification points are required and in France 12, to match two fingerprints and identify an individual. Point-counting methods have been challenged by some fingerprint examiners because they focus solely on the location of particular characteristics in fingerprints that are to be matched. Fingerprint examiners may also uphold the one dissimilarity doctrine, which holds that if there is one dissimilarity between two fingerprints, the fingerprints are not from the same finger. Furthermore, academics have argued that the error rate in matching fingerprints has not been adequately studied and it has even been argued that fingerprint evidence has no secure statistical foundation.[1] Research has been conducted into whether experts can objectively focus on feature information in fingerprints without being misled by extraneous information, such as context.[2]
Fingerprints can theoretically be forged and planted at crime scenes.[55]
Fingerprinting was the basis upon which the first forensic professional organization was formed, the International Association for Identification (IAI), in 1915.[56] The first professional certification program for forensic scientists was established in 1977, the IAI's Certified Latent Print Examiner program, which issued certificates to those meeting stringent criteria and had the power to revoke certification where an individual's performance warranted it.[57] Other forensic disciplines have followed suit and established their own certification programs.
Fingerprints have been found on ancient clay tablets, seals, and pottery. They have also been found on the walls of Egyptian tombs and on Minoan, Greek, and Chinese[58] pottery. In ancient China, officials authenticated government documents with their fingerprints. In about 200 BC, fingerprints were used to sign written contracts in Babylon.[59] Fingerprints can be extracted from 3D scans of cuneiform tablets using the GigaMesh Software Framework.
With the advent of silk and paper in China, parties to a legal contract impressed their handprints on the document. Sometime before 851 CE, an Arab merchant in China, Abu Zayd Hasan, witnessed Chinese merchants using fingerprints to authenticate loans.
References from the age of the Babylonian king Hammurabi (reigned 1792–1750 BCE) indicate that law officials would take the fingerprints of people who had been arrested.[60] During China's Qin dynasty, records have shown that officials took hand prints and foot prints as well as fingerprints as evidence from a crime scene.[61] In 650, the Chinese historian Kia Kung-Yen remarked that fingerprints could be used as a means of authentication.[62] In his Jami al-Tawarikh (Universal History), the Iranian physician Rashid-al-Din Hamadani (1247–1318) refers to the Chinese practice of identifying people via their fingerprints, commenting: "Experience shows that no two individuals have fingers exactly alike."[63] Whether these examples indicate that ancient peoples realized that fingerprints could uniquely identify individuals has been debated, with some arguing these examples are no more meaningful than an illiterate's mark on a document or an accidental remnant akin to a potter's mark on their clay.[64]
From the late 16th century onwards, European academics attempted to include fingerprints in scientific studies. But plausible conclusions could be established only from the mid-17th century onwards. In 1686, Marcello Malpighi, a professor of anatomy at the University of Bologna, identified ridges, spirals and loops in fingerprints left on surfaces. In 1788, the German anatomist Johann Christoph Andreas Mayer was the first European to conclude that fingerprints were unique to each individual.[65]
In 1823, Jan Evangelista Purkyně identified nine fingerprint patterns. The nine patterns include the tented arch, the loop, and the whorl, which in modern-day forensics are considered ridge details.[66] In 1840, following the murder of Lord William Russell, a provincial doctor, Robert Blake Overton, wrote to Scotland Yard suggesting checking for fingerprints.[67] In 1853, the German anatomist Georg von Meissner (1829–1905) studied friction ridges,[68] and in 1858, Sir William James Herschel initiated fingerprinting in India. In 1877, he first instituted the use of fingerprints on contracts and deeds to prevent the repudiation of signatures in Hooghly near Kolkata[69] and he registered government pensioners' fingerprints to prevent the collection of money by relatives after a pensioner's death.[70]
In 1880, Henry Faulds, a Scottish surgeon in a Tokyo hospital, published his first paper on the usefulness of fingerprints for identification and proposed a method to record them with printing ink.[71] Henry Faulds also suggested, based on his studies, that fingerprints are unique to each individual.[72] Returning to Great Britain in 1886, he offered the concept to the Metropolitan Police in London but it was dismissed at that time.[73] Up until the early 1890s, police forces in the United States and on the European continent could not reliably identify criminals to track their criminal record.[74] Francis Galton published a detailed statistical model of fingerprint analysis and identification in his 1892 book Finger Prints. He calculated that the chance of a "false positive" (two different individuals having the same fingerprints) was about 1 in 64 billion.[75] In 1892, Juan Vucetich, an Argentine chief police officer, created the first method of recording the fingerprints of individuals on file. In that same year, Francisca Rojas was found in a house with neck injuries, while her two sons were found dead with their throats cut. Rojas accused a neighbour, but despite brutal interrogation, this neighbour would not confess to the crimes. Inspector Álvarez, a colleague of Vucetich, went to the scene and found a bloody thumb mark on a door. When it was compared with Rojas' prints, it was found to be identical with her right thumb. She then confessed to the murder of her sons.[76] This was the first known murder case to be solved using fingerprint analysis.[77]
In Kolkata, a fingerprint Bureau was established in 1897, after the Council of the Governor General approved a committee report that fingerprints should be used for the classification of criminal records. The bureau employees Azizul Haque and Hem Chandra Bose have been credited with the primary development of a fingerprint classification system eventually named after their supervisor, Sir Edward Richard Henry.[78]
The French scientist Paul-Jean Coulier developed a method to transfer latent fingerprints on surfaces to paper using iodine fuming. This allowed Scotland Yard in London to start fingerprinting individuals and identifying criminals using fingerprints in 1901. Soon after, American police departments adopted the same method and fingerprint identification became a standard practice in the United States.[74] The Scheffer case of 1902 was the first case of the identification, arrest, and conviction of a murderer based upon fingerprint evidence. Alphonse Bertillon identified the thief and murderer Scheffer, who had previously been arrested and whose fingerprints had been filed some months before, from the fingerprints found on a fractured glass showcase, after a theft in a dentist's apartment where the dentist's employee was found dead. It was proved in court that the fingerprints had been made after the showcase was broken.[79]
The identification of individuals through fingerprints for law enforcement has been considered essential in the United States since the beginning of the 20th century. Body identification using fingerprints has also been valuable in the aftermath of natural disasters and anthropogenic hazards.[80] In the United States, the FBI manages a fingerprint identification system and database called the Integrated Automated Fingerprint Identification System (IAFIS), which currently holds the fingerprints and criminal records of over 51 million criminal record subjects and over 1.5 million civil (non-criminal) fingerprint records. OBIM, formerly U.S. VISIT, holds the largest repository of biometric identifiers in the U.S. government at over 260 million individual identities.[81] When it was deployed in 2004, this repository, known as the Automated Biometric Identification System (IDENT), stored biometric data in the form of two-finger records. Between 2005 and 2009, the DHS transitioned to a ten-print record standard in order to establish interoperability with IAFIS.[82]
In 1910, Edmond Locard established the first forensic lab in France.[74] Criminals may wear gloves to avoid leaving fingerprints. However, the gloves themselves can leave prints that are as unique as human fingerprints. After collecting glove prints, law enforcement can match them to gloves that they have collected as evidence or to prints collected at other crime scenes.[83] In many jurisdictions the act of wearing gloves itself while committing a crime can be prosecuted as an inchoate offense.[84]
The non-governmental organization (NGO) Privacy International in 2002 made the cautionary announcement that tens of thousands of UK school children were being fingerprinted by schools, often without the knowledge or consent of their parents.[85] That same year, the supplier Micro Librarian Systems, which uses a technology similar to that used in US prisons and the German military, estimated that 350 schools throughout Britain were using such systems to replace library cards. By 2007, it was estimated that 3,500 schools were using such systems.[86] Under the United Kingdom Data Protection Act, schools in the UK do not have to ask parental consent to allow such practices to take place. Parents opposed to fingerprinting may bring only individual complaints against schools.[87] In response to a complaint which they are continuing to pursue, in 2010, the European Commission expressed 'significant concerns' over the proportionality and necessity of the practice and the lack of judicial redress, indicating that the practice may break the European Union data protection directive.[88]
In March 2007, the UK government was considering fingerprinting all children aged 11 to 15 and adding the prints to a government database as part of a new passport and ID card scheme, while rejecting objections based on privacy concerns. All fingerprints taken would be cross-checked against prints from 900,000 unsolved crimes. Shadow Home Secretary David Davis called the plan "sinister". The Liberal Democrat home affairs spokesman Nick Clegg criticised "the determination to build a surveillance state behind the backs of the British people". The UK's junior education minister Lord Adonis defended the use of fingerprints by schools, to track school attendance as well as access to school meals and libraries, and reassured the House of Lords that the children's fingerprints had been taken with the consent of the parents and would be destroyed once children left the school.[89] An Early Day Motion which called on the UK Government to conduct a full and open consultation with stakeholders about the use of biometrics in schools secured the support of 85 Members of Parliament (Early Day Motion 686).[90] Following the establishment in the United Kingdom of a Conservative and Liberal Democratic coalition government in May 2010, the UK ID card scheme was scrapped.[91]
Serious concerns about the security implications of using conventional biometric templates in schools have been raised by a number of leading IT security experts,[92] one of whom has voiced the opinion that "it is absolutely premature to begin using 'conventional biometrics' in schools".[93] The vendors of biometric systems claim that their products bring benefits to schools such as improved reading skills, decreased wait times in lunch lines and increased revenues.[94] They do not cite independent research to support this view. One education specialist wrote in 2007: "I have not been able to find a single piece of published research which suggests that the use of biometrics in schools promotes healthy eating or improves reading skills amongst children... There is absolutely no evidence for such claims".[95]
The Ottawa Police in Canada have advised parents who fear their children may be kidnapped to fingerprint their children.[96]
A very rare medical condition, adermatoglyphia, is characterized by the absence of fingerprints. Affected persons have completely smooth fingertips, palms, toes and soles, but no other medical signs or symptoms.[97] A 2011 study indicated that adermatoglyphia is caused by the improper expression of the protein SMARCAD1.[98] The condition has been called immigration delay disease by the researchers describing it, because the congenital lack of fingerprints causes delays when affected persons attempt to prove their identity while traveling. Only five families with this condition had been described as of 2011.[99]
People with Naegeli–Franceschetti–Jadassohn syndrome and dermatopathia pigmentosa reticularis, which are both forms of ectodermal dysplasia, also have no fingerprints. Both of these rare genetic syndromes produce other signs and symptoms as well, such as thin, brittle hair. The anti-cancer medication capecitabine may cause the loss of fingerprints.[100] Swelling of the fingers, such as that caused by bee stings, will in some cases cause the temporary disappearance of fingerprints, though they will return when the swelling recedes.
Since the elasticity of skin decreases with age, many senior citizens have fingerprints that are difficult to capture. The ridges get thicker and the height between the top of the ridge and the bottom of the furrow narrows, so the ridges are less prominent.[101]
Fingerprints can be erased permanently and this can potentially be used by criminals to reduce their chance of conviction. Erasure can be achieved in a variety of ways, including simply burning the fingertips, using acids, and advanced techniques such as plastic surgery.[102] [103] [104] [105] [106] John Dillinger burned his fingers with acid, but prints taken during a previous arrest and upon his death still showed a near-complete correspondence to one another.[107]
Fingerprints can be captured as graphical ridge and valley patterns. Because of their uniqueness and permanence, fingerprints emerged as the most widely used biometric identifier in the 2000s. Automated fingerprint verification systems were developed to meet the needs of law enforcement and their use became more widespread in civilian applications. Despite being deployed more widely, reliable automated fingerprint verification remained a challenge and was extensively researched in the context of pattern recognition and image processing. The uniqueness of a fingerprint can be established by the overall pattern of ridges and valleys, or the logical ridge discontinuities known as minutiae. In the 2000s, minutiae features were considered the most discriminating and reliable feature of a fingerprint. Therefore, the recognition of minutiae features became the most common basis for automated fingerprint verification. The most widely used minutiae features used for automated fingerprint verification were the ridge ending and the ridge bifurcation.[108]
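As a rough illustration of a minutiae-based template, the following Python sketch stores each minutia as a position, local ridge direction and type (ridge ending or bifurcation), and scores two already-aligned impressions by counting compatible minutiae. This is a simplified sketch only: real verification systems additionally estimate the rotation and translation between impressions and use more robust scoring, and the class and function names here are illustrative assumptions.

```python
# Simplified sketch of a minutiae template and a naive comparison of two
# already-aligned impressions. Illustrative only; real matchers also align
# the prints and use more robust scoring.
from dataclasses import dataclass
from math import hypot

@dataclass(frozen=True)
class Minutia:
    x: float      # position in pixels
    y: float
    angle: float  # local ridge direction in degrees
    kind: str     # 'ending' or 'bifurcation'

def match_score(template, candidate, dist_tol=10.0, angle_tol=15.0):
    """Fraction of template minutiae with a nearby, similarly oriented
    candidate minutia of the same type."""
    matched = 0
    for t in template:
        for c in candidate:
            close = hypot(t.x - c.x, t.y - c.y) <= dist_tol
            # Signed angular difference wrapped into [-180, 180) degrees.
            similar = abs((t.angle - c.angle + 180) % 360 - 180) <= angle_tol
            if close and similar and t.kind == c.kind:
                matched += 1
                break
    return matched / max(len(template), 1)
```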
The three basic patterns of fingerprint ridges are the arch, the loop, and the whorl.
Scientists have found that family members often share the same general fingerprint patterns, leading to the belief that these patterns are inherited.[109]
Features of fingerprint ridges, called minutiae, include ridge endings, bifurcations, and short ridges (dots).[110]
See main article: Fingerprint scanner. A fingerprint sensor is an electronic device used to capture a digital image of the fingerprint pattern. The captured image is called a live scan. This live scan is digitally processed to create a biometric template (a collection of extracted features) which is stored and used for matching. Many technologies have been used including optical, capacitive, RF, thermal, piezoresistive, ultrasonic, piezoelectric, and MEMS.[111]
Since 2000, electronic fingerprint readers have been introduced in consumer electronics as security applications. Fingerprint sensors could be used for login authentication and the identification of computer users. However, some less sophisticated sensors have been discovered to be vulnerable to quite simple methods of deception, such as fake fingerprints cast in gels. In 2006, fingerprint sensors gained popularity in the laptop market. Built-in sensors in laptops, such as ThinkPad, VAIO, HP Pavilion and EliteBook models, can also double as motion detectors for document scrolling, like a scroll wheel.[112]
Two of the first smartphone manufacturers to integrate fingerprint recognition into their phones were Motorola with the Atrix 4G in 2011 and Apple with the iPhone 5S on September 10, 2013. One month later, HTC launched the One Max, which also included fingerprint recognition. In April 2014, Samsung released the Galaxy S5, which integrated a fingerprint sensor on the home button.[113]
Following the release of the iPhone 5S model, a group of German hackers announced on September 21, 2013, that they had bypassed Apple's new Touch ID fingerprint sensor by photographing a fingerprint from a glass surface and using that captured image as verification. The spokesman for the group stated: "We hope that this finally puts to rest the illusions people have about fingerprint biometrics. It is plain stupid to use something that you can't change and that you leave everywhere every day as a security token."[114] In September 2015, Apple included a new version of the fingerprint scanner in the iPhone home button with the iPhone 6S. The use of the Touch ID fingerprint scanner was optional and could be configured to unlock the screen or pay for mobile app purchases.[115] Since December 2015, cheaper smartphones with fingerprint recognition have been released, such as the $100 UMI Fair. Samsung introduced fingerprint sensors to its mid-range A series smartphones in 2014.[116]
By 2017, Hewlett Packard, Asus, Huawei, Lenovo and Apple were using fingerprint readers in their laptops.[117] [118] [119] Synaptics says the SecurePad sensor is now available for OEMs to start building into their laptops.[120] In 2018, Synaptics revealed that their in-display fingerprint sensors would be featured on the new Vivo X21 UD smartphone. This was the first mass-produced fingerprint sensor to be integrated into the entire touchscreen display, rather than as a separate sensor.[121]
Matching algorithms are used to compare previously stored templates of fingerprints against candidate fingerprints for authentication purposes. In order to do this either the original image must be directly compared with the candidate image or certain features must be compared.[122]
Pre-processing enhances the quality of an image by filtering out extraneous noise. The minutiae-based algorithm is only effective with 8-bit grayscale fingerprint images, because an 8-bit grayscale image is the fundamental base for converting the image to a 1-bit image with value 1 for ridges and value 0 for furrows. This process allows for enhanced edge detection, so the fingerprint is revealed in high contrast, with the ridges highlighted in black and the furrows in white. To further optimize the quality of the input image, two more steps are required: minutiae extraction and false minutiae removal. Minutiae extraction is carried out by applying a ridge-thinning algorithm that removes redundant pixels from the ridges. As a result, the thinned ridges of the fingerprint image are marked with a unique ID to facilitate further operations. After minutiae extraction, false minutiae removal is carried out: insufficient ink and cross-links among ridges can create false minutiae, which reduce the accuracy of the fingerprint recognition process.
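The binarization and ridge-thinning steps can be sketched as follows; the use of NumPy and scikit-image here is an assumption made for illustration, not a library choice the text prescribes.

```python
# Minimal sketch of binarization followed by ridge thinning, as described above.
# Assumes NumPy and scikit-image; these library choices are illustrative.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def binarize_and_thin(gray: np.ndarray) -> np.ndarray:
    """gray: 2-D uint8 array (8-bit grayscale fingerprint image).
    Returns a boolean image of one-pixel-wide ridge skeletons."""
    # Ridges are dark on a light background, so pixels below an automatically
    # chosen (Otsu) threshold are treated as ridge (1) and the rest as furrow (0).
    ridges = gray < threshold_otsu(gray)
    # Ridge thinning removes redundant ridge pixels, leaving a skeleton from
    # which minutiae such as endings and bifurcations can then be located.
    return skeletonize(ridges)
```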
Pattern-based algorithms compare the basic fingerprint patterns (arch, whorl, and loop) between a previously stored template and a candidate fingerprint. This requires that the images be aligned in the same orientation. To do this, the algorithm finds a central point in the fingerprint image and centers the image on it. In a pattern-based algorithm, the template contains the type, size, and orientation of patterns within the aligned fingerprint image. The candidate fingerprint image is graphically compared with the template to determine the degree to which they match.[123]
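A pattern-based comparison can be sketched as a pixel-wise correlation of windows cropped around the central point of each image; detection of that central (core) point, and normalisation of scale and rotation, are assumed to have been done beforehand, and the function names are illustrative only.

```python
# Minimal sketch of a pattern-based comparison: crop a window around the
# central (core) point of each image and compute a normalized correlation.
# Core detection and rotation/scale normalisation are assumed to be done elsewhere,
# and both core points are assumed to lie well inside their images.
import numpy as np

def crop_around(img: np.ndarray, center: tuple, half: int = 64) -> np.ndarray:
    cy, cx = center
    return img[cy - half:cy + half, cx - half:cx + half].astype(float)

def pattern_similarity(template_img, template_core, candidate_img, candidate_core):
    a = crop_around(template_img, template_core)
    b = crop_around(candidate_img, candidate_core)
    a = (a - a.mean()) / (a.std() + 1e-9)   # zero-mean, unit-variance windows
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())            # ~1.0 identical patterns, ~0 unrelated
```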
Some other animals have evolved their own unique prints, especially those whose lifestyle involves climbing or grasping wet objects; these include many primates, such as gorillas and chimpanzees, Australian koalas, and aquatic mammal species such as the North American fisher.[124] According to one study, even with an electron microscope, it can be quite difficult to distinguish between the fingerprints of a koala and a human.[125]
Mark Twain's memoir Life on the Mississippi (1883), notable mainly for its account of the author's time on the river, also recounts parts of his later life and includes tall tales and stories allegedly told to him. Among them is an involved, melodramatic account of a murder in which the killer is identified by a thumbprint.[126] Twain's novel Pudd'nhead Wilson, published in 1893, includes a courtroom drama that turns on fingerprint identification.
The use of fingerprints in crime fiction has, of course, kept pace with its use in real-life detection. Sir Arthur Conan Doyle wrote a short story about his celebrated sleuth Sherlock Holmes which features a fingerprint: "The Norwood Builder" is a 1903 short story set in 1894 and involves the discovery of a bloody fingerprint which helps Holmes to expose the real criminal and free his client.
The British detective writer R. Austin Freeman's first Thorndyke novel The Red Thumb-Mark was published in 1907 and features a bloody fingerprint left on a piece of paper together with a parcel of diamonds inside a safe-box. These become the center of a medico-legal investigation led by Dr. Thorndyke, who defends the accused whose fingerprint matches that on the paper, after the diamonds are stolen.
In the television series Bonanza (1959–1973), the Chinese character Hop Sing uses his knowledge of fingerprints to free Little Joe from a murder charge.
The 1997 movie Men in Black required Agent J to remove his ten fingerprints by putting his hands on a metal ball, an action deemed necessary by the MIB agency to remove the identity of its agents.
In the 2009 science fiction movie Cold Souls, a mule who smuggles souls wears latex fingerprints to frustrate airport security terminals. She can change her identity by simply changing her wig and latex fingerprints.