Biometric Myths: Six Of The Best
by Russ Davis - CEO of ISL Biometrics - Tuesday, 13 July 2004.
The Egyptians were clearly ahead of their time, as very little development in the field of biometrics occurred for around four thousand years. It was only in the late 1800s that people began developing systems that used fingerprints and other bodily characteristics to identify individuals. In 1880, for example, Henry Faulds, a Scottish doctor living in Japan, published his thoughts on the variety and uniqueness of fingerprints, suggesting that they could be used to identify criminals. Then, in 1900, the influential Galton-Henry system of fingerprint classification was published.

Other than a few isolated pieces of research into the uniqueness of the retina (which was finally turned into a workable product in 1985), the biometric industry remained fairly static until the 1960s, when the Miller brothers in New Jersey, USA, launched a device that automatically measured the length of people's fingers. Speaker verification and signature verification were also developed in the 1960s and 70s.

Interest from the US armed forces and intelligence agencies then emerged, but it wasn't until the turn of the century, and in particular until after 9/11, that awareness of biometrics broke out of specialized industry circles to reach the fever-pitch levels seen today.
