Biometric Myths: Six Of The Best
by Russ Davis - CEO of ISL Biometrics - Tuesday, 13 July 2004.
The Egyptians were clearly ahead of their time, as very little development in the field of biometrics occurred for around four thousand years. It was only in the late 1800s that people began developing systems that used fingerprints and other bodily characteristics to identify individuals. In 1880, for example, Henry Faulds, a Scottish doctor living in Japan, published his thoughts on the variety and uniqueness of fingerprints, suggesting that they could be used to identify criminals. Then, in 1900, the influential Galton-Henry system of fingerprint classification was published.

Other than a few isolated pieces of research into the uniqueness of the retina (which was finally turned into a workable product in 1985), the biometric industry remained fairly static until the 1960s, when the Miller brothers in New Jersey, USA, launched a device that automatically measured the length of people's fingers. Speaker verification and signature verification followed in the 1960s and 70s.

Interest from the US armed forces and intelligence agencies then emerged, but it wasn't until the turn of the century, and in particular after 9/11, that awareness of biometrics broke out of specialized industry circles to reach the fever-pitch levels seen today.
