A story of failed biometrics at a gym


From Jake Vinson’s “Cracking your Fingers” (The Daily WTF: 28 April 2009):

A few days later, Ross stood proudly in the reception area, hands on his hips. A high-tech fingerprint scanner sat at the reception area near the turnstile and register, as the same scanner would be used for each, though the register system wasn’t quite ready for rollout yet. Another scanner sat on the opposite side of the turnstile, for gym members to sign out. … The receptionist looked almost as pleased as Ross that morning as well, excited that this meant they were working toward a system that necessitated fewer manual member ID lookups.

After signing a few people up, the new system was going swimmingly. Some users declined to use the new system, instead walking to the far side of the counter to use the old touchscreen system. Then Johnny tried to leave after his workout.

… He scanned his finger on his way out, but the turnstile wouldn’t budge.

“Uh, just a second,” the receptionist furiously typed and clicked, while Johnny removed one of his earbuds and stared. “I’ll just have to manually override it…” but it was useless. There was no manual override option. Somehow, it was never considered that the scanner would malfunction. After several seconds of searching and having Johnny try to scan his finger again, the receptionist instructed him just to jump over the turnstile.

It was later discovered that the system required a “sign in” and a “sign out,” and if a member was recognized as someone else when attempting to sign out, the system rejected the input, and the turnstile remained locked in position. This was not good.
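The failure mode is a simple state-machine bug: the exit gate has a reject path but no escape from it. A minimal sketch (all class and method names here are hypothetical, not from the article) of the deployed logic, plus the manual override the installers never considered:

```python
class Turnstile:
    """Toy model of the gym's sign-in/sign-out pairing (hypothetical names)."""

    def __init__(self):
        self.signed_in = set()

    def sign_in(self, member_id):
        self.signed_in.add(member_id)

    def sign_out(self, scanned_id):
        # The deployed behavior: if the scan resolves to anyone who is not
        # currently signed in, reject the input and keep the gate locked.
        if scanned_id in self.signed_in:
            self.signed_in.remove(scanned_id)
            return "unlocked"
        return "locked"  # no other exit from this state existed

    def manual_override(self, member_id):
        # The missing feature: let the receptionist release the gate and
        # reconcile the sign-in record by hand.
        self.signed_in.discard(member_id)
        return "unlocked"
```

With an override path, a misread at the exit becomes a receptionist inconvenience instead of a member hurdling the turnstile.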

The scene repeated itself several times that day. Worse, the fingerprint scanner at the exit was getting kind of disgusting. Dozens of sweaty fingerprints required the scanner to be cleaned hourly, and even after it was freshly cleaned, it sometimes still couldn’t read fingerprints right. The latticed patterns on the barbell grips would leave indented patterns temporarily on the members’ fingers, there could be small cuts or folds on fingertips just from carrying weights or scrapes on the concrete coming out of the pool, fingers were wrinkly after a long swim, or sometimes the system just misidentified the person for no apparent reason.

Fingerprint Scanning

In much the same way that it’s not a good idea to store passwords in plaintext, it’s not a good idea to store raw fingerprint data. Instead, it should be hashed, so that the same input will consistently give the same output, but said output can’t be used to determine what the input was. In biometrics, there are many complex algorithms that can analyze a fingerprint via several points on the finger. This system was set up to record seven points.
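As a sketch of the analogy, assuming the seven points quantize to the same grid cells on every scan (which, as the story shows, real fingers do not guarantee), a one-way template digest might look like:

```python
import hashlib

def template_hash(points):
    """One-way digest of a fingerprint template.

    `points` is a list of (x, y) minutiae coordinates, already quantized
    to a coarse grid. Like a password hash: the same input always gives
    the same output, and the output doesn't reveal the input. (A plain
    hash only works if every scan quantizes identically; real matchers
    must tolerate noise, which is exactly where this system struggled.)
    """
    canonical = ",".join(f"{x}:{y}" for x, y in sorted(points))
    return hashlib.sha256(canonical.encode()).hexdigest()
```

Scan order doesn’t matter (the points are sorted first), but a single point landing in a neighboring grid cell changes every bit of the digest.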

After a few hours of rollout, though, it became clear that the real world doesn’t conform to how it should’ve worked in theory. There were simply too many variables, too many activities in the gym that could cause fingerprints to become altered. As such, the installers did what they thought was the reasonable thing to do – reduce the precision from seven points down to something substantially lower.

The updated system was in place for a few days, and it seemed to be working better; no more people being held up trying to leave.


… [The monitor] showed Ray as coming in several times that week, often twice on the same day, just hours apart. For each day listed, Ray had only come the later of the two times.

Reducing the precision of the fingerprint scanning resulted in the system identifying two people as one person. Reviewing the log, they saw that some regulars weren’t showing up in the system, and many members had two or three people being identified by the scanner as them.
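A back-of-envelope birthday calculation shows why dropping match points invites exactly this. Assuming (hypothetically; the article gives no figures) 16 quantization levels per point and 2,000 members, the chance that at least two members share a template jumps from under 1% at seven points to a near-certainty at three:

```python
import math

def collision_probability(members, template_space):
    """Birthday bound: P(at least two members share the same template)."""
    return 1 - math.exp(-members * (members - 1) / (2 * template_space))

LEVELS = 16      # hypothetical quantization levels per point
MEMBERS = 2000   # hypothetical membership size

p7 = collision_probability(MEMBERS, LEVELS ** 7)  # seven points: ~0.7%
p3 = collision_probability(MEMBERS, LEVELS ** 3)  # three points: ~100%
```

Shrinking the template space from 16^7 (about 268 million) to 16^3 (4,096) possible templates guarantees that a few thousand members will produce many collisions, which is precisely the “two or three people identified as one member” symptom.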

Give CLEAR your info, watch CLEAR lose your info

From “Missing SFO Laptop With Sensitive Data Found” (CBS5: 5 August 2008):

The company that runs a fast-pass security prescreening program at San Francisco International Airport said Tuesday that it found a laptop containing the personal information of 33,000 people more than a week after it apparently went missing.

The Transportation Security Administration announced late Monday that it had suspended new enrollments to the program, known as Clear, after the unencrypted computer was reported stolen at SFO.

The laptop was found Tuesday morning in the same company office where it supposedly had gone missing on July 26, said spokeswoman Allison Beer.

“It was not in an obvious location,” said Beer, who said an investigation was under way to determine whether the computer was actually stolen or had just been misplaced.

The laptop contained personal information on applicants to the program, including names, addresses and birth dates, and in some cases driver’s license, passport or green card numbers, the company said.

The laptop did not contain Social Security numbers, credit card numbers or fingerprint or iris images used to verify identities at the checkpoints, Beer said.

In a statement, the company said the information on the laptop, which was originally reported stolen from its locked office, “is secured by two levels of password protection.” Beer called the fact that the personal information itself was not encrypted “a mistake” that the company would fix.

Biometric photo watermarking using your iris

From Eric’s “Canon’s Iris Registration Mode – Biological Copyright Metadata” (Photography Bay: 9 February 2008):

A recent Canon patent application (Pub. No.: US 2008/0025574 A1) reveals the next step in digital watermarking – Iris Registration.

The short and sweet of it?

1. Turn the Mode dial to “REG”
2. Choose between “REG 1” through “REG 5” (for up to 5 registered users)
3. Put eye to viewfinder
4. Look at display of center distance measurement point
5. Press the shutter button
6. Iris image captured
7. Go shoot

Additional embedded info can be added later. All metadata is added to images collectively after you’re finished shooting, rather than to each image individually. The purpose of the collective tagging, if you will, is to avoid hampering the camera’s speed (frames per second) while shooting.
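The collective tagging described in the patent is a familiar pattern: keep slow work (metadata writes) out of the hot capture loop and batch it afterwards. A sketch with hypothetical function names:

```python
def shoot_burst(capture_frame, count):
    """Hot path: capture frames only; no metadata I/O inside the loop."""
    return [capture_frame() for _ in range(count)]

def tag_collectively(shots, iris_owner):
    """Cold path: after shooting ends, stamp every image with the
    registered iris ID in one batch, so frame rate was never affected."""
    return [{"image": shot, "iris_owner": iris_owner} for shot in shots]
```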

Court acceptance of forensic & biometric evidence

From Brendan I. Koerner’s “Under the Microscope” (Legal Affairs: July/August 2002):

The mantra of forensic evidence examination is “ACE-V.” The acronym stands for Analysis, Comparison, Evaluation, and Verification, which forensic scientists compare with the step-by-step method drilled into countless chemistry students. “Instead of hypothesis, data collection, conclusion, we have ACE-V,” says Elaine Pagliaro, an expert at the Connecticut lab who specializes in biochemical analysis. “It’s essentially the same process. It’s just that it grew out of people who didn’t come from a background in the scientific method.” …

Yet for most of the 20th century, courts seldom set limits on what experts could say to juries. The 1923 case Frye v. United States mandated that expert witnesses could discuss any technique that had “gained general acceptance in the particular field in which it belongs.” Courts treated forensic science as if it were as well-founded as biology or physics. …

In 1993, the Supreme Court set a new standard for evidence that took into account the accelerated pace of scientific progress. In a case called Daubert v. Merrell Dow Pharmaceuticals, the plaintiffs wanted to show the jury some novel epidemiological studies to bolster their claim that Merrell Dow’s anti-nausea drug Bendectin caused birth defects. The trial judge didn’t let them. The plaintiff’s evidence, he reasoned, was simply too futuristic to have gained general acceptance.

When the case got to the Supreme Court, the justices seized the opportunity to revolutionize the judiciary’s role in supervising expert testimony. Writing for a unanimous court, Justice Harry Blackmun instructed judges to “ensure that any and all scientific testimony or evidence admitted is not only relevant, but reliable.” Daubert turned judges into “gatekeepers” responsible for discerning good science from junk before an expert takes the stand. Blackmun suggested that good science must be testable, subject to peer review, and feature a “known or potential rate of error.” …

There are a few exceptions, though. In 1999, Judge Nancy Gertner of the Federal District Court in Massachusetts set limits on the kinds of conclusions a handwriting expert could draw before a jury in United States v. Hines. The expert could point out similarities between the defendant’s handwriting and the writing on a stick-up note, the judge said, but she could not “make any ultimate conclusions on the actual authorship.” The judge questioned “the validity of the field” of handwriting analysis, noting that “one’s handwriting is not at all unique in the sense that it remains the same over time, or unique[ly] separates one individual from another.”

Early this year, Judge Pollak stunned the legal world by similarly reining in fingerprint experts in the murder-for-hire case United States v. Plaza. Pollak was disturbed by a proficiency test finding that 26 percent of the crime labs surveyed in different states did not correctly identify a set of latent prints on the first try. “Even 100 years of ‘adversarial’ testing in court cannot substitute for scientific testing,” he said. He ruled that the experts could show the jury similarities between the defendants’ prints and latent prints found at the crime scenes, but could not say the prints matched. …

… West Virginia University recently offered the nation’s first-ever four-year degree in biometrics …

Problems with fingerprints for authentication

From lokedhs’ “There is much truth in what you say”:

The problem with fingerprints is that they’re an inherently insecure means of authentication, for two reasons:

Firstly, you can’t change it if it leaks out. A password or a credit card number can be easily changed and the damage minimised in case of an information leak. Doing this with a fingerprint is much harder.

Secondly, the fingerprint is very hard to keep secret. Your body has this annoying ability to leave copies of your identification token all over the place, very easy for anyone to pick up.
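Both points show up in a toy verifier. The sketch below (hypothetical names, not from the post) stores only a salted digest; a leaked password is revoked by rotating to a new secret, but “rotating” a fingerprint can only re-enroll the same finger, so a lifted print keeps working:

```python
import hashlib

def digest(secret, salt):
    return hashlib.sha256(salt + secret).hexdigest()

class Account:
    def __init__(self, secret, salt=b"v1"):
        self.salt = salt
        self.stored = digest(secret, salt)

    def verify(self, secret):
        return digest(secret, self.salt) == self.stored

    def rotate(self, new_secret, new_salt):
        # For a password this revokes the leaked credential outright.
        # A fingerprint user can only re-enroll the *same* finger, so
        # the copy left on every barbell still verifies afterwards.
        self.salt = new_salt
        self.stored = digest(new_secret, new_salt)
```

The damage control that works for credit card numbers and passwords simply has no analogue here: the identification token is the finger itself.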

OmniPerception = facial recognition + smart card

From Technology Review’s “Face Forward”:

To get around these problems, OmniPerception, a spinoff from the University of Surrey in England, has combined its facial-recognition technology with a smart-card system. This could make face recognition more robust and better suited to applications such as passport authentication and building access control, which, if they use biometrics at all, rely mainly on fingerprint verification, says David McIntosh, the company’s CEO. With OmniPerception’s technology, an image of a person’s face is verified against a “facial PIN” carried on the card, eliminating the need to search a central database and making the system less intimidating to privacy-conscious users. …

OmniPerception’s technology creates a PIN about 2,500 digits long from its analysis of the most distinctive features of a person’s face. The number is embedded in a smart card (such as those, say, that grant access to a building) and used to verify that the card belongs to the person presenting it. A user would place his or her card in or near a reader and face a camera, which would take a photo and feed it to the card. The card would then compare the PIN it carried to information it derived from the new photo and either accept or reject the person as the rightful owner of the card. The technology could also be used to ensure passport or driver’s license authenticity and to secure ATM or Internet banking transactions, says McIntosh.
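In effect this is match-on-card template comparison. A sketch, assuming (this detail is not in the article) that the roughly 2,500-digit facial PIN is a feature vector compared digit-by-digit against a fresh capture, with a tolerance for lighting and pose:

```python
def hamming(a, b):
    """Count of positions where two equal-length digit strings differ."""
    if len(a) != len(b):
        raise ValueError("PINs must be the same length")
    return sum(x != y for x, y in zip(a, b))

def card_verify(stored_pin, live_pin, tolerance=0.10):
    """Match-on-card check: accept the bearer if the PIN derived from the
    live photo differs from the card's embedded PIN in at most `tolerance`
    (a fraction) of its digits. No central database is consulted."""
    return hamming(stored_pin, live_pin) <= tolerance * len(stored_pin)
```

The threshold trades false accepts against false rejects; either way the card simply accepts or rejects, and the facial PIN never leaves it.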

Face recognition software as an example of “function creep”

From Technology Review’s “Creepy Functions”:

Consider one example of function creep. The Electoral Commission of Uganda has retained Viisage Technology to implement a face recognition system capable of enrolling 10 million voters in 60 days. The goal is to reduce voter registration fraud. But Woodward notes that the system might also be put to work fingering political opponents of the regime. And Uganda probably isn’t the first country that springs to mind when someone says “due process” or “civil rights.”

From Technology Review’s “Big Brother Logs On”:

Take the fact that the faces of a large portion of the driving population are becoming digitized by motor vehicles agencies and placed into databases, says Steinhardt. It isn’t much of a stretch to extend the system to a Big Brother-like nationwide identification and tracking network. Or consider that the Electoral Commission of Uganda has retained Viisage Technology to implement a “turnkey face recognition system” capable of enrolling 10 million voter registrants within 60 days. By generating a database containing the faceprint of every one of the country’s registered voters, and combining it with algorithms able to scour all 10 million images within six seconds to find a match, the commission hopes to reduce voter registration fraud. But once such a database is compiled, notes John Woodward, a former CIA operations officer who managed spies in several Asian countries and who’s now an analyst with the Rand Corporation, it could be employed for tracking and apprehending known or suspected political foes. Woodward calls that “function creep.”

Painter of kitsch … and security

From "Art for Everybody" in the 15 October 2001 issue of The New Yorker, an article about the immensely popular, incredibly kitschy painter Thomas Kinkade:

… ten million people own some product featuring his name, and most editions are signed with ink containing DNA from his hair or blood, to prevent fakes.