Americans have long accepted the presence of hidden security cameras that monitor banks, airports, office lobbies, and convenience stores. But over the past year, law enforcement agencies have sparked new privacy concerns by quietly linking surveillance cameras to computers, using software to scan the faces of ordinary citizens and instantly identify those with a criminal record.
In January, federal agents at the presidential inauguration wore tiny cameras designed to compare the facial features of onlookers with computerized images of suspected terrorists. That same month, police in Tampa, Florida, tested “facial recognition technology” at the Super Bowl, scanning 75,000 fans and running the images through a database of digitized mug shots. (The system made 19 matches, but police did not stop any of the suspects to confirm their identities.) And in June, Tampa police installed software in 36 cameras in the Ybor City entertainment district to routinely monitor the faces of pedestrians for wanted criminals.
The technology used in Ybor City, called FaceIt, was designed by Visionics Corp. of New Jersey, a leading developer of face-recognition systems. The software breaks faces into 80 distinct “landmarks,” which can then be compared to features in stored images almost instantaneously. Visionics has received $2 million from the Defense Department to adapt the idea for military uses, and the company says that a growing number of law enforcement agencies have expressed interest in the technology, especially in the wake of the September 11 terrorist attacks. West Virginia and Illinois already use versions of such software to confirm the identity of applicants for driver’s licenses and social services. As the systems spread, police are digitizing more mug shots to create larger databases, and the FBI has begun digitizing 40 million criminal records at its National Crime Information Center.
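FaceIt’s actual algorithm is proprietary, but the general idea described above — reducing a face to a fixed set of landmark measurements and comparing them against a database — can be illustrated with a minimal sketch. Everything here (the function names, the toy four-landmark vectors, the distance threshold) is hypothetical; real systems use many more measurements and far more sophisticated matching.

```python
import math

# Illustrative sketch only: each face is reduced to a fixed-length vector
# of landmark measurements (FaceIt reportedly uses about 80), and a probe
# face is matched against stored images by nearest-neighbor distance.

def match_face(probe, database, threshold=0.5):
    """Return the name of the closest database entry within `threshold`,
    or None if no stored face is close enough (i.e., no match)."""
    best_name, best_dist = None, float("inf")
    for name, landmarks in database.items():
        dist = math.dist(probe, landmarks)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# Toy database of 4-landmark vectors (a real system would store ~80
# measurements per digitized mug shot).
mugshots = {
    "suspect_a": [0.10, 0.42, 0.77, 0.31],
    "suspect_b": [0.55, 0.12, 0.90, 0.66],
}

print(match_face([0.11, 0.40, 0.78, 0.30], mugshots))  # prints suspect_a
print(match_face([0.99, 0.99, 0.01, 0.01], mugshots))  # prints None
```

The threshold is the crux of the privacy debate below: set it loosely and innocent pedestrians produce false matches; set it tightly and wanted suspects slip through.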
Armed with cameras linked to every mug shot in the nation, law enforcement officers would be more like Robocops, capable of recognizing anyone in a police database. But the emerging technology has raised myriad privacy concerns. Civil rights advocates question whether people who have been arrested but not convicted will be included in the databases, and who will have access to stored surveillance images. What’s more, government tests show that bad lighting or camera angles can produce false matches. “Police harassment of innocent people is a real possibility,” says Eric Rubin, an organizer with the Tampa Bay Action Group, a coalition that has staged masked protests against the technology.
So far, police in Tampa have made no arrests based on the face-scanning software. But officials in the East London borough of Newham, where 250 cameras installed by Visionics have scanned pedestrians for the past three years, insist that law-abiding citizens have nothing to fear. The technology, says town official Bob Lack, is not Big Brother, but more like “a friendly uncle and aunt watching over you.”