The Tsarnaev brothers, like anyone in a crowd of strangers, might have expected to be anonymous. But when the FBI released blurry, off-angle images of the two suspects in the Boston Marathon bombings, researchers with Carnegie Mellon University’s CyLab Biometrics Center began trying to bring them into focus.
In a real-time experiment, the scientists digitally mapped the face of “Suspect 2,” turned it toward the camera and enhanced it so it could be matched against a database. The researchers did not know how well they had done until authorities identified the suspect as Dzhokhar Tsarnaev, the younger, surviving brother and a student at University of Massachusetts Dartmouth.
“I was like, ‘Holy shish kabobs!’” Marios Savvides, director of the CMU CyLab, told the Tribune-Review. “It’s not exactly him, but it’s also not a random face. It does fit him.”
The technology, to be sure, remains in its infancy. Yet cyber experts believe it’s only a matter of years — and research dollars — until computers can identify almost anyone instantly. Computers then could use electronic data to immediately construct an intimate dossier about the person, much of it drawn from information people post online themselves.
From seeing just the image of a face, computers will find its match in a database of millions of driver’s license portraits and photos on social media sites. From there, the computer will link to the person’s name and details such as their Social Security number, preferences, hobbies, family and friends.
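The matching step described above is, at its core, a nearest-neighbor search: a face is reduced to a numeric feature vector, compared against every enrolled template, and the best match above a confidence threshold is linked to a profile. The sketch below illustrates that general idea only; the gallery entries, profile fields and `identify` function are hypothetical and do not represent any actual system's data or API.

```python
import math

# Hypothetical gallery: enrolled face feature vectors keyed by record ID,
# plus the profile data a match would link to. Real systems would hold
# millions of entries and use high-dimensional learned embeddings.
GALLERY = {
    "dl-1001": [0.11, 0.52, 0.88, 0.03],
    "dl-1002": [0.90, 0.14, 0.20, 0.75],
}
PROFILES = {
    "dl-1001": {"name": "Jane Doe", "hobbies": ["running"]},
    "dl-1002": {"name": "John Roe", "hobbies": ["sailing"]},
}

def cosine_similarity(a, b):
    """Similarity of two feature vectors, 1.0 meaning identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(probe, threshold=0.95):
    """Return the best-matching profile, or None if no entry is close enough."""
    best_id, best_score = None, -1.0
    for face_id, template in GALLERY.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = face_id, score
    return PROFILES[best_id] if best_score >= threshold else None
```

A probe vector close to an enrolled template returns that person's profile; one that resembles nobody in the gallery returns nothing, which is why threshold choice governs the trade-off between false matches and missed identifications.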
Adding that capability to drones — machines that can fly into spaces where planes cannot, track a person moving about and stay aloft for days — means that people will give up not only privacy but the very concept of anonymity.
“We are accustomed to living in a society where our movements are not tracked from place to place, and it’s a big shift to have that happen,” said Jennifer Lynch, staff attorney with the Electronic Frontier Foundation, a San Francisco-based nonprofit that works to protect digital rights and privacy.
“There’s so much data about us in different places that it’s absolutely impossible to keep track of it or to delete it. … Adding facial recognition capabilities to that will destroy anonymity and will create a pretty big chilling effect on how we feel about moving about in society and the choices we make in our lives.”
Inside the CyLab at Carnegie Mellon, an off-the-shelf drone with four rotors spins about the room. As it does, a camera looks into each face and sends images to a computer that dissects them into distinct markers that can be matched against a database.
Students working with Savvides are figuring out how to break up appearance into landmarks as unique as a fingerprint and to build a 3-D image from a single picture so it can be matched from different angles. “The things we can do are endless,” said Savvides. “We’re basically decoding the face.”
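One common way to turn facial landmarks into a comparable "fingerprint" is to measure the distances between landmark points and normalize them by a reference distance, such as the spacing between the eyes, so the signature survives changes in image scale. This is a minimal sketch of that general idea, not CMU's actual method; the landmark indices and `normalized_signature` function are illustrative assumptions.

```python
import math
from itertools import combinations

def normalized_signature(landmarks, left_eye=0, right_eye=1):
    """Turn 2-D landmark points into a scale-invariant feature vector:
    every pairwise distance divided by the interocular distance."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    iod = dist(landmarks[left_eye], landmarks[right_eye])
    return [dist(p, q) / iod for p, q in combinations(landmarks, 2)]

# Toy landmarks: two eyes, nose tip, mouth center. The same face
# photographed at twice the size yields an identical signature.
face = [(30, 40), (70, 40), (50, 60), (50, 90)]
scaled = [(x * 2, y * 2) for x, y in face]
```

Because both the pairwise distances and the interocular distance double when the image doubles in size, the ratios cancel and the two signatures come out equal, which is what lets a small template match photos taken at different distances.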
For now, the database holds only the images of lab workers and visitors who agree to participate. Savvides said he can envision a day when images collected by tiny cameras embedded in police cruisers and attached to officers’ uniforms are matched against a database of wanted criminals. As soon as a driver looks into a rear-view mirror to see an officer pulling up, the person’s face could be matched.
That technology does not yet exist, but the students have built a camera that collects facial identifiers from as far as 60 feet away.