How to Beat Facial-Recognition Software

Over the last decade, computers have become better at seeing faces. Software can tell whether there's a face in a camera's field of view, and law enforcement has been testing facial-recognition programs that can supposedly pick out suspects in a crowd. That's prompted an arms race between the people who build facial-recognition systems and those seeking ways to defeat them.

Facial-recognition software is becoming a bigger issue for privacy advocates as well. Surveillance cameras are already ubiquitous in the U.K., are showing up in more places in the U.S. and may increasingly be connected to facial-recognition systems.

"I went to a Kinko's a while ago," said Alex Kilpatrick, chief technology officer and co-founder of Tactical Information Systems, a company in Austin, Texas, that sells facial-recognition software to law enforcement and the military. "I saw three cameras just while I was standing in line. You see them in all kinds of places now."

The American Civil Liberties Union (ACLU) has said it is deeply concerned about the way facial-recognition systems are used. Police use such systems to flag criminals in public places, the ACLU says, but it argues that the Transportation Security Administration's (TSA) use of the technology at Boston's Logan Airport and at T.F. Green Airport near Providence, R.I., doesn't seem to have helped catch any criminals or terrorists.

Beating the machine

Kilpatrick gives presentations about the capabilities of facial-recognition systems, but is concerned about privacy as well. To him, it's every citizen's duty to throw the occasional monkey wrench into what he sees as Big Brother-like surveillance systems. So Kilpatrick started investigating how facial-recognition systems can be fooled.

It turns out that in order to do it, you have to know how machines think.

"People don't appreciate that humans are really good at certain types of face recognition," Kilpatrick said. "You can compress a picture a lot, and a person will recognize it — but a computer won't."

On the other hand, Kilpatrick said, people are not so good at identifying someone from an ID picture. That's why the TSA officers at the airport often have to look at you and your ID twice.

There is also a big difference between determining that what's in front of you is a face (finding a face) and linking it to a particular person (recognizing it). For both humans and machines, the first is almost trivial, but the second is much harder.

Kilpatrick ran a set of experiments with face-matching software from the biometrics company Neurotechnology that used an algorithm called Eigenfaces. He found that in some cases, sunglasses that cover a big part of the eye region and the cheekbones, such as the ones Audrey Hepburn wore in "Breakfast at Tiffany's," would fool the software enough that it couldn't match the image to a reference picture of the same person.
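Eigenfaces works by projecting each face image onto a small set of "principal" face patterns learned from a gallery of reference pictures, then comparing the projections. The sketch below shows the core idea in plain NumPy rather than in Neurotechnology's actual product; the gallery, the number of components and the distance threshold are illustrative assumptions, not values from Kilpatrick's tests.

```python
import numpy as np

def train_eigenfaces(gallery, n_components=20):
    """Learn a mean face and the top principal components ("eigenfaces")
    from a gallery of flattened, equal-sized grayscale face images."""
    X = np.asarray(gallery, dtype=np.float64)     # shape: (n_images, n_pixels)
    mean_face = X.mean(axis=0)
    centered = X - mean_face
    # SVD of the centered gallery; rows of Vt are the eigenfaces.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    eigenfaces = Vt[:n_components]
    weights = centered @ eigenfaces.T             # projections of the gallery
    return mean_face, eigenfaces, weights

def match(probe, mean_face, eigenfaces, weights, threshold=2500.0):
    """Project a probe image and return the index of the closest gallery
    face, or None if nothing is close enough. The threshold is an
    arbitrary value for this sketch."""
    w = (np.asarray(probe, dtype=np.float64) - mean_face) @ eigenfaces.T
    distances = np.linalg.norm(weights - w, axis=1)
    best = int(np.argmin(distances))
    return best if distances[best] < threshold else None
```

Because the leading eigenfaces are dominated by the eye and nose-bridge region, covering that area shifts the projection enough that the closest gallery face can fall outside the threshold, which lines up with what Kilpatrick saw with oversized sunglasses.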

On the other hand, a baseball cap or shadows falling on the face would not fool the software. A critical region, as it turns out, is the space around the bridge of the nose and between the eyes. So San Francisco Giants relief pitcher Brian Wilson's signature bushy black beard, for example, wouldn't make much difference. But hairstyles that cover half the face, or asymmetrical makeup, will.

But that's the rub: Different face-finding and face-recognition programs rely on different characteristics to decide that the picture in front of them contains a face. Matching that face to another picture uses yet another kind of algorithm. Even the most sophisticated computer programs can still be fooled, but you have to know what each program is looking for.

For example, software that focuses on the distance between the nose and ear won't even notice if you obscure the central part of the face. In other cases, software relies on the deviation from an idealized face, or from an "average" face generated from several people's images, to find a face in a picture.

Matching one face to another (recognition) might measure several landmark points and check those against a reference image. But even that, Kilpatrick notes, isn't as easy as it is often portrayed in Hollywood movies.
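A point-based matcher of the kind Kilpatrick describes reduces each face to distances between landmarks (eye corners, nose tip, ear points) and compares those against a reference. Here is a minimal sketch, with an invented landmark set and tolerance rather than any real system's values:

```python
import numpy as np

def landmark_signature(points):
    """Turn a list of (x, y) facial landmarks -- eye corners, nose tip,
    ear points, etc. -- into a scale-normalized vector of pairwise
    distances, so the same face photographed at different sizes still
    produces a similar signature."""
    pts = np.asarray(points, dtype=np.float64)
    idx_a, idx_b = np.triu_indices(len(pts), k=1)
    dists = np.linalg.norm(pts[idx_a] - pts[idx_b], axis=1)
    return dists / dists.max()

def same_person(points_a, points_b, tolerance=0.05):
    """Crude point-based match: two faces "match" if their distance
    signatures agree within a tolerance. Assumes the same landmarks
    were located, in the same order, in both images."""
    sig_a, sig_b = landmark_signature(points_a), landmark_signature(points_b)
    return float(np.mean(np.abs(sig_a - sig_b))) < tolerance
```

Obscure the landmarks such a system measures, say the ears or the nose tip, and the signature changes; obscure anything else and it barely notices, which is why different disguises defeat different programs.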

Wham, bam, thank you glam

Web designer Adam Harvey approached the problem of fooling Big Brother as something of an art project while he was studying at New York University. He made it part of his thesis project and came up with something called CV Dazzle. It's named after OpenCV, an open-source computer-vision library widely used for face detection, and the "dazzle ships" of World War I and World War II, which used abstract, high-contrast "dazzle" paint schemes to foil enemy range-finders.
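OpenCV ships with a stock frontal-face detector, a Viola-Jones Haar cascade, which is the kind of face finder a CV Dazzle look is meant to defeat. A minimal check of whether a given photo still registers as a face might look like the sketch below; the image paths are placeholders, and the cascade file is the one bundled with the opencv-python package.

```python
import cv2

def contains_face(image_path):
    """Return True if OpenCV's default Haar cascade finds at least one
    frontal face in the image at image_path."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0

# e.g. compare a plain portrait against the same face in "dazzle" makeup:
# contains_face("portrait.jpg"), contains_face("portrait_dazzle.jpg")
```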

"There's all this money for the military and corporate applications," Harvey said. "I wanted to do something to balance that."

Harvey's plan was not only to find out what would fool facial-recognition software into not even finding a face, but also to explore norms about what's socially acceptable to wear on a face. He said many of the designs that successfully fooled the software were pretty outré, with dramatic, high-contrast glam-rock makeup and artsy-looking hair.

"People have mentioned David Bowie and Adam Ant," Harvey said.

Harvey found that symmetry seems to be what face-finding software looks for. In one case, most of the model's face was left visible but the bridge of the nose was covered, and that alone confused the symmetry-seeking part of the program. Covering half the face seemed to work as well.

With the help of some artistically inclined friends, he came up with a set of hairstyles and makeup patterns that made the software unable to identify what it saw as a face.

For the CV Dazzle project, Harvey mainly used OpenCV. But now he's working on an application that will be able to determine what any facial-recognition algorithm, even one never before seen by the user, is looking for.

Harvey's new software presents a baseline photo over and over again to a face-finding program, with small alterations to the photo in each instance. Sometimes the program being tested will see a face and sometimes it won't; by tracking which alterations break detection, Harvey's software will eventually be able to tell you what the face-finding program is zeroing in on.

And once you know that, you can fool it. The advantage of this method, time-consuming as it may be, is that it doesn't require investigating the tested software's code.
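One way to implement that kind of black-box probing, though not necessarily the way Harvey's tool does it, is to slide an occluding patch across the baseline photo, re-run the detector on each altered copy, and record which occlusions make detection fail. Here is a sketch that assumes the detector is exposed as a simple function taking an image array and returning whether it found a face (the OpenCV check above could be adapted to that interface); the patch size and stride are arbitrary choices.

```python
import numpy as np

def probe_detector(image, detector_finds_face, patch=40, stride=20):
    """Black-box probe: occlude one square region of `image` at a time,
    re-run the detector on each altered copy, and return a map marking
    the regions whose occlusion makes the detector miss the face."""
    h, w = image.shape[:2]
    sensitivity = np.zeros((h, w), dtype=np.uint8)
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            altered = image.copy()
            altered[y:y + patch, x:x + patch] = 128   # flat gray patch
            if not detector_finds_face(altered):
                sensitivity[y:y + patch, x:x + patch] = 1
    return sensitivity   # 1 = occluding here defeats the detector
```

The regions the map marks, typically the eyes and the bridge of the nose for Haar-cascade-style detectors, are the ones worth covering with hair or makeup.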

Face in the crowd

However, Kilpatrick and Harvey both investigate facial recognition in controlled settings. What happens when such research is taken out into the real world?

Alessandro Acquisti, associate professor of information technology and public policy at Carnegie Mellon University in Pittsburgh, recently studied the issue of facial recognition when applied to social-networking websites, as well as how it applied to the difference between the real and online worlds.

Acquisti found that it wasn't all that hard to match public photos on an online-dating site, where many members used pseudonyms, to public photos on Facebook, where people have to use their real names. Nor was it difficult for Acquisti to match the faces of people walking across the Carnegie Mellon campus, which he shot with a simple webcam, to their Facebook profiles.

That was also a relatively controlled situation, because the number of possible faces on campus was limited, and the computing power necessary to recognize random faces is still quite large.

"Computer face recognition cannot yet allow the identification of every person all the time everywhere: even using cloud services, it still would take too much time for real-time application, and the false positives would be a huge obstacle," Acquisti told SecurityNewsDaily via email. "But 10 or 15 years from now? I am not so sure."

Acquisti added that it isn't likely that people will start wearing masks any time soon, or that social norms will allow people to wear masks, or even some of Harvey's weirder makeup designs, in public. (Acquisti did note that a baseball cap adorned with glowing LEDs will sometimes confuse face-finding algorithms.)

Ralph Gross, who worked on the facial-recognition studies with Acquisti, noted that false positives are indeed a problem for most facial-recognition software. But Gross added that the chance of identifying an unknown subject against a database of even 1.6 million criminal records is 92 percent using the best algorithms available. That percentage drops as the number of people goes up, but it is still quite good.

Poor success rate?

Yet the kinds of deployments Kilpatrick, Harvey and Acquisti have experimented with aren't how law enforcement and the military envision using facial-recognition programs. That's why the ACLU is worried.

At Super Bowl XXXV, played in 2001 in Tampa, Fla., facial-recognition software connected to security cameras in Raymond James Stadium flagged 19 people. None were more than petty criminals, and the most serious offense among them was ticket scalping. (Following the Sept. 11, 2001, terrorist attacks, Super Bowl security was taken over by the Secret Service, which hasn't disclosed its methods.)

Since then, the Pinellas County Sheriff's Department has deployed a newer system, which is used to match arrestees against a database of mug shots. Even so, a human being has to make the final call on whether there's a match.

The facial-recognition software that's been deployed at airports by the TSA over the past decade has failed to catch any terrorists. And the ACLU says it's far from clear how the TSA compiles its database of suspect faces: if digital cameras simply take pictures on public streets, then potentially everyone could end up in the database.

Acquisti said that while many people may want to fool facial-recognition software, it may one day be impossible to do so.

"The more researchers come up with techniques to hide or mask faces to avoid computer face recognition, the more other researchers will come up with techniques able to bypass those protections," he said. "The conditions under which a human face will not be recognizable by a computer will be the conditions under which also humans cannot recognize each other."

Image provided by Adam Harvey/ahprojects.com/DIS Magazine
