
Police are using flawed data in facial recognition searches, study finds

When the faces aren’t quite there, police have resorted to using celebrity doppelgangers, artist sketches and computer-generated images.

Alfred Ng Senior Reporter / CNET News

A facial recognition image matched with an artist sketch

Georgetown Law Center on Privacy & Technology

Police across the country are running facial recognition searches even when there's barely anything to match against.

A study from the Georgetown Law Center on Privacy and Technology released Thursday looked at how police are using flawed data to run facial recognition searches, despite years of studies showing these matches aren't reliable.

That includes using artist sketches, editing images to add eyes and lips, and searching for doppelgangers.

"You do not need to be an expert in artificial intelligence to understand that if you search for another person's face, that is not a suspect, there will be issues with the accuracy," said Alvaro Bedoya, the founding director of the Center of Privacy and Technology. 

Civil rights and privacy advocates have warned against government agencies and law enforcement using facial recognition, because there aren't any significant limits to how the technology can be used. On Tuesday, though, San Francisco became the first city to ban police use of facial recognition, and other cities are looking to do the same.  

Studies have found issues with accuracy and bias in facial recognition, and critics argue the technology poses a threat to privacy in public spaces. The study released Thursday turned up more issues with how police are using facial recognition.

When images caught on surveillance cameras are too blurry or don't show enough of a person's face, the New York Police Department has used pictures of celebrities who look like the suspect to make matches with its facial recognition program, the researchers found.

In April 2017, for instance, the NYPD used a photo of actor Woody Harrelson in its facial recognition search to find a suspect and make an arrest. The man was suspected of stealing a beer from a CVS, according to the report. In another case, it used a photo of a New York Knicks player to search for a man wanted for assault in Brooklyn, the researchers found.

A wanted poster from the NYPD and a photo of actor Woody Harrelson

When the NYPD couldn't use an image from surveillance footage, it used a photo of actor Woody Harrelson.

Georgetown Law Center on Privacy & Technology

The department says it stands by its practice.

"The NYPD has been deliberate and responsible in its use of facial recognition technology," NYPD spokeswoman detective Denise Moroney said in a statement. "We compare images from crime scenes to arrest photos in law enforcement records. We do not engage in mass or random collection of facial records from NYPD camera systems, the internet, or social media."

Records showed that the NYPD made more than 2,800 arrests stemming from facial recognition searches in the first five and a half years the technology was in use.

When no clear images were available, the NYPD, along with police in about 15 states, was allowed to use sketches instead. That includes police in Maryland, Virginia, Arizona, Florida and Oregon.


In Washington County, Oregon, which uses Amazon's Rekognition system, a presentation from a case study showed the sheriff's office using police sketches to make matches.

Police departments are running these searches despite multiple studies showing that sketches don't return accurate facial recognition results. The National Institute of Standards and Technology found that sketches had a very high error rate, noting that "sketch searches mostly fail."
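For context on how such a search is actually run, here is a minimal, hypothetical sketch, not the sheriff's office's actual code, of submitting a probe image to Amazon's Rekognition service with the boto3 SDK. The collection name, file name and similarity threshold are illustrative assumptions; the point is that the service returns ranked candidates for whatever image it's handed, whether that's a clear photograph or a scanned artist sketch.

import boto3

# The probe image is a placeholder: it could just as easily be a surveillance
# still, an artist sketch or an edited composite -- the API does not care.
rekognition = boto3.client("rekognition")

with open("probe_image.jpg", "rb") as f:
    probe_bytes = f.read()

# Search a hypothetical collection of booking photos for the closest faces.
response = rekognition.search_faces_by_image(
    CollectionId="booking-photos",   # assumed collection name
    Image={"Bytes": probe_bytes},
    MaxFaces=10,                     # return up to ten candidates
    FaceMatchThreshold=80,           # operator-chosen similarity cutoff, in percent
)

for match in response["FaceMatches"]:
    face = match["Face"]
    print(f'{face.get("ExternalImageId", face["FaceId"])}: {match["Similarity"]:.1f}% similar')

Nothing in that call distinguishes a genuine photograph from a sketch or a pasted-together face; the system simply returns its highest-scoring candidates above the threshold, which is why the quality of the probe image matters so much.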

In other cases, the Georgetown Law Center found that police departments will generate new facial features when parts of a face are missing from a photo. In one case, the NYPD pasted a pair of closed lips taken from a Google image onto a suspect's photo so the system could better match it against mugshots. Police have done the same with eyes.

"This is the wild west," Bedoya said. "Copying and pasting a different person's features and putting that on a suspect is unexplored territory."  

ACLU senior legislative counsel Neema Singh Guliani said, "Legislatures must stop the rights violations that are already resulting from government use of this technology. At the same time, companies like Amazon must take responsibility for irresponsibly selling and marketing this dangerous technology for surveillance purposes without regard for the consequences."

The image submitted to the facial recognition search could be a mostly fabricated face, researchers found.

The researchers also found that police would edit photos to get better matches from their facial recognition searches.

Georgetown Law Center on Privacy & Technology

"These techniques amount to the fabrication of facial identity points: at best an attempt to create information that isn't there in the first place and at worst introducing evidence that matches someone other than the person being searched for," the study said.

Police have said that facial recognition isn't intended to be conclusive evidence and serves only as an investigative lead, but researchers found cases where little investigative work went beyond the technology's match.

In one case, after making the facial recognition match, an officer sent the image to a witness in a text, writing, "Is this the guy?" That was all the confirmation the NYPD needed to make the arrest, the researchers said.

"Facial recognition is merely a lead; it is not a positive identification and it is not probable cause to arrest.  No one has ever been arrested on the basis of a facial recognition match alone," Moroney said.

The NYPD noted that its facial recognition program was used to find and arrest a man who threw urine at subway conductors, and another suspect who allegedly pushed a subway passenger on the tracks. The police department also said its facial recognition has led to arrests tied to homicides, rapes and robberies.

"The NYPD constantly reassesses our existing procedures and in line with that are in the process of reviewing our existent facial recognition protocols," Moroney said.

The department didn't comment on the quality of the data it uses for its facial recognition matches. 
Originally published May 16, 7:22 a.m. PT.
Updates, 8:33 a.m. and 9:30 a.m. PT: To add comments from the Georgetown Law Center on Privacy & Technology and the ACLU.