There’s a good chance your face is in a criminal investigative database somewhere, and the FBI has made it clear that it wants access to every database, a prospect that deeply troubles privacy advocates.
More than 400 million pictures of Americans’ faces are archived in local, state and federal law enforcement facial recognition networks, the federal Government Accountability Office reported last year.
Those pictures include the faces of about half of all U.S. adults, Georgetown Law’s Center on Privacy & Technology estimates.
The networks are largely unregulated and prone to ethnic and gender bias, according to experts, including a photographic technologist for the FBI itself. The databases draw on police mugshots, driver’s licenses, passports, visas, security video and other sources, sweeping in millions of Americans who aren’t even suspected of a crime.
“The FBI, in particular, and others are doing everything [they] can to build out facial recognition with the goal, essentially, of having everybody’s face in their database,” Rep. Jason Chaffetz, R-Utah, chairman of the House Committee on Oversight and Government Reform, said at a Washington, D.C., conference last month.
Which is why Chaffetz’s committee is holding a hearing Wednesday morning to investigate use of facial recognition technology by law enforcement agencies, particularly the FBI.
What Facial Recognition Is — and Isn’t
TV cop shows that depict technologists zooming in on someone’s face in a grainy video, pressing a few buttons and immediately getting a full dossier on that person greatly exaggerate the current capabilities of facial recognition technology.
In a paper published in the December 2012 edition of the journal IEEE Transactions on Information Forensics and Security, four authors — including a senior photographic technologist for the FBI — reported that facial recognition systems are less accurate at distinguishing identities among African-Americans, women and younger people.
The FBI system, in particular, “is not designed to give no for an answer,” Alvaro Bedoya, executive director of the Georgetown Privacy Center, said last month on the public radio podcast Criminal Injustice.
“No matter what, it will return a list of faces. And so, in these systems that are designed to not tell you no for an answer, when they miss the right suspect, they’re still going to give you a list of potential suspects that look like the candidate image,” he said. “And those innocent people will predominantly be African-Americans, women and young people.”
The GAO report found that facial recognition systems frequently return “prime candidate” profiles based on just one or two photos, showing only one or two angles.
The federal government’s own guidelines, set out by the National Institute of Standards and Technology, suggest using at least five images to determine a credible match. And if a subject is wearing “accessories that occlude facial features” — eyebrow studs or rings through the nose, for example — images should be obtained both with and without them.
In a letter to the Justice Department in October, a coalition of civil liberties and privacy groups contended that “such inaccuracies raise the risk that, absent appropriate safeguards, innocent African-Americans may mistakenly be placed on a suspect list or investigated for a crime solely because a flawed algorithm failed to identify the correct suspect.”
‘Technology Will Not Wait’
And yet, according to the GAO report, there is little independent testing for errors. Two of the major companies providing such systems, in fact, have said they don’t run such tests internally, either.
Internal FBI documents obtained in a Freedom of Information Act lawsuit by the nonprofit Electronic Privacy Information Center indicate that the FBI’s own database, called the Next Generation Identification Interstate Photo System, or NGI-IPS, had an acceptable margin of error of 20 percent — that is, a 1-in-5 chance of “recognizing” the wrong person.
And research published in the October 2015 issue of the scientific journal PLOS ONE by researchers at the universities of Sydney and New South Wales in Australia found that the humans who interpret such results add an error margin of their own approaching 30 percent.
Even so, the FBI is working to grow the number of state and local law enforcement agencies whose databases it can tap into. It already has agreements with 16 states allowing investigators to cross-check faces without court warrants, creating what the Georgetown Privacy Center called a “virtual perpetual lineup.”
“What the FBI is doing right now is creating a national biometric network that is primarily made up of law-abiding Americans,” Bedoya said on the Criminal Injustice podcast. “That’s a fundamental shift, and we think a problematic one.”
“Technology will not wait for us to answer these questions. Neither will law enforcement,” the Georgetown report last year concluded.
“It is time to enact 21st-century privacy protections for a 21st-century surveillance technology.”