March 29, 2024

Our Days of Noah

~THE WORLDWIDE FLOOD WAS NEVER THE PERMANENT SOLUTION~

THE FBI’S FACIAL RECOGNITION DATABASE COMBINES LO-RES PHOTOS WITH ZERO CIVIL LIBERTIES CONSIDERATIONS



Another FOIA lawsuit brought against the government by the EFF has resulted in the release of previously withheld documents. The papers cut loose this time detail the FBI’s facial recognition database and other parts of its “Next Generation Identification” (NGI) program, which aims to compile a sweeping collection of biometric data.

EPIC’s FOIA lawsuit over similar information revealed last year that the FBI’s facial recognition software (as of 2010) had an acceptable margin of error of 20%. With a 1-in-5 chance of “recognizing” the wrong person, the accuracy of the database had nowhere to go but up. But it appears the FBI prioritizes quantity over quality, as the first number to hit you from the “released” documents is a big one.

The records we received show that the face recognition component of NGI may include as many as 52 million face images by 2015. By 2012, NGI already contained 13.6 million images representing between 7 and 8 million individuals, and by the middle of 2013, the size of the database increased to 16 million images. The new records reveal that the database will be capable of processing 55,000 direct photo enrollments daily and of conducting tens of thousands of searches every day.

The millions of images come from a handful of sources. Only 46 million of them, however, will come from criminal databases. The other 6 million will come from other sources, not all of them necessarily related to criminal or terrorist activity.

[T]he FBI does not define either the “Special Population Cognizant” database or the “new repositories” category [which account for nearly a million images]… 

A 2007 FBI document available on the web describes SPC as “a service provided to Other Federal Organizations (OFOs), or other agencies with special needs by agreement with the FBI” and notes that “[t]hese SPC Files can be specific to a particular case or subject set (e.g., gang or terrorist related), or can be generic agency files consisting of employee records.”

These employee records may be tossed into the database along with the criminal records if the FBI chooses to assign them a Universal Control Number (UCN). And such records may become more common. As the EFF points out, if you submit your fingerprints as part of a pre-employment background check, they are added to the FBI’s database. If employers decide they also want a pre-employment mug shot, that will head the FBI’s way as well.
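To make that blending concrete, here is a minimal, purely hypothetical sketch of what a UCN-keyed pool looks like. The field names and the single blended pool are assumptions made for illustration; the released documents don’t describe NGI’s internal layout.

from dataclasses import dataclass

@dataclass
class EnrolledRecord:
    ucn: str        # Universal Control Number assigned on enrollment
    photo: bytes    # the enrolled face image
    source: str     # e.g. "criminal_booking" or "employment_background_check"

# Hypothetical enrollments: a booking mugshot and a pre-employment photo
# become peers in the same pool the moment each is keyed by a UCN.
pool = [
    EnrolledRecord("UCN-0000001", b"<mugshot bytes>", "criminal_booking"),
    EnrolledRecord("UCN-0000002", b"<hr photo bytes>", "employment_background_check"),
]

# In this sketch the UCN is just an opaque key, so any face search that
# ranks the pool treats both records identically: a background-check photo
# is exactly as eligible to come back as a "candidate" as a mugshot.
print({record.ucn: record.source for record in pool})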

The database will be populated with non-criminal photos and overseen by an agency that hasn’t provided an updated Privacy Impact Assessment for its facial recognition program since 2008. The low resolution of many of these images (often 0.75 megapixels or less) makes this blending of hit/non-hit photos even more problematic, as it means the FBI’s actual accuracy rate still hovers between 80 and 85 percent. But the agency has weasel-worded its way out of having to defend such a lousy accuracy rating.

[T]he FBI has disclaimed responsibility for accuracy, stating that “[t]he candidate list is an investigative lead not an identification.” 

Because the system is designed to provide a ranked list of candidates, the FBI states NGI never actually makes a “positive identification,” and “therefore, there is no false positive rate.”

The FBI generates a “top 50 candidates” report from searches, which it claims is only an “investigative lead,” not a positive identification of anyone. That’s some remarkably devious dissembling. The agency won’t ever be wrong because it’s not even trying to be right!

So, how exactly is this supposed to aid in investigations, if the best results are a grab bag of low-res photos dredged from a variety of sources, some of them non-criminal? The FBI doesn’t say. All it says is that the “true candidate” will show up on the “top 50 list” 85% of the time — and then only if the “true candidate” is already present in the database. The EFF asks the question the FBI hasn’t asked itself, or at least hasn’t shown any interest in answering honestly.

It is unclear what happens when the “true candidate” does not exist in the gallery—does NGI still return possible matches? Could those people then be subject to criminal investigation for no other reason than that a computer thought their face was mathematically similar to a suspect’s?
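The answer is almost certainly yes, at least if NGI’s face search behaves like an ordinary ranked similarity search, which is what “a ranked list of candidates” describes. A minimal sketch of that behavior follows; the gallery size, template dimensions, and function names are invented for illustration and are not taken from the FBI’s documents.

import numpy as np

def top_candidates(probe, gallery, n=50):
    """Rank every gallery template against a probe and return the top n.

    The list is filled by rank rather than by any confidence threshold,
    so it always holds n "candidates" -- even when the person in the
    probe photo was never enrolled in the gallery at all.
    """
    # Cosine similarity between the probe template and each gallery template.
    gallery_unit = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    probe_unit = probe / np.linalg.norm(probe)
    scores = gallery_unit @ probe_unit
    best = np.argsort(scores)[::-1][:n]
    return [(int(i), float(scores[i])) for i in best]

# Hypothetical numbers: 100,000 enrolled 128-dimensional face templates,
# and a probe face belonging to someone who is not among them.
rng = np.random.default_rng(seed=0)
gallery = rng.normal(size=(100_000, 128))
probe = rng.normal(size=128)

candidates = top_candidates(probe, gallery)
print(len(candidates))   # 50 -- the report is never empty
print(candidates[:3])    # the "best" matches, however weak the similarity

Fifty people get flagged either way; the only open question is whose faces happen to score highest against a stranger’s photo.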

The FBI’s “answer” shifts all the accountability to other law enforcement agencies.

[T]he Bureau notes that because “this is an investigative search and caveats will be prevalent on the return detailing that the [non-FBI] agency is responsible for determining the identity of the subject, there should be NO legal issues.”

The FBI, which hasn’t updated its privacy protections in half a decade, which knows that a majority of the photos in its database have a resolution only slightly above “useless,” and which sees no problem with throwing photos of criminals and non-criminals into the same database, has yet to see any significant pushback on its NGI expansion from anyone tasked with overseeing the agency. The fact that these documents were forced free via a FOIA lawsuit shows the FBI has no interest in sharing this info with the public. As for our representatives, they either don’t know or don’t care, neither of which should make the represented happy.

This program has some very serious issues, and they are only going to get worse unless someone outside the FBI intervenes. The agency’s caveat emptor-esque “policy” (“not our fault if you arrest the wrong pixelated suspect”) governing law enforcement’s use of its intermingled good guy/bad guy database makes it obvious that the FBI has no interest in policing itself.

Source: techdirt.com
