Monday, April 28, 2014

What You Should Know About FBI's Giant Biometric Database

The FBI's Next Generation Identification is expected to house 52 million photos searchable by face recognition technology by next year.
April 22, 2014

Since 2008, the FBI has been hard at work transforming its massive fingerprint database (IAFIS) into an even more massive biometric database called Next Generation Identification. NGI will include iris scans, palm prints and images of faces that can be scanned using face recognition technology and matched to age, race, address, ID number, and immigration status, among other things. 

According to documents obtained by the Electronic Frontier Foundation, the FBI's efforts to amass as much identifying data as possible are going quite well: the database, which contained 13.6 million images of 7 to 8 million people in 2012, is expected to hold 52 million photos by next year. Most of the images come from local and state law enforcement. Although the FBI has said it doesn't intend to collect social media images, there are no rules in place barring the accumulation of pictures from sources other than law enforcement.

AlterNet spoke with Jennifer Lynch of the Electronic Frontier Foundation on the privacy dangers posed by NGI. 
Tana Ganeva: What's the goal of Next Generation Identification? Why aren't fingerprints good enough? 
Jennifer Lynch: I think it's a good question about why fingerprints aren't good enough. I think you'd have to talk to the FBI to get their opinion on that. My sense is that a possible reason is that other types of biometrics can be captured without having to come into actual physical contact with a person. For example, with face recognition you can take a picture of somebody even without their knowledge and you can do it from a certain distance. 

The NGI database also includes iris scans and palm prints. Iris scans are already being used inside prisons to monitor prisoners as they move from location to location within the prison. One of the reasons is that you can collect an iris scan without having to be in the same physical space as the prisoner. So that could be one reason. The FBI has said in the past that having more biometrics, and different types of biometrics, allows them to identify people better, but I don't know if that justifies having so much biometric information on people.

Some of it might also be driven by the information-sharing environment within the federal government. So the Department of Defense has a database that's somewhat interoperable with the FBI's database. And DoD has been collecting multiple forms of biometrics, so that could also be driving the change. 

TG: Are there any barriers to how they can share the information?

JL: It's a good question, but I don't think there are any blocks, certainly not with DoD sharing information with the FBI. Now, DoD is not supposed to be collecting information on US citizens, so I think there would be a real issue if they had unfettered access to the FBI's database.

TG: What about the role of local and state law enforcement?

JL: Probably most of the records are coming from local law enforcement: 18,000 tribal, state, and local law enforcement agencies across the country provide biographic and biometric information to the FBI's database. So every time a mugshot's taken by one of these local agencies, it gets uploaded and stored by the FBI in addition to being stored in whatever state-level criminal database there might be. So a lot of the data that's already in the FBI's database is coming from state and local law enforcement. They can also access it.

We've gotten some memoranda of understanding between the FBI and some of the states. These are states ... that were already using face recognition in their criminal mugshot databases, and those states have volunteered to turn over their whole databases of mugshots to the FBI, which all of a sudden gets a whole state's database of biographic and biometric information.

TG: One of the seemingly more disturbing developments is that they're collapsing the barrier between the criminal and non-criminal sections. Why do you suppose they're doing that, and what problems might arise?

JL: I think what they're trying to do is just come up with a uniform classification system. So it seems like just an outgrowth of trying to classify information. 

But the problem is that then of course people who've had to supply their biometric information for a non-criminal purpose are having their information searched every time there's a criminal search. And if you're providing your information just to get a job or to do a background check, that shouldn't go into some sort of criminal database to be scanned every time a law enforcement investigation is taking place. 

TG: The estimated breakdown between criminal and non-criminal images is about 46 million to 4 million. But even the so-called "criminal images" are just arrestees, so anyone who's had a run-in with police would be there, like low-level offenders, people picked up in a protest, etc.

JL: Yeah, that definitely puts you in the criminal database. Another thing we've seen is the increased use of mobile biometric collection devices. It's happened mostly with mobile fingerprint scanners, but in a few different areas, most notably in San Diego County, a bunch of law enforcement agencies are using tablets to take pictures of people right on the street. I think the big issue with that is that it takes a lot for an officer to go out on the street, find somebody engaged in criminal conduct, bring that person into the station, and have a booking photo taken. It takes almost nothing to stop a person on the street and take their picture. And you take out all those operational safeguards that might stop an officer from collecting a facial recognition image.

You increase the risk of racial profiling and some of the issues we've seen with, for example, stop-and-frisk in New York. 

TG: That's interesting, because one could argue that this is just your picture in a database somewhere, unlike more physical, violent intrusions like stop-and-frisk. What are some of the ways something like this could really, concretely, harm somebody?

JL: So, face recognition is not infallible. Under controlled lighting conditions, and when the angle of a person's face is controlled, it's relatively accurate at identifying somebody. But if you're trying to compare a surveillance camera photo of a person on the street with a mugshot in a database, there are a lot of false positives: a lot of people listed as looking like that person who were never anywhere near the crime scene.

It means those people now have to worry about defending their innocence. What if you were sleeping alone that night? Or doing something that's not criminal, but that maybe you don't want people to know about, and all of a sudden you have to give that information to the police just to defend your innocence.

TG: Beyond the spooky sci-fi aspect, why would you say face recognition is more alarming than other biometrics?

JL: I think for a few different reasons. One is that it's not really as accurate at matching face image to face image as fingerprinting is at matching fingerprint to fingerprint. But the other reason is that it's possible to capture a face image at a distance. It's very easy. And there are surveillance cameras everywhere. Once face recognition technology improves, it will be possible to monitor people as they travel through society. It would be impossible to be anonymous in society anymore. And that has a chilling effect on people's speech, their activities, and their willingness to engage in discussion, religious discussion, or to associate with people they might not know.

And we've seen that with Muslim communities in New York; there's been some great research done on this, and facial recognition just increases the risks.

TG: The FBI's response to concerns about privacy abuses and dangers is that it's harmless because it just generates investigative leads, not concrete findings. 

JL: It seems to be their argument. I think it's a little bit naive because what humans tend to do is trust in technology and think that technology is going to solve all of their problems and believe despite any type of disclaimers that a computer is going to identify the right candidate. So I think that technology doing the identification presents risks. 

TG: Are there any efforts to rein in or regulate this technology?


JL: We hoped that Sen. Al Franken was going to move forward with some limitations on how the FBI could collect the data, but it appears he's more concerned about private collection of face recognition data. The other thing is that it's basically impossible to get anything through Congress these days.
