Democracy Gone Astray

Democracy, being a human construct, needs to be thought of as a directionality rather than an object. As such, understanding it requires not so much a description of existing structures and/or other related phenomena as a declaration of intentionality.
This blog aims to create labeled lists of published infringements of that intentionality, of points in time where democracy strays from its intended directionality. In addition to outright infringements, this blog also collects important contemporary information and/or discussions that impact our socio-political landscape.

All the posts here were originally published in the electronic media, mainstream as well as fringe, and maintain links to the original texts.

[NOTE: Due to changes in the blogging software that I didn't catch in time, all of the 'Original Article' links were nullified between September 11, 2012 and December 11, 2012. My apologies.]

Wednesday, June 29, 2016

The FBI Wants to Exempt Massive Biometric Database From the Privacy Act

A broad coalition of 45 signatories, including civil liberties, racial justice, human rights, and privacy organizations, published a letter Tuesday strongly condemning a proposal by the FBI to exempt its massive biometric database from certain provisions of the Privacy Act. Known as the Next Generation Identification system, or NGI, the FBI database houses the world’s largest collection of fingerprints, DNA profiles, palm prints, face images, and other biometric identifiers. The letter, signed by groups such as La Raza, Color of Change, Amnesty International, National LGBTQ Task Force, as well as the companies Uber and Lyft, criticized the agency’s May 5 proposal on the grounds that the “system uses some of the most advanced surveillance technologies known to humankind, including facial recognition, iris scans, and fingerprint recognition.”

Specifically, the FBI’s proposal would exempt the database from the provisions in the Privacy Act that require federal agencies to share with individuals the information they collect about them and that give people the legal right to determine the accuracy and fairness of how their personal information is collected and used. The exemption could render millions of records unavailable to subjects. As of December 2015, the NGI system contained 70,783,318 criminal records and 38,514,954 civil records.

As the coalition notes with alarm, the database stores millions of unique identifiers for U.S. citizens who have not been convicted of a crime alongside those who have. Fingerprints taken for an employer’s background check, for instance, can be stored and searched in the NGI system along with those taken for criminal investigations.

“What troubles me about the NGI is that the previous systems used to be focused on criminal identification and so had somewhat of a limited impact,” said Mike German, a fellow with the Brennan Center for Justice’s Liberty and National Security Program and a former special agent with the FBI. “Obviously we have huge problems with criminalization and arrest without probable cause that would make those databases overpopulated,” German added. “But the fact that they have expanded them to non-criminal information and a multitude of other purposes increases likelihood of unintended harms.”

A “system of records notice” published by the FBI at the same time as its proposed exemption notice explains that the NGI system collects data from individuals in a range of settings — including state departments of motor vehicles, volunteer and welfare screenings, and visa applications — and stores their records until they turn 110 years old. The NGI was launched in 2008 to update, centralize, and expand the bureau’s biometric collection systems.

Several civil liberties advocates, all of whom signed the letter, told The Intercept that allowing the FBI to evaluate privacy at its “sole discretion,” as the notice suggests, shields the NGI database from oversight, accountability, and transparency. Jeramie Scott, a national security attorney who helped litigate the Electronic Privacy Information Center’s lawsuit against the FBI for documents pertaining to its NGI system in 2013, said the exemption “makes it harder for people to understand what the FBI is using this data for, to access this data to make sure it’s correct, and to have some type of civil remedy if, because of the FBI’s NGI database, they are somehow harmed by the FBI’s use of that data.”

Allowing subjects to view their own records under the Privacy Act ensures that those records can be corrected for accuracy, explained ACLU senior policy analyst Jay Stanley. “It doesn’t seem too much to ask that the FBI make records subject to people with timeliness and fairness to the individual,” Stanley said. With the exemption, “they are maximizing their ability to act without oversight but they risk leaving victims of inaccuracies out in the cold with no remedy.”

Compounding the problem of inaccurate data is the fact that the biometric technologies that generate the NGI system’s data have themselves been shown to be inaccurate. Take face recognition technology, which the NGI incorporated into its system in 2014. While the technology’s error rate has significantly decreased from the early 1990s, documents obtained by EPIC revealed that the FBI was willing to accept a 20 percent error rate for its face recognition technology as of 2010. “The FBI is looking at a lot of different people who shouldn’t be the subject of an inquiry and making a decision about whether they should be the subject of an inquiry based on face recognition searches,” Scott explained.

A system as capacious as NGI can actually raise the odds of a false positive match. Studies have found that the error rate for face recognition systems increases logarithmically as the size of the dataset grows. Those odds are even worse for people of color, who are both overrepresented in the FBI’s criminal datasets and potentially subject to technological bias. Since African-Americans are incarcerated at almost six times the rate of white people, they likely constitute the majority of faces represented in the FBI’s collection of millions of criminal mugshots and photographs of incarcerated people. “Error rates are highest for people in groups who are overrepresented in that system,” said Alvaro Bedoya, director of the Center on Privacy & Technology at Georgetown. Subjecting people of color to disproportionate searches, in other words, increases the risk of false positives. “It’s so flawed the way they manage it that errors are guaranteed,” German said. “We’re talking about deprivation of liberty as a result.”
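To make the scale problem concrete, here is a minimal back-of-the-envelope sketch in Python. It is purely illustrative: the per-comparison false match rate is a hypothetical placeholder, and this is not a model of the FBI's actual matching pipeline or its reported error figures. The point it shows is that if each one-to-one comparison carries even a tiny chance of a false match, the probability that a search returns at least one false match climbs steeply as the gallery grows toward tens of millions of records.

```python
# Illustrative only: how the chance of at least one false match grows with
# gallery size, assuming independent one-to-one comparisons at a fixed,
# hypothetical per-comparison false match rate. Not the FBI's real figures.

def chance_of_false_match(per_comparison_rate: float, gallery_size: int) -> float:
    """P(at least one false match) = 1 - (1 - p)^N for N independent comparisons."""
    return 1.0 - (1.0 - per_comparison_rate) ** gallery_size

if __name__ == "__main__":
    p = 1e-6  # hypothetical false match rate per single comparison
    for n in (10_000, 1_000_000, 30_000_000, 70_000_000):
        print(f"{n:>11,} records searched -> "
              f"{chance_of_false_match(p, n):.1%} chance of at least one false match")
```

Under these assumed numbers, a search against ten thousand records almost never produces a spurious hit, while a search against a gallery the size of NGI's criminal file makes at least one false match a near certainty, which is why database size itself is part of the civil liberties concern.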
Collect It All

The FBI’s proposal states that even if individuals are not subject to current law enforcement activities, their data may have future uses, such as “establishing patterns of activity and providing criminal leads.” “It is impossible to determine in advance what information is accurate, relevant, timely and complete,” the proposal continues. “With time, seemingly irrelevant or untimely information may acquire new significance when new details are brought to light.”

The FBI’s claims about “untimely” or irrelevant data reflect the agency’s larger attitude toward digital data collection over the last decade, according to German. The attitude, he said, is that “all data is valuable and there is some future magical algorithm that will make sense of it so it must be retained. Even when they have erroneous information, they are very reluctant to destroy information, so you have a number of problems they have with how they handle data.”

Agency documents further reveal that the FBI has long been aware of the civil liberties concerns surrounding its biometrics collection. Since at least 2006, the agency has issued its own calls for a “concerted dialogue” to address public anxieties and privacy laws. In 2011, a series of forums co-sponsored by the FBI’s Biometric Center of Excellence sought to address the “lack” of laws, guidelines, and policies governing facial recognition technology. In dozens of talks, federal law enforcement officials, legal experts, and technologists discussed in depth many of the same concerns raised by the coalition’s letter, including the need for transparent policies for deploying facial recognition technology, privacy concerns related to the collection of such data by law enforcement, and the storage of personally identifiable information under the Privacy Act.

The coalition’s letter calls for the Department of Justice to extend its 30-day window for comments on the proposed notice. The DOJ confirmed in an email to The Intercept that it has extended the period for public comment through July 6.

“When you build the world’s largest database, special responsibilities come with that and one of those responsibilities is to be as transparent as possible about the system,” Bedoya said. “The FBI has gone about this the opposite way by failing to file mandatory notices for years. What they are finally proposing is even less transparency.”

Original Article
Source: theintercept.com/
Author: Ava Kofman
