NYPD sued for refusing to disclose face recognition technology docs
The Center on Privacy & Technology (CPT), a university think tank at Georgetown Law, announced Tuesday that it filed a Freedom of Information lawsuit against the New York Police Department (NYPD) after the department refused to disclose documents relating to its long-term use of facial recognition technology.
Facial recognition technology uses algorithms to analyze images of human faces and match them against a database of photos, such as driver's license photos, passport photos, police records, and even public photos posted to social media or dating sites.
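At its core, this kind of matching reduces each face image to a numeric feature vector and searches a database for the closest one. The sketch below is purely illustrative, not a description of the NYPD's actual system: the tiny vectors, photo labels, and distance threshold are all made-up assumptions.

```python
import math

# Hypothetical face "embeddings": real systems reduce each face image to a
# numeric feature vector; these tiny three-number vectors are invented for
# illustration only.
database = {
    "license_photo_001": [0.91, 0.10, 0.33],
    "passport_photo_774": [0.12, 0.88, 0.45],
    "mugshot_5120": [0.90, 0.12, 0.30],
}

def euclidean(a, b):
    """Distance between two feature vectors; smaller means more similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, db, threshold=0.2):
    """Return the closest database entry, or None if nothing is close enough."""
    name, dist = min(((k, euclidean(probe, v)) for k, v in db.items()),
                     key=lambda kv: kv[1])
    return name if dist <= threshold else None

probe = [0.89, 0.11, 0.31]  # feature vector extracted from a query image
print(best_match(probe, database))  # closest entry: "mugshot_5120"
```

The `threshold` parameter is what separates a reported "possible match" from no match at all; setting it too loosely is one way such a system can point at the wrong person.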
In January, the Center on Privacy & Technology (CPT) filed a Freedom of Information Act (FOIA) request with the NYPD for records relating to its facial identification unit. In response, the NYPD sent the CPT a single memo on procedures relating to the technology. The department claimed no other records could be found.
“The department’s claim that it cannot find any records about its use of the technology is deeply troubling,” said David Vladeck, the CPT’s faculty director. “The NYPD has been using face recognition for over five years. New Yorkers have a right to know how it’s using face recognition technology.”
In March, a former NYPD official who helped establish the facial identification unit told the New York Daily News that the department had conducted “more than 8,500 facial recognition investigations, with over 3,000 possible matches, and approximately 2,000 arrests” since the program started in 2011.
In October, New York Governor Andrew Cuomo (D) announced the city would begin installing advanced cameras and sensors with facial recognition software into the design of its bridges, tunnels, airports and other transit hubs to “ultimately develop one system-wide plan.”
There are currently no state or federal laws that control the NYPD's use of facial recognition technology. The documents the CPT requested, including policies, manuals, user guides, training materials, contractual obligations, audits, and agreements, would therefore be the only source of oversight on how the department uses the technology.
“If no records exist, that means that there are no controls on the use of face recognition technology and we ought to worry about that,” the CPT’s Vladeck said.
The information request from the CPT was part of a year-long study on how law enforcement agencies use facial-recognition technology, entitled “The Perpetual Lineup.”
The October study found that more than 117 million American adults are enrolled in a criminal facial recognition network and one-fourth of all law enforcement agencies in 26 states have access to this database. However, the study found that “few agencies have instituted meaningful protections to prevent the misuse of the technology. In many more cases, it is out of control.”
In March, the US Government Accountability Office released a study of face recognition technology, which found the FBI “had not fully adhered to privacy laws and policies and had not taken sufficient action to help ensure accuracy of its face recognition technology.”
“Face recognition is too powerful, and its price on privacy and civil liberties too high, to not be controlled by robust policies and training guides. If these records do in fact exist, it is against both New York law and the interests of the public to keep them secret,” said Clare Garvie, the associate at the Center on Privacy & Technology who filed the original document request.
The CPT study also found the FBI had not conducted enough tests to assess the accuracy of the technology. While the FBI claimed its technology could return a match at least 85 percent of the time, the GAO reported that the bureau only tested the technology with a candidate list of 50 potential matches. In those tests, the FBI did not report the false positive rate: how often the technology matched a person with the wrong photo in a database.
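A false positive rate is a simple ratio, and its absence matters because it is the figure that tells you how often innocent people get flagged. The numbers below are entirely hypothetical; per the GAO, the FBI did not report this statistic for its own system.

```python
def false_positive_rate(false_positives, non_matching_searches):
    """Share of searches for people NOT in the database that still
    returned a supposed match."""
    return false_positives / non_matching_searches

# Hypothetical example: if 30 out of 1,000 searches for people who were
# not actually in the database returned a "match" anyway:
rate = false_positive_rate(30, 1000)
print(f"{rate:.1%}")  # prints "3.0%"
```

Even a low-sounding rate translates into many wrongly flagged people when a system runs thousands of searches, which is why the report treats the missing statistic as significant.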
The study from CPT found the technology is less accurate than fingerprinting, and less accurate when used to identify African Americans, meaning the technology could make a mistake and an innocent person could be investigated or even charged with a crime they did not commit.
In 2015, Sergeant Edward Coello of the NYPD facial identification unit told WNBC the unit had identified 1,700 suspects and made nine arrests using the technology. But he also admitted that it had “misidentified” five people.
“Innocent people don't belong in criminal databases,” Alvaro Bedoya, the executive director of the CPT and co-author of the study, said, according to Ars Technica. “By using face recognition to scan the faces on 26 states' driver's license and ID photos, police and the FBI have basically enrolled half of all adults in a massive virtual line-up.”