NYPD uses vague language to avoid revealing what surveillance and biometric tech it deploys
The 2020 Public Oversight of Surveillance Technology (POST) Act requires the New York Police Department to publish Impact and Use Policies (IUPs) for its surveillance and biometrics activities. The first report on the Act by the Office of the Inspector General for the NYPD (OIG-NYPD) finds that the “NYPD has largely complied with the Act’s requirements,” but that improvements are needed to make clear what the department is actually doing with which technologies.
The force’s current approach could allow it to deploy new technologies such as the Digidog robot without informing City Hall.
The words ‘unclear’ and ‘vague’ appear throughout the report, describing how the force has been working to the letter rather than the spirit of the law. The NYPD’s IUPs were found to be so lacking in detail that the OIG-NYPD cannot conduct full analyses or give the public real transparency into how the technologies are used.
Rights groups have reported that the force has spent $159 million on surveillance tech since 2007 through a dedicated ‘slush fund’ that does not require City Hall approval.
The assessment includes 15 recommendations intended to help the NYPD improve its transparency for the next annual report. One of them sets another six-month countdown clock ticking: the creation of a working group of City Council members and advocacy and community group members with expertise in surveillance, to help improve the IUPs, which the report finds to be mainly a rehash of existing policies.
The POST Act defines surveillance technology as “equipment, software, or systems capable of, used or designed for, collecting, retaining, processing, or sharing audio, video, location, thermal, biometric, or similar information, that is operated by or at the direction of [NYPD].” For each qualifying technology, the force must then publish an IUP covering ten areas, including the technology’s capabilities, the force’s rules of access, whether court authorization is required, data retention, and what access external agencies have.
For example, for the IUP on facial recognition technology, the OIG finds it “largely complies with the Act’s requirements, but provides minimal information about NYPD’s uses of this surveillance technology, its data sharing and retention practices, oversight of the handling of data generated by this technology, and the potential disparate impacts of its applications.”
Because of the way different agencies make requests, and the way originally submitted images are altered before searches are run, the OIG is unable to audit these processes.
The key findings are arranged into three sections, the first being “NYPD Uses Vague, Non-Specific Boilerplate Language Throughout the IUPs.” The report notes that 83 percent of the IUPs use identical language in their “Rules, Processes, and Guidelines Relating to Use” section.
For all categories of surveillance technologies, in the sections on external entities’ access to data, the NYPD has copied and pasted the exact same text summary without ever being specific: “Government agencies at the local, state, and federal level, including law enforcement agencies other than NYPD, have limited access to NYPD computer and case management systems. Such access is granted by NYPD on a case-by-case basis subject to the terms of written agreements between NYPD and the agency receiving access to a specified system.”
The OIG finds this language within the letter of the POST Act, but “it is so broad and general that it fails to convey to the public any specific information about the agencies that can access the relevant data.”
Likewise with third-party ownership of some of NYPD’s surveillance tech, including some facial recognition tools, “it is possible that data generated by certain technologies may be owned, shared, and sold by the third-party owners of the technology, overriding NYPD’s control of data sharing and access.”
The OIG wants every entity to be individually named in the new IUPs.
The next section, “NYPD Has Interpreted the Requirement to Include Information About Potentially Disparate Impacts in a Narrow Manner,” states that only five of the 36 IUPs address potential disparate impacts. This arises from the NYPD choosing to interpret the requirement as referring to the disparate impact of the IUP itself, rather than the impact of using the technology in question.
“NYPD Has Grouped Related Tools Together in a Way That Limits Public Oversight,” the third set of findings, describes how the NYPD grouped similar technologies rather than addressing each one individually. These groupings, such as Data Analysis Tools, Audiovisual Recording Devices, and Situational Awareness Cameras, helped the force meet its first six-month deadline for providing IUPs, the force said.
The OIG finds that this could shield individual technologies from public scrutiny. For example, “there is no individual IUP for Digidog, a robot in the form of a dog with mounted microphones and cameras, which NYPD piloted in live operations on several highly publicized occasions. Digidog was grouped into the IUP for Situational Awareness Cameras.”
Any other similarly mobile surveillance tools could be rolled out under the same IUP, and if the NYPD did not give them the same media blitz, the public, and even the City Council, would have no idea they were in use.
The OIG wants an individual IUP for each technology. The NYPD has 30 days to provide a list of all technologies in use, 90 days to detail what data they collect and which NYPD units maintain that data, and 180 days to create a way to track how it shares surveillance data with external agencies.