Facial recognition legislation delayed in New Orleans, considered in Pennsylvania
A proposed ban on the use of facial recognition by city agencies in New Orleans will not be put to a vote this week, and will next be considered on November 5, according to The Lens.
The delay appears to stem from the need for the mayor’s office to approve updates to the proposal.
In addition to barring city departments from using facial recognition, which local police say they do not use, the ordinance also bans “characteristic recognition” and cell-tower simulators commonly known as “stingrays.” The new version adds predictive policing software to the list.
A letter signed by 34 organizations, including Orleans Public Defenders and Eye on Surveillance, urged the City Council to pass the ordinance, which was originally proposed in July. The letter argues that “Surveillance does not equal safety,” and that racial bias has historically been built into surveillance technologies.
The Lens also reports that several council members reacted skeptically to the restrictions, saying their constituents showed greater support for more surveillance. The ordinance has since been updated, with a ban on automatic license plate readers removed, along with a requirement for explicit permission for the use of any piece of surveillance equipment. The original version also limited the length of any approvals granted and required annual reports on how technologies were used and the demographics of the people they were used on.
Legislation considered at two levels in Pennsylvania
Allegheny County Council is considering a bill which would require it to approve any solicitation, contract or use of facial recognition and other surveillance technologies, and the creation of public policy on how they can be used, according to PublicSource.
The bill states that “The benefits of using face surveillance, which are few and speculative, are greatly outweighed by its harms, which are substantial,” citing potential harms to free speech and demographic differences. The bill was written with the support of the ACLU of Pennsylvania and Carnegie Mellon University faculty members.
Two law enforcement agencies have used facial recognition technology within the last two years, PublicSource writes, but the District Attorney’s office said it does not currently do so, and the Sheriff’s office decided against implementing facial recognition at the County Courthouse. Four employees of the Allegheny DA’s Office had trial accounts with Clearview AI, PublicSource writes in another article.
The legislation includes an exception for major emergencies, and tasks the county manager with writing an annual report on any surveillance technology used.
A potential vote on the bill is expected to be months away.
The office of Pennsylvania Attorney General Josh Shapiro used Clearview’s controversial facial recognition app on a trial basis, but prosecutors did not use it in any cases.
The publication obtained email records which show that a total of ten AG employees received trial logins, and at least three were used, but the service was never contracted by the office.
At least one local force in the state has a paid account with Clearview, and Philadelphia Police ran around 1,000 checks with it during a trial period.
Jeffrey N. Rosenthal and David J. Oberly of Blank Rome’s biometric privacy practice write for Law.com that Pennsylvania’s proposed Consumer Data Privacy Act (CDPA) (H.B. 1049) would directly impact many businesses that collect and use the biometric data of state residents, and open up significant class action liability.
The proposal applies to businesses that make more than $10 million in gross revenue, share the personal information of 50,000 or more people, households, or devices, or derive half or more of their revenue from the sale of personal information. Those companies would be subject to extensive disclosure requirements and would have to comply with consumer rights such as disclosure of use and data deletion. Data security and employee training requirements would also be created.
Data breaches would trigger a private right of action with statutory damages of up to $750 per incident, and other violations would be enforced by the state Attorney General.
The attorneys compare CDPA to California’s CCPA and provide tips on compliance for concerned organizations.
Utah Department sets internal regulations
Though an attempt to legislate restrictions on face biometrics in Utah failed earlier this year, the state’s Department of Public Safety (DPS) has announced a new procedure for responding to outside data requests to increase transparency, GovTech writes.
The new procedure is intended to be more auditable, and DPS staff will also take mandatory training in implicit bias.
The Department’s facial recognition system is used about 40 times per month. The use of driver’s license photos for facial recognition in Utah remains hotly debated.
Portland ban not enough for some
Some in Portland, Maine are not satisfied with the city’s recent passage of a ban on the use of facial recognition by local police and businesses, and are advocating for legislation to prevent facial recognition evidence from being used in court and to strengthen enforcement of the existing ban, local outlet WGME reports.
People First Portland and a city councillor say face biometrics are racist, and have placed a referendum on the November 3 ballot to strengthen the local measures against the technology.
Their proposal also includes a right to sue for violations, with individuals eligible for at least $1,000 in damages, and would allow city employees to be suspended or terminated for violating the restrictions, according to the Tulsa World.
San Francisco cameras cause concern
In San Francisco, the Castro/Upper Market Community Benefit District (Castro CBD) has pushed back a vote on whether to accept a private grant to install a network of security cameras, according to Hoodline.
Tech entrepreneur Chris Larsen had proposed to give the organization $695,000 to install 125 cameras. Larsen has already bankrolled more than a thousand cameras in the city.
One critic of the proposal suggested that although facial recognition is banned from local police use in the city, the cameras capture images detailed enough to support iris recognition.
Meanwhile, activists have sued the city, alleging it illegally used a network of more than 400 cameras to track protesters against police violence following the death of George Floyd, The San Francisco Chronicle reports.