Competing safety and rights concerns motivate facial biometrics deployments and pushback
Officials with New York’s Metropolitan Transportation Authority are considering their options for deploying security cameras, possibly with facial recognition capabilities, throughout the transit system to deter and catch criminals. Criticism of public deployments of facial biometric technology, however, is coming from several quarters.
CBS New York reports that MTA Finance Chair Larry Schwartz will demand that the new capital plan for the transit system include tens of millions of dollars for system-wide camera surveillance. Schwartz describes a series of frightening incidents occurring recently in the MTA system, often involving mentally unstable individuals, and says he wants the public and MTA employees to feel safe.
Schwartz emphasizes the deterrence provided by cameras, though the mental state of the perpetrators in the incidents CBS describes casts doubt on whether deterrence would influence their actions.
“What’s important is to come up with a smart strategic plan … to figure out where to place these cameras, the type of cameras, including whether or not these cameras should have facial recognition,” Schwartz told CBS.
The New England Patriots have been using Evolv Technology’s Evolv Edge screening gates with facial recognition and millimeter wave technology to screen people entering Gillette Stadium since 2017, according to a post to the MassPrivatel blog.
MassPrivatel also reports the same technology has been in use at TD Garden since the same year. A case study by Evolv suggests the fan screening system has improved security and customer experiences, but the blog post notes that since private companies are exempt from freedom of information access requests, people may not even know they are on a blacklist until they have been denied entry for violating the venue’s Code of Conduct.
Officials call for change
U.S. Congressional Representative Don Beyer (D-VA) meanwhile has proposed a pair of amendments to legislation before the House: one to bar federal funds from being used by state and local police to purchase facial recognition technology, and another requiring the National Science Foundation (NSF) to report to Congress on the social impacts of research it funds into AI. The amendments were offered for a House appropriations bill intended to guide federal policies for artificial intelligence and facial recognition.
“It is important that Congress recognize not only the exciting potential of technologies associated with artificial intelligence, but also the significant risks and responsibilities which come with them. For instance, facial recognition systems already being adopted by big city police departments and used to swear out warrants have shown significant levels of inaccuracy and bias. Artificial intelligence and predictive algorithms more generally are increasingly influencing hiring decisions, credit and loan determinations, and even criminal sentencing, even as such systems remain woefully susceptible to longstanding biases.
“There is a growing body of data that suggests the technology being deployed today is not ready for such widespread operation. We need more research and better standards of use before we entrust such important aspects of our society entirely to as-yet flawed automated processes, and my amendments would do just that – one placing a year-long moratorium on federal funding for state and local law enforcement purchase of facial recognition technology while Congress works to set parameters, and the other encouraging social science research in this space in the meantime.”
Beyer says the proposals address concerns about due process, privacy rights, and freedoms of speech and assembly that Congress is required by the Constitution to protect.
The United Nations’ Special Rapporteur on freedom of opinion and expression David Kaye has called for an immediate moratorium on the dissemination of surveillance technology in a report to the Human Rights Council in Geneva.
“Surveillance tools can interfere with human rights, from the right to privacy and freedom of expression to rights of association and assembly, religious belief, non-discrimination, and public participation,” the Special Rapporteur said in a statement. “And yet they are not subject to any effective global or national control.”
Kaye says sophisticated surveillance tools, including facial recognition systems, have been used to track journalists, politicians, UN investigators, and human rights advocates, and have been linked to torture and possibly extrajudicial killings. While states bear primary responsibility, according to Kaye, the lack of global and national controls has created conditions for abuses.
“The private surveillance industry is a free for all,” Kaye argues, “an environment in which States and industry are collaborating in the spread of technology that is causing immediate and regular harm to individuals and organisations that are essential to democratic life – journalists, activists, opposition figures, lawyers, and others. It is time for governments and companies to recognise their responsibilities and impose rigorous requirements on this industry, with the goal of protecting human rights for all.”
Kaye recommends states adopt domestic safeguards against unlawful surveillance, based on international human rights law, that they develop publicly owned mechanisms for approval and oversight, and that they strengthen export controls and legal redress assurances.