NY school district allegedly misled about face biometrics system’s racial bias

Governor yet to sign off on moratorium passed five months ago
A facial recognition system purchased by New York’s Lockport School District has greater disparities in performance between demographics than claimed by its supplier, and its weapons detection feature has returned an unknown number of false alerts, Vice Motherboard reports.

Black students are misidentified much more frequently than supplier SN Technologies suggested to district officials, according to documents obtained by Motherboard.

The AEGIS system also mistakes broom handles for guns, among other performance issues. Given that the system automatically alerts police if it detects a weapon in the school, one local parent told the publication that the chances of a tragic misunderstanding far outstrip the chances that the goal of preventing a violent incident in the school would be achieved.

Lockport School District turned on the AEGIS system at the beginning of this year, after the New York State Education Department (NYSED) reversed its earlier decision to block the launch, pending a set of policy changes.

The State legislature passed a two-year moratorium on new implementations of the technology in schools in response.

The emails obtained by Motherboard show SN Technologies’ CEO KC Flynn telling district officials in August of 2019 that the id3 Technologies algorithm AEGIS uses was ranked 49th out of 139 tested by NIST for racial bias.

“Those numbers don’t tally with our numbers. They’re not even close. What id3 sent to NIST is not what these people are talking about,” NIST Biometric Testing Lead Patrick Grother told Motherboard.

Grother also called Flynn’s explanation of how biometric accuracy is calculated “nonsensical,” and says that he warned a lawyer for the school district of the inconsistencies when he was approached in 2019.

Accountancy firm Freed Maxick was hired by the district to audit SN Technologies’ claims, and reported back that the id3 algorithm misidentifies Black men four times more often, and Black women 16 times more often, than white men, though SN Technologies had claimed those misidentifications were only two and 10 times more common, respectively. The system has also produced a series of false positive weapons detections, prompting SN Technologies to build a new database of broom handle images and update the software.

The district and police department declined to provide Motherboard with information about possible incidents prevented, or false alarms, attributed to the system.

The NYCLU is supporting parents in a lawsuit against the facial recognition and weapons detection deployment, and also says other districts are attempting to source similar technology without the appropriate safeguards in place.

Meanwhile, the Tenth Amendment Center reports that New York Governor Andrew Cuomo is still yet to sign the bill placing a moratorium on the use of facial recognition in schools, despite its passage by the state legislature in July.

The bill passed the state Assembly by a vote of 118-21 and the Senate by a vote of 46-14, and now a coalition of more than 40 organizations, including the Tenth Amendment Center, is pushing Cuomo to sign it. The coalition, which also includes the NYCLU, sent Cuomo a letter urging him to act in August.
