Maryland law prohibits face-scanning during job interviews without consent

Beginning October 1, it will be illegal for a company to use biometric facial recognition during the process of interviewing job candidates.

Executives might look at that legislation and wonder why it exists. Why would anyone take such a high-stakes legal risk in the first place?

Some vendors have promoted facial biometrics-based affect recognition as a way to reduce bias in hiring, but critics contend that the technology is scientifically flawed and reinforces privilege.

Proponents of artificial intelligence already spend not-insignificant time and resources defending against credible accusations of biased algorithms. Plugging AI into the hiring process, which is always fertile ground for courtroom finger-pointing, would seem to be asking for an unforced error.

Perhaps aimed at the uninitiated, Maryland's act requires would-be employers to get an applicant's consent before creating a facial template. Specifically, House Bill 1202 prohibits companies from creating a facial template during an interview without that consent. Not mentioned are images a firm might record with its surveillance or security-badging cameras.

The legislation, briefly analyzed in The National Law Review, defines a facial template as “the machine-interpretable pattern of facial features that is extracted from one or more images of an individual by a facial recognition service.”

The legal risk of using algorithms found to be biased was discussed (subscription) last month in legal news service Law360. The piece argues that eradicating all bias in AI is unrealistic, but also unnecessary.

Written by U.S. Army Brig. Gen. Patrick Huston and litigator-turned-business consultant Lourdes Fuentes-Slater, the article makes the case that executives have to recognize AI’s “propensity to have illegal or harmful impacts due to negative biases.”

Huston is assistant judge advocate general for military law and operations within the Department of Defense. Fuentes-Slater is founder and CEO of consultancy Karta Legal LLC.

Enacting a reasonable program to mitigate the biases will go a long way in protecting early adopters, the pair wrote.
