Maryland law prohibits face-scanning during job interviews without consent


biometric facial recognition

Beginning October 1, it will be illegal for a company in Maryland to use biometric facial recognition when interviewing job candidates.

Executives might look at that legislation and wonder why it is needed. Why would any company take such a high-stakes legal risk in the first place?

Some vendors promote facial biometrics-based affect recognition as a way to sidestep bias in hiring, but critics contend that the technology is scientifically flawed and entrenches privilege.

Proponents of artificial intelligence already spend significant time and resources defending against credible accusations of algorithmic bias. Plugging AI into the hiring process, perennially fertile ground for courtroom finger-pointing, would seem to invite an unforced error.

Perhaps with the uninitiated in mind, Maryland’s act requires would-be employers to get an applicant’s consent before even capturing their image. Specifically, House Bill 1202 says companies cannot create a facial template during an interview. It does not mention images a firm might record with its surveillance or security-badging cameras.

The legislation, briefly analyzed in The National Law Review, defines a facial template as “the machine-interpretable pattern of facial features that is extracted from one or more images of an individual by a facial recognition service.”

The legal risk of deploying algorithms later found to be biased was discussed (subscription required) last month in the legal news service Law360. The piece argues that eradicating all bias in AI is unrealistic, but also unnecessary.

Written by U.S. Army Brig. Gen. Patrick Huston and litigator-turned-business consultant Lourdes Fuentes-Slater, the article makes the case that executives have to recognize AI’s “propensity to have illegal or harmful impacts due to negative biases.”

Huston is the assistant judge advocate general for military law and operations within the Department of Defense. Fuentes-Slater is founder and CEO of consultancy Karta Legal LLC.

Enacting a reasonable program to mitigate the biases will go a long way in protecting early adopters, the pair wrote.
