Professor wins grant from privacy commissioner to tackle facial recognition bias in Canada

Dr. Gideon Christian has been awarded a $50,000 grant from the Contributions Program of Canada’s Office of the Privacy Commissioner for his research project, “Mitigating Race, Gender and Privacy Impacts of AI Facial Recognition Technology,” which aims to identify issues in the development and deployment of facial recognition in the country, according to an article from the University of Calgary.
“There is this false notion that technology, unlike humans, is not biased,” says Christian. “That’s not accurate. Technology has been shown (to) have the capacity to replicate human bias.”
Christian’s work will focus on private-sector development and deployment of facial recognition in Canada. He will also examine the racial bias embedded in various forms of the technology and develop a framework to address concerns about its effects.
A number of wrongful arrests based on facial recognition have already occurred in the U.S. Some police departments have also applied the software disproportionately to cases involving Black faces, even as bans in cities like New Orleans have been repealed.
Christian found that in Canada, “Black women, immigrants who have successfully made refugee claims, (have had) their refugee status stripped on the basis that facial recognition technology matched their face to some other person.”
A 2019 NIST evaluation of demographic differentials found that some algorithms produce higher error rates for people with darker skin, while the best-performing algorithms returned nearly even error rates across demographic groups. A follow-up evaluation in 2022 showed significant progress toward indistinguishable performance by the most accurate facial recognition algorithms.
Christian cites an error rate for Black women from Joy Buolamwini’s Gender Shades study, which evaluated facial analysis algorithms, rather than biometric identification algorithms, and was published a year before NIST’s first demographic evaluation.
“Racial bias is not new,” says Christian. “What is new is how these biases are manifesting in artificial intelligence technology… This particular problem with this technology, if unchecked, has the capacity to overturn all the progress we achieved as a result of the civil rights movement.”