Canada border agency investigating possible bias of biometric airport kiosks
Automated biometric primary inspection kiosks (PIKs) at Canadian airports may have higher error rates when processing people from certain ethnic backgrounds, according to internal Canada Border Services Agency (CBSA) communications reported by the CBC. CBC News obtained a government report on “Facial Matching at Primary Inspection Kiosks,” along with emails from a CBSA evaluation team discussing false match rates at the kiosks.
The documents were heavily redacted before being released under Access to Information laws, and a representative of the agency declined to discuss the report, citing national security interests.
Travelers to Canada from the Middle East, Africa, and the Caribbean are selected for secondary inspection processes at far higher rates than those from the U.S. or Western Europe, CBSA research indicates.
In emails between evaluation team members, one individual refers to reports of differing accuracy for facial recognition systems when matching people with dark skin, acknowledging that research indicates it is an issue within the industry, despite an initial impression that “maybe it was just the press making a fuss.” Another team member then shares a link to MIT research showing racial bias in leading facial recognition technologies.
Of roughly 2 million secondary customs inspections studied by the CBSA, only 140,000 were ordered at the discretion of human agents; the rest were automatic referrals, most of them from the kiosks, and human officers overrule the machines on about six of every ten referrals. Automated points of contact also made 88 percent of immigration inspection referrals.
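A rough back-of-the-envelope reading of those figures, taking the article's rounded numbers at face value and assuming the six-in-ten overrule rate applies across all automatic referrals, looks like this:

# Back-of-the-envelope breakdown of the figures above; the inputs are the
# article's rounded numbers, not official CBSA statistics.
total_secondary = 2_000_000   # secondary customs inspections studied
discretionary = 140_000       # ordered at the discretion of human agents
automatic = total_secondary - discretionary

overrule_rate = 0.6           # officers overrule ~6 of every 10 machine referrals

print(f"Automatic referrals: {automatic:,} ({automatic / total_secondary:.0%} of the total)")
print(f"Referrals overruled by officers: ~{automatic * overrule_rate:,.0f}")

On those assumptions, roughly 1.86 million referrals (about 93 percent) were automatic, of which officers would have overruled on the order of 1.1 million.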
Despite the misgivings, more biometric primary inspection kiosks were recently launched at several Canadian airports. The deployment of more than 16,000 biometric eGates and kiosks is expected to generate $1.3 billion in revenue over the next five years, according to Acuity Market Research.
Article Topics
accuracy | airports | algorithms | APC kiosk | biometric-bias | biometrics | Canada | facial recognition
More information is needed before anyone can draw viable conclusions about potential bias in face recognition systems from these statistics. There are many reasons apart from biometric mismatch or failure to acquire for which a passenger may be referred for secondary processing. It may be for biosecurity reasons related to the country of departure, or it may be that more controlled substances are found entering the country with passengers travelling via that region, so more of them are subjected to secondary processing. Not all e-passports are manufactured to the same quality, and some countries' e-passport chips may fail more often than others.
Passengers may choose the automated PIK channels over manual counters thinking they offer less opportunity for scrutiny by border security personnel, only to be referred based on country of origin, country of transit, or some other immigration rule. These passengers would have been referred at either the ABC gates or the manual counters regardless, because of the business rules, not the biometrics. NIST is producing a report on any potential bias in AI-based face recognition algorithms, and it would be prudent to wait for that report before drawing conclusions.
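To make that point concrete, here is a minimal, purely hypothetical sketch of how a kiosk referral decision might combine business rules with a biometric match score. Every rule name and threshold below is invented for illustration and does not describe the CBSA's actual logic; the point is that a traveller caught by a route-based rule is referred whether or not the face match succeeds, so aggregate referral counts cannot by themselves isolate biometric bias.

from typing import Optional

# Hypothetical illustration only: the rule set and threshold are invented
# and do not reflect CBSA's actual referral logic.
RULE_FLAGGED_ROUTES = {"COUNTRY_A", "COUNTRY_B"}  # placeholder business rule
MATCH_THRESHOLD = 0.80                            # placeholder similarity threshold

def refer_to_secondary(origin: str, transit: str, match_score: float) -> Optional[str]:
    """Return the reason a traveller is referred, or None if not referred."""
    # Business rules fire regardless of the biometric result.
    if origin in RULE_FLAGGED_ROUTES or transit in RULE_FLAGGED_ROUTES:
        return "business rule (route-based)"
    # Only travellers not caught by a rule are referred on biometrics alone.
    if match_score < MATCH_THRESHOLD:
        return "biometric mismatch / failure to acquire"
    return None

# A strong face match does not prevent a rule-based referral, so referral
# rates broken down by region mix rule-based and biometric causes.
print(refer_to_secondary("COUNTRY_A", "HUB", match_score=0.95))  # business rule
print(refer_to_secondary("COUNTRY_C", "HUB", match_score=0.55))  # biometric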
What is really needed are border checkpoint figures for biometric referrals alone, and in some cases these show that older, white-haired or bald males, not passengers with dark-coloured skin, may be the group with the highest failure rate. It may also depend entirely on the lighting and environmental conditions of the e-gates at a particular border checkpoint.