US government committee gathers evidence on biometric privacy-enhancing technologies
Biometric privacy-enhancing technologies (BPETs) could play a major role in protecting the sensitive information of Americans, experts told a House subcommittee, but more research and investment in those technologies are needed, along with carefully crafted policies to accompany them.
The Subcommittee on Investigations and Oversight, which functions under the U.S. House of Representatives’ Committee on Science, Space, & Technology, hosted the hearing on ‘Privacy in the Age of Biometrics.’
Deployed at the point of data capture, biometric privacy-enhancing technologies can ensure that features unneeded for identification are obfuscated or otherwise not collected, Chairman Bill Foster (D-IL) said. Obfuscation of biometric data so that it cannot be stolen and used elsewhere, as well as templates protected by encryption, are techniques mentioned in Foster’s introduction.
Foster wants the databases of driver’s license photos held by states to be leveraged as biometric reference data for an American national digital ID system he has co-sponsored legislation for.
Ranking Committee Member Jay Obernolte (R-CA) spoke about the need to maintain the benefits of biometrics use with regulation, without taking on additional privacy or security risk. He also spoke about a number of “misguided” legislative proposals he heard while serving in California’s legislature that would have banned facial recognition outright.
Expert witnesses on BPETs
Government Accountability Office Director of Science, Technology Assessment, and Analytics Candice Wright reviewed GAO’s recent work uncovering how facial recognition is used by federal agencies. This research yielded a set of recommendations to improve transparency and the vetting of third-party partners to ensure biometric data is properly protected.
National Institute of Standards and Technology Information Technology Laboratory Director Dr. Charles H. Romine presented NIST’s approach to supporting privacy in IT systems, including biometrics. That includes NIST’s Privacy Framework.
Michigan State University Department of Computer Science and Engineering Professor and NSF Center for Identification Technology Research Site Director Dr. Arun Ross noted the potential value of homomorphic encryption to protect biometric data. Ross also explained cancelable biometrics, along with “perturbing” facial images so they remain usable for identification while the ability to extract information about characteristics like age, sex, race or health is obscured. He suggested face images could be made more difficult to scrape from public websites and social media profiles, and that cameras could be deployed that capture data uninterpretable by people, and useful only within a given application.
Academic researchers are generally more aware of privacy concerns than before, and are building such considerations into their technology and evaluations.
“This shift in the research culture is remarkable, and bodes well for the future of the technology,” his address concluded.
During the question and answer phase of the hearing, Romine said that enormous strides have been recently made to bring homomorphic encryption into the practical realm, though further work remains on that front.
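For readers unfamiliar with the technique, homomorphic encryption allows computation on data while it stays encrypted, so a server could compare biometric templates without ever seeing them in the clear. A toy, deliberately insecure sketch of an additively homomorphic (Paillier-style) scheme illustrates the core property; the tiny primes and variable names are for demonstration only and were not discussed at the hearing:

```python
# Toy additively homomorphic (Paillier-style) encryption.
# Demo-only: real deployments use ~2048-bit primes and vetted libraries.
import random
from math import gcd

p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1)          # private key
mu = pow(lam, -1, n)             # modular inverse, Python 3.8+

def enc(m: int) -> int:
    """Encrypt m with fresh randomness r (coprime to n)."""
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c: int) -> int:
    """Decrypt using the private key (lam, mu)."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts,
# which is the building block for computing template distances under
# encryption.
a, b = enc(42), enc(17)
print(dec((a * b) % n2))  # → 59
```

In practice this is what lets a matching service add up encrypted per-feature differences between a probe and an enrolled template without decrypting either one.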
Ross further explained the use of mathematical functions to transform fingerprint templates in ways that make them cancelable.
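One common family of such transforms is a keyed random projection: the stored template is a non-invertible function of the original features, and revoking it just means issuing a new key. A minimal sketch, with illustrative names and parameters that are not drawn from Ross’s testimony:

```python
import hashlib
import numpy as np

def cancelable_template(features: np.ndarray, user_key: bytes) -> np.ndarray:
    """Transform a biometric feature vector with a keyed random projection.

    The projection matrix is derived deterministically from a per-user key,
    so matching happens in the transformed domain and a compromised template
    can be "canceled" by re-enrolling with a new key.
    """
    # Derive a deterministic seed from the user-specific key (illustrative).
    seed = int.from_bytes(hashlib.sha256(user_key).digest()[:4], "big")
    rng = np.random.default_rng(seed)
    # Projecting to fewer dimensions than the input makes the mapping
    # many-to-one, hence non-invertible.
    proj = rng.standard_normal((len(features) // 2, len(features)))
    return proj @ features

feats = np.arange(8, dtype=float)
t1 = cancelable_template(feats, b"key-v1")   # same key -> same template
t2 = cancelable_template(feats, b"key-v1")
t3 = cancelable_template(feats, b"key-v2")   # new key -> unlinkable template
```

The design choice is that security rests on the key, not on keeping the transform secret: if a database leaks, the leaked templates are revoked by rotating keys rather than by asking users for new fingers.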
Obernolte noted that privacy is not necessarily binary, in terms of presence or absence, and discussed the challenge of scope creep with Romine.
Procurement practices, the importance of the context of use to privacy considerations, the privacy rules government agencies are required to adhere to, and the consistency with which privacy impact assessments are applied were also discussed.
More research is needed into differential privacy, Ross said, but the technology could be used to enable biometric matching of identities within a given system using data that cannot be used elsewhere.
Article Topics
biometric data | biometrics | data protection | digital ID | GAO (Government Accountability Office) | homomorphic encryption | legislation | privacy | regulation | research and development | U.S. Government