Bias, blind spots and bad identity systems
This is a guest post by Emma Lindley, Co-Founder of Women in Identity and Chief Commercial Officer at Trust Stamp.
Moving towards a future where all people are able to prove their identity in a way that empowers rather than limits them depends on the efforts and diverse understanding of those who shape the systems that allow us to use identity as a tool. To do this, we need to address bias from inside the identity community.
Everyone Is Biased
Bias is a tendency, inclination, or prejudice toward or against something or someone. Biases can be implicit, or they can be acquired as part of our upbringing or life experiences. Whatever the type, all humans are biased in one way or another. One benefit of cognitive bias is that it may help people make quicker decisions; one downside is that those decisions aren’t always accurate.
The academic world has laid the groundwork for understanding how bias affects our ability to gather information and make objective decisions that best support our intended goals.
But what effect does bias have on product development? Many of us have seen the soap dispenser that doesn’t dispense soap for dark skin, or the Google blog post on initially building YouTube for right-handed people only. These examples show how bias shapes product development. But what could happen when developing identity systems?
When bias affects identity systems
Identity technology is fast becoming the standard way to deliver low-friction identity proofing and authentication for a breadth of uses, from bank accounts to healthcare. As these systems become ubiquitous, the potential for unaddressed, ingrained biases to harm the growing base of users who rely on the services these systems guard increases significantly.
How many of us in the industry have had to worry about proving who we are?
An estimated 1 billion people lack basic documents, such as birth certificates, that many of us take for granted. And if we look at the three authentication factors (something you are, something you have, and something you know), each can be implemented in ways that pose barriers to wide groups of people based on access to technology, race and gender, financial and social accessibility, physical ability, and neurodiversity.
The Americans with Disabilities Act protects the civil rights of people with disabilities across public and private sectors with very few exceptions. The most recognized portions protect rights to participate in all aspects of society, and have resulted in material requirements for accessibility like curb ramps and elevators. Thirty years after the landmark legislation opened doors for millions of Americans with disabilities and set the precedent of equal accessibility, the paradigm of barriers to access has shifted.
What happens when the ability to access a bank account no longer depends on the presence of a ramp to enter the bank, but on the ability to recall a complex password or scan a fingerprint? Digital tools have greatly expanded access for some excluded groups, but they do so at the expense of burdening others. When programs that are meant to foster independence require additional assistance to access, the original intent has been overlooked, and we further disenfranchise large portions of already marginalized populations.
Unintended bias in product development can appear in many forms. These examples illustrate the need to avoid heuristic logic in decision making and highlight the need to understand and empathize with intended and unintended users.
How do we tackle bias when developing identity systems?
Efforts are being made to understand the issues of bias in biometrics, such as the recent report from NIST. In addition, new standards are being developed to establish a framework that helps system designers meet varying consumer needs in biometric applications. Key points address the accessibility of certain biometric modality implementations and their safety implications for varying populations.
While the need for universal design in biometrics is critical, this is just one part and it does not solve the comprehensive set of issues across all identity systems.
The industry has excelled at solving problems through technical innovation, but bias is a product of the hardest system to change: human nature. It needs to be addressed with a multifaceted approach. This means growing diverse teams that represent a variety of people and perspectives, facilitating testing and research that brings understanding of and empathy for users into design thinking, calling out bias when it’s seen, and creating organisations that allow open dialogue around the issue of bias and its impact.
The effects of biased systems compound and amplify existing disparities, and although the issue often begins with no ill-intent, unconscious bias that remains unchecked poses an existential threat to equality. Correcting bias in identity systems is just one component of a grander movement, but it is an essential brick in the foundation. Deliberately inclusive, socially responsible innovation that is built on a diverse understanding of needs is critical today.
About the author
Emma Lindley is Co-Founder of Women in Identity and Chief Commercial Officer at Trust Stamp.
DISCLAIMER: Biometric Update’s Industry Insights are submitted content. The views expressed in this post are that of the author, and don’t necessarily reflect the views of Biometric Update.