Advocates say people think of biometric identification all wrong
A digital rights organization says the popular view of biometric identification and surveillance is backward, and that the result is the commodification of people for government control and business profit.
Instead of viewing the tools as mere collection systems, executives with Access Now see the algorithms as programmed categorizers: they impose a desired order on what is, by definition, a spectrum of qualities, behaviors, reactions and identities.
Biometric systems “are used to create baselines of what constitute ‘normal’ behaviors and bodies,” according to the report. This “further reinforces unequal treatment of people whose bodies and behaviors do not adhere” to what those wielding the software consider normal.
And Access Now leaders say the harms of private-sector biometrics can multiply when military and intelligence officials take notice of an algorithm.
They cite an article in the investigative news publication ProPublica about an Ohio police officer turned entrepreneur who pushed software that, it is claimed, could detect whether someone calling 911 about a violent crime was lying because they had, in fact, committed it.
The accuracy of the software is questionable, but, according to ProPublica’s reporting, the FBI became interested.
As reports of success spread, increasingly dubious biometric product ideas are proliferating, too. Access Now cites the example of researchers who claimed criminality can be determined by analyzing facial features.
That, the group says, is directly related to the racist practice of phrenology. In the 1800s, a German physician pushed the idea that the size and shape of a person’s skull revealed intelligence and criminality.