Buolamwini responds to Amazon as police admit facial recognition recommendations not followed
MIT Media Lab researcher Joy Buolamwini has responded to criticism of her study on algorithmic bias in services offered by Amazon and other companies, while comments from police using Amazon Rekognition, along with remarks from Microsoft President Brad Smith, are likely to further inflame the simmering public debate about the use of facial recognition by law enforcement.
“We do not set nor do we utilize a confidence threshold,” a Washington County Sheriff’s Office (WCSO) Public Information Officer told Gizmodo, despite Amazon’s assertion that law enforcement clients should use a confidence threshold of 99 percent.
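For context, the threshold in question is not a separate product setting but a parameter on the API call itself. Below is a minimal boto3 sketch of a Rekognition face search of the kind an agency might run against a photo collection; the collection ID and image path are hypothetical placeholders. If FaceMatchThreshold is omitted, the service falls back to its documented default (80 percent, per AWS documentation) rather than the 99 percent Amazon recommends for law enforcement.

```python
import boto3

# Sketch of a Rekognition face search; the collection ID and image
# path below are hypothetical placeholders, not WCSO's actual setup.
client = boto3.client("rekognition")

with open("probe_photo.jpg", "rb") as f:
    probe_bytes = f.read()

# Amazon's guidance for law enforcement: set the threshold explicitly.
response = client.search_faces_by_image(
    CollectionId="booking-photos",   # hypothetical collection
    Image={"Bytes": probe_bytes},
    FaceMatchThreshold=99.0,         # the recommended 99 percent
    MaxFaces=5,
)

# If FaceMatchThreshold were omitted, the service would apply its own
# default (80 percent per AWS documentation), which is the situation
# the WCSO comment describes: no threshold set by the client.
for match in response["FaceMatches"]:
    print(match["Face"]["FaceId"], match["Similarity"])
```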
Buolamwini directly addresses several points made by AWS General Manager of Artificial Intelligence Dr. Matt Wood, as well as other criticisms, in a lengthy Medium post.
The first point of the rebuttal is a reiteration of the potential for abuse that even the most accurate facial recognition technology carries, including use cases that enable mass surveillance or discrimination against certain groups, Buolamwini argues. To Wood’s assertion that the study results are misleading because they are based on facial analysis rather than facial recognition technology, she contends that all systems analyzing faces need to be assessed for harmful bias, and that the two technologies are closely related.
“The failure to even detect faces of color in the first place has been a major problem for studies around facial analysis technology, because often these studies are based on results on faces that were detected,” she writes.
The lack of information about the dataset Amazon used to test its algorithm, and in particular about its demographic breakdown, prevents any assessment of the system’s bias or lack thereof, according to Buolamwini. She also notes that while confidence scores can be helpful, they can simply indicate confidence in a falsehood, and that NIST research shows those who use facial analysis often rely on the default settings. To Amazon’s point that the study did not test its newest algorithm, the response counters that older versions often remain in use, and Buolamwini asks what the adoption rate for the new version actually is. To criticisms that attribute or gender classification is not relevant to law enforcement, she points out that it can be used to narrow search fields, and that NIST itself provides guidance on how the technology can be used by police.
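To make the analysis-versus-recognition distinction concrete, here is a minimal boto3 sketch of a facial analysis call, with the image path as a hypothetical placeholder. The gender label it returns carries a confidence score that reflects only the model’s certainty in its own prediction; a misclassified face can still receive a high score, which is the “confidence in a falsehood” problem Buolamwini describes.

```python
import boto3

# Facial analysis (attribute classification), as distinct from facial
# recognition (identity matching). The image path is a placeholder.
client = boto3.client("rekognition")

with open("face_photo.jpg", "rb") as f:
    image_bytes = f.read()

# detect_faces is Rekognition's analysis API; Attributes=["ALL"]
# requests demographic estimates such as gender and age range.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    gender = face["Gender"]
    # The score expresses the model's certainty in its own label; a
    # misclassified face can still carry a high score, i.e. confident
    # in a falsehood. Note also that any face the detector misses
    # never appears in FaceDetails at all, the detection failure
    # Buolamwini's quote above describes.
    print(gender["Value"], gender["Confidence"])
    print(face["AgeRange"]["Low"], face["AgeRange"]["High"])
```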
The WCSO Public Information Officer later clarified that the WCSO uses Rekognition only for leads, and that decisions are ultimately made by officers. Gizmodo also reports that a Clackamas County Systems Project Analyst said in correspondence made public through an ACLU freedom-of-information request that Rekognition “documentation is very lacking or wrong.”
In a statement, an Amazon spokesperson wrote that the WCSO use of Rekognition is a good example of the technology’s capacity for helping law enforcement, and that it essentially performs the same process as was previously done much more slowly and less accurately by officers.
“From the start, the Washington County Sheriff’s Office has been fully transparent about its use of the technology and the policies that govern it, and it has engaged in ongoing, open dialogue with local legislatures, representatives, and the general public around its use,” the company statement says. “During the two years of its use, there has never been a single reported complaint from the public and no issues with the local constituency around their use of Rekognition. We are proud to work with all our law enforcement customers, including the Washington County Sheriff’s Office, in partnership with the public and local governments to aid investigations of dangerous crimes, assist law enforcement, and improve public safety.”
Smith, meanwhile, told Business Insider that preventing government agencies from using facial recognition is going too far.
“I do not understand an argument that companies should avoid all licensing to any government agency for any purpose whatsoever,” Smith says. “A sweeping ban on all government use clearly goes too far and risks being cruel in its humanitarian effect.”
While he recognizes risks associated with the technology’s use, particularly by law enforcement and certain governments, Smith cites its use in diagnosing rare diseases, such as DiGeorge syndrome. He also refers to the use of facial recognition by New Delhi police to attempt to find 5,000 missing children, but local reports and Delhi’s High Court have called the project’s effectiveness into question.