Amazon dismisses ACLU criticism of its facial recognition technology with totally different test results
Amazon has responded to a recent blog post by the American Civil Liberties Union (ACLU), in which the organization matched facial images of 28 members of the U.S. Congress to publicly available mug shots. In a blog post of its own, Amazon claims that the default confidence setting the ACLU used is inappropriate for the use case, and that the organization misinterpreted the results.
Amazon Web Services General Manager of Deep Learning and AI Dr. Matt Wood said in the post that a confidence threshold of 99 percent is recommended for the kind of comparison the ACLU performed, and is the level the company recommends law enforcement use. When Amazon repeated a similar experiment with the confidence threshold set to 99 percent, using a data set with more than 30 times as many faces for comparison, Rekognition misidentified no members of Congress.
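To illustrate why the threshold matters so much: a face comparison service returns candidate matches with similarity scores, and the caller decides the cutoff below which a candidate is not treated as a match. (In Amazon Rekognition's CompareFaces API this is the `SimilarityThreshold` parameter; the ACLU's test reportedly used the default setting.) Here is a minimal sketch of the filtering step, with entirely hypothetical names and scores:

```python
def filter_matches(candidates, threshold):
    """Keep only candidate matches at or above the confidence threshold.

    candidates: list of (name, similarity_percent) pairs.
    threshold: minimum similarity, in percent, to count as a match.
    """
    return [(name, sim) for name, sim in candidates if sim >= threshold]


# Hypothetical similarity scores, for illustration only.
candidates = [("person_a", 99.2), ("person_b", 93.5), ("person_c", 81.0)]

# At a lower threshold, all three candidates are reported as "matches".
print(filter_matches(candidates, 80))

# At the 99 percent threshold Amazon recommends for law enforcement,
# only the strongest candidate survives.
print(filter_matches(candidates, 99))
```

The same underlying scores thus produce very different match counts depending on the cutoff, which is the crux of the disagreement between the two tests.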
Wood also emphasized that Rekognition is “almost exclusively” applied to narrow the range of options in a given case rather than to make autonomous decisions, and that as a cloud-based machine learning application, it is constantly improving.
He noted that the facial database the ACLU used may itself have skewed the results, though this would appear to raise a further question, as no assurances are given about how Amazon ensures its customers do not make the same mistake. Wood refers to recent research from NIST concluding that facial recognition technology (which Wood says lags far behind Rekognition) can outperform humans at recognizing faces. NIST also found that the most accurate judgements were made by a combination of human experts and algorithms.
Wood does not suggest that governments should act as watchdogs for the development of the technology as a law enforcement tool, despite some media claims, but rather that it makes sense for governments to specify the confidence threshold that law enforcement agencies use.
The post is the second from Wood in response to a controversy that has included calls from rights groups and Amazon shareholders and employees to stop marketing the facial recognition technology to law enforcement agencies. The company recently responded that it will not make any such changes.
All biometrics should be tested by an objective, standards-based third party! Vendors should *not* be the performance verification authority for their own products. We’re in the *security* business, not opening up another dollar store. Security decisions can clearly have far-reaching, potentially disastrous outcomes, and neither the ACLU nor Amazon, in this case, should be making these evaluations, particularly without any testing and performance standards for guidance. Frankly, watching this unfold is mind-boggling.