A tale of two nations building out their biometric surveillance capabilities – the US and China
The governments of China and the United States are using similar strategies to train their facial recognition systems. They are paying people to be subjects for biometrics algorithms.
The two examples below illustrate how comfortable people can get with being raw material for real-time, AI-guided and analyzed surveillance systems.
The cases are not a one-to-one match, but the basic similarities are notable.
Consider first the experience of Cate Cadell, who lived and worked in China from 2014 to 2021 – the period when networks of hundreds of millions of face biometrics cameras ended any illusions about privacy and tolerance in that authoritarian state.
Cadell’s essay is a good read if only for marveling at how she amassed “thousands of purchase orders” for biometric surveillance systems from around China. She reports on how much control government officials wanted from AI. It is extreme.
Arguably just as significant is her memory of driving through a backwater village of dried-mud architecture where residents were lining up to stand in front of one of three phones strapped to tripods. Each person, at least one of whom was 100 years old, was brusquely told to pose.
Some held a paper photo of an anonymous person (eyes and nose cut out) over their face; others moved their heads as if their faces were being scanned by radar.
From her essay it seems as though everyone was enthusiastic about being part of an AI project, even though probably few in the village owned a phone. Maybe it was patriotism, maybe novelty, but most likely it was because everyone who participated got a ticket they could trade for cooking oil or pots and pans.
From a Western perspective, the government’s project seems cheap on multiple levels, and the villagers seem naïve or like they have heard enough about blanket AI surveillance to accept it willingly as the price to pay for a safe and stable society. That is how Beijing plays it, at least.
Then comes a local television report about the Oak Ridge National Laboratory, an organization that has long used patriotism to win the support and labor of its mostly rural home state of Tennessee.
The lab is looking for volunteers this month for a program called Biometric Recognition and Identity at Altitude and Range (BRIAR). Researchers in the year-old exercise want to develop AI algorithms that can perform whole-body biometric identification from far away and from a platform in the air.
The platforms could be thought of as watchtowers or drones.
Participants will be paid with gift cards worth up to $150.
The obvious connection to make is that volunteers will be making it easier to find, identify and, if necessary, apprehend terrorists. In largely conservative Tennessee, that would be a very acceptable explanation and motivation to participate.
Both populations can doubtless sleep well knowing that their assumptions are at least partly correct.
But, as Cadell points out in her essay, she later found a $370,000 purchase order for facial recognition surveillance systems to be deployed in that little village. The order had been approved 18 months before her serendipitous stop on her cross-country drive.