US military group wants weaponized deepfakes, better biometric tools
At least some in the U.S. military have heard enough about deepfakes and they want in.
Investigative-news publisher The Intercept has obtained a lengthy technology wish list that its editors believe was created by the U.S. Special Operations Command. Two items in the document are biometric in nature.
The command, most often referred to as SOCOM, performs the United States’ most secret and daring military missions. And officers want to add the ability to create and deploy deepfakes against those outside the country.
They also want to up their game in biometrically identifying individuals using, among other techniques, touchless fingerprint capture at long distances and in all environments. Officials also want rapid handheld DNA-collection gear. These items appear under the Biometrics heading in the document above.
In all cases, SOCOM wants to cut false positives and gain the ability to compare scanned biometrics against watch lists on handheld devices or in remote databases. Those handhelds will need to perform all common biometric analyses, including DNA comparisons.
But the showstopper is the unit’s deepfake ambitions (under the Military Information Support Operations heading in the document). The leaders of many advanced economies, including various agency heads in the United States, have publicly stated their wariness of deepfakes.
Many feel military deepfakes belong in a category of weapons that, by their nature, cannot be reliably controlled once unleashed. There is no end to the scourges that could result, including rape, biological and chemical attacks, and nuclear strikes.
Some military officers and experts think deepfakes can be interpreted as at least partly illegal under the international laws of war. They likely run afoul of Article 37 of Additional Protocol I to the Geneva Conventions, which prohibits perfidy.
One common example of perfidy is pretending to want to surrender. So is a soldier pretending to be wounded or to be a civilian.
The law is less clear in situations where nations, or even individual soldiers on the battlefield, might use a deepfake to convince civilians that a particularly heinous attack is imminent, creating panic at the very least.
A case can be made that perfidy has occurred in Ukraine, where a deepfake of the country’s president appeared, telling his nation to stand down in its defense against Russia’s invasion. It has been widely reported that Russian troops have tried to pass themselves off as civilians.
Brigham Young University law professor Eric Talbot Jensen wrote about this topic three years ago and concluded, “Deepfakes present an inevitable innovation” in war making.
Jensen offers few suggestions in his analysis for the scholarly publication Articles of War.
The international community has to judge which uses of deepfakes are illegal in war. And military leaders have to find uses for deepfakes that do not endanger civilian populations.