EU reaches informal agreement on restricting exports of biometric surveillance technology

The European Union has reached an agreement on limiting exports of facial recognition and other technology that can be used for surveillance, the Associated Press reports.
European companies will need to show they have met a set of requirements, including human rights impact assessments, to be granted a government license to export cyber-surveillance and ‘dual-use’ technologies.
A statement from the European Parliament mentions high-performance computers, drones, and certain chemicals among potential ‘dual-use goods.’
The agreement strengthens member states’ reporting obligations in the area, and members have agreed on rules for quickly bringing emerging technologies within the regulation’s scope. The agreement applies to a wider range of products and services than the Wassenaar Arrangement’s rules on dual-use goods, though, as expected, granting or denying licenses will ultimately remain the responsibility of national governments.
“Today is a win for global human rights,” says European Parliament Rapporteur Markéta Gregorová, the lead negotiator on the agreement. “We have set an important example for other democracies to follow. We will now have EU-wide transparency on the export of cyber surveillance and will control the export of biometric surveillance. Authoritarian regimes will no longer be able to secretly get their hands on European cyber-surveillance. We still do not have a level-playing field among EU countries but several new provisions allow for autonomous controls, better enforcement and coordination. I expect that member states’ obligation to uphold human rights and their own security will be the foundation of further work ahead.”
Negotiating delegation head Bernd Lange stated that some member states attempted to block the regulation updates.
Amnesty International called on the EU in September to tighten its export controls for biometrics and other technologies, in a report alleging that three EU-based companies had sold digital surveillance technology to Chinese entities.
The new rules will come into effect once the informal agreement is formally endorsed by the International Trade Committee and European Parliament, as well as the EC.
U.S. tech giants urge new administration to update regulations
Meanwhile, across the Atlantic, Microsoft is advising similar updates to U.S. export controls, partnering with OpenAI in a blog post to propose a “digital transformation” of the regulations.
The existing restrictions are based only on performance criteria, according to the post, and fail to take into account potential dual uses of technology, or to keep up with rapid technological development.
“Taking facial recognition as an example, the same digital biometrics technology, software and hardware capture and analyze information to identify people, whether for the purpose of finding a terrorist or a missing child versus finding an oppressed dissident or minority,” a Microsoft Partner and Associate General Counsel for Global Trade writes in the post.
Microsoft and OpenAI suggest the government set policies, from a national security perspective, for who can use sensitive technologies and for what purposes, and then enforce those policies within the technology itself. That means software features with real-time controls and tagging to ensure the same controls apply downstream, “hardware roots of trust,” tamper-resistant tools, and possibly AI techniques like those found in OpenAI’s GPT-3 neural language model.
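What that kind of in-product enforcement could look like is sketched below. This is only an illustrative sketch; the names (ExportPolicy, FaceMatcher, the use-case strings) are hypothetical and do not correspond to any actual Microsoft, OpenAI, or government API. The idea is that a sensitive capability runs a real-time policy check before it will operate, then tags its output so the same restriction can be re-checked downstream.

```python
# Illustrative sketch only: hypothetical names, not an actual Microsoft,
# OpenAI, or government API.
from dataclasses import dataclass


@dataclass(frozen=True)
class ExportPolicy:
    """Government-set policy: which licensee may use which capabilities."""
    licensee: str
    allowed_uses: frozenset = frozenset()


@dataclass
class TaggedResult:
    """Output carries the policy tag so downstream tools can re-check it."""
    payload: str
    licensee: str
    use_case: str


class FaceMatcher:
    """A sensitive capability gated by a real-time policy check."""

    def __init__(self, policy: ExportPolicy):
        self.policy = policy

    def match(self, image_id: str, use_case: str) -> TaggedResult:
        # Real-time control: refuse any use the license does not cover.
        if use_case not in self.policy.allowed_uses:
            raise PermissionError(
                f"use case '{use_case}' is not licensed for {self.policy.licensee}"
            )
        # Downstream tagging: the result records who ran it and for what purpose.
        return TaggedResult(
            payload=f"match-report-for-{image_id}",
            licensee=self.policy.licensee,
            use_case=use_case,
        )


if __name__ == "__main__":
    policy = ExportPolicy("example-licensee", frozenset({"missing-person-search"}))
    matcher = FaceMatcher(policy)

    print(matcher.match("img-001", "missing-person-search").payload)  # permitted use
    try:
        matcher.match("img-002", "mass-surveillance")  # refused in real time
    except PermissionError as err:
        print("blocked:", err)
```

Under this model, the license terms travel with the software and its outputs rather than living only in export paperwork, which is what would allow the same controls to be applied downstream.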
Beyond export controls, the same mechanisms could be applied to scalable supply chain security, corporate social responsibility, and customer-driven incentives.
As the U.S. government transitions to a new administration, NextGov reports on technology companies’ regulatory proposals to the incoming White House.
Microsoft President Brad Smith reiterated his call for a national law on facial recognition in a separate blog post. IBM CEO Arvind Krishna recommended creating a National Research Cloud for AI, and said measures should be put in place to prevent the use or export of facial recognition for mass surveillance or other human rights violations. In September, IBM also called for export restrictions targeting the type of face biometrics most likely to be used in mass surveillance systems.