Transparency standards for AI piloted in the UK
In an encouraging development for anyone developing, deploying or being surveilled by biometric systems, the United Kingdom government has published a transparency standard for AI.
An office within the UK government has drawn up a proposed standard for “algorithmic transparency” that would apply to government agencies and other public-sector bodies.
AI developers are expected to use the standard to be “meaningfully transparent” about how algorithms are used to support decisions.
The Central Digital and Data Office intends to run pilots with a limited number of public-sector bodies, collecting feedback along the way, according to a post by the office.
Few if any digital revolutions of the past have enjoyed (or suffered) the near-total absence of national guidelines that biometric recognition has experienced.
National guidelines are important because they can speed technology and product development and, in the case of transparency, at least try to address the public’s justifiable skepticism of AI’s virtues.
The standard was promised in the UK government’s National AI Strategy and National Data Strategy. It is a joint project of the Central Digital and Data Office and the Centre for Data Ethics and Innovation.
The standard being piloted divides its attributes into two tiers and provides definitions and a template for reporting information such as how data is used, how algorithms have been trained, and what data protection impact assessments have been performed.
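To make the shape of such a template more concrete, the sketch below models a two-tier transparency record in Python. The field names, tier contents and example values are illustrative assumptions based only on the attributes mentioned above; they are not the standard’s actual schema.

```python
from dataclasses import dataclass, asdict
import json

# Illustrative sketch only: the structure and field names below are
# assumptions inferred from the attributes described in the article,
# not the published standard's actual template.

@dataclass
class TierOne:
    # A short, plain-language summary aimed at the general public.
    tool_name: str
    how_the_tool_is_used: str
    why_the_tool_is_used: str

@dataclass
class TierTwo:
    # More detailed information for specialists, auditors and researchers.
    how_data_is_used: str
    how_the_algorithm_was_trained: str
    data_protection_impact_assessment_completed: bool

@dataclass
class AlgorithmicTransparencyRecord:
    organisation: str
    tier_one: TierOne
    tier_two: TierTwo

# Hypothetical example of a completed record.
record = AlgorithmicTransparencyRecord(
    organisation="Example public-sector body",
    tier_one=TierOne(
        tool_name="Claim triage model",
        how_the_tool_is_used="Ranks incoming claims for caseworker review.",
        why_the_tool_is_used="Reduces the time claimants wait for a decision.",
    ),
    tier_two=TierTwo(
        how_data_is_used="Historic claim records, pseudonymised before training.",
        how_the_algorithm_was_trained="Supervised learning on past claim outcomes.",
        data_protection_impact_assessment_completed=True,
    ),
)

# Serialise the record so it could be published alongside the service.
print(json.dumps(asdict(record), indent=2))
```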
There are other efforts to create standards, but they are so piecemeal as to strain the term “standard.” The EdSAFE AI Alliance, for example, was formed this fall to bring transparency to facial recognition proctoring systems.
In the United States, where the business culture abhors standards created by government, politicians debate laws to try to accomplish the same thing, typically with less success than if public and private sectors could agree on standards.