AI ethics keeps getting more complex and surprising
Talk about international curbs on face biometrics typically ignores two massive areas: China and Africa.
As China continues to use and sell facial recognition systems on a scale unprecedented anywhere on Earth, regulation gets no meaningful debate there.
And Africa continues to suffer the narrow-mindedness of governments and industries in developed economies. An article in The Conversation touching on AI development on the continent lists three AI and machine learning programs underway in African nations, while most people living north of the equator would be surprised that any such work is being done there at all.
But China surprised AI ethicists worldwide this week by endorsing draft United Nations recommendations intended to, among other things, convince signatory countries to ban AI for social scoring and mass surveillance.
The draft guidelines were favored by all 193 member countries of UNESCO (the U.N. Educational, Scientific and Cultural Organization), including China, according to reporting by Politico. The motivation is to create ethical norms for making, operating and selling AI generally, and facial recognition in particular.
China’s autocratic government pioneered the use of big data and face biometrics to assign a social score to each of its 1.4 billion citizens, a campaign that is ongoing. The scores are like credit scores, except that they rely on observation and analysis of far more of a person’s life.
The same day that China approved the draft language, two academics with South Africa’s University of the Witwatersrand scolded the world for not including Africa’s thoughts on ethical AI. Their white paper on the topic is here.
The pair pointed out how complex and necessary it will be to create a global framework for using facial recognition.
The European AI4People framework, for instance, is built from six other frameworks, yet not one of them places communitarian values above those of autonomy, which are, unsurprisingly, central to developed economies.
Not even lip service is being paid to this blind spot, according to the authors.