Facebook disables feature after algorithm racism incident
Facebook has apologized and disabled an artificial intelligence feature said to have generated an automated prompt labelling Black men as ‘primates’ beneath a video published by Britain’s Daily Mail, according to a report by The New York Times.
The tech giant apologized for the automated suggestion, which appeared on a video dating from June 27, 2020, calling it “an unacceptable error.” The recommendation feature relies on object recognition, and its failure to correctly identify people may stem from faulty biometrics-based face detection.
Facebook also said it is working to ensure that no similar incident occurs again. The company reportedly holds one of the largest databases of user-uploaded facial photos, if not the largest, with which it can train its object and facial recognition algorithms.
As described by the Times, the video in question featured footage of Black men in altercations with white police officers and civilians, beneath which an automated message appeared reading “keep seeing videos about Primates?”
The incident has since been sharply criticized. The New York Times quoted Facebook spokeswoman Dani Lever as expressing regret over the episode: “As we have said, while we have made improvements to our A.I., we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
Meanwhile, the New York Times report recalls previous biometrics-related instances in which Facebook, along with other big tech companies such as Google and Amazon, has been drawn into racism-related controversies. Facebook, for instance, has been urged in the past to address bias in the training of its facial recognition algorithms, especially with respect to the faces of Black people and other people of color.
Apart from a 2015 Google Photos incident in which photos of Black people were labelled as ‘gorillas,’ the Times article mentions an internal racism incident at Facebook in 2016 involving the Black Lives Matter phrase in the United States, and a more recent case on Facebook-owned Instagram in which three players on England’s squad at the 2020 European Championship suffered racial abuse on the network after missing penalty kicks.
Earlier this year, Facebook announced a new self-supervised computer vision model expected to improve the accuracy of its object recognition.
The company recently joined the ID2020 Alliance Technical Advisory Committee to make its own contributions toward building better standards for digital identity.