Buolamwini, Gebru and Raji win AI Innovation Award for research into biometric bias
Biometrics researchers Joy Buolamwini, Timnit Gebru, and Inioluwa Raji have won the AI for Good category in VentureBeat’s AI Innovation Awards for their research into algorithmic bias in facial recognition.
Retail robot provider Bossa Nova won the Business Application category for its robot workers, which use computer vision and facial recognition software, plus RGB photos and point clouds, to carry out inventory management tasks. They are already working in dozens of Walmart locations, according to VentureBeat, stocking shelves at 0.4 MPH.
In the NLP/NLU (natural language processing/natural language understanding) Innovation category, Denmark’s Corti won for its pattern recognition system which identifies when emergency calls involve a cardiac event. The technology allows such events to be understood more accurately than by human operators alone, and 30 seconds faster.
The Computer Vision category was won by Vue.ai, which enables retail customers to virtually try on clothes. Xnor took the Startup Spotlight award for its “AI everywhere, on every device” technology, which includes the AI2Go platform, offering optimized, prebuilt on-device AI models to deliver deep learning at the edge on low-power and inexpensive devices.
There were four nominees in each of the five categories for the awards, which were handed out at VentureBeat’s Transform 2019 event.
Buolamwini and Gebru kicked off a debate which has consumed a large amount of attention in both the biometrics industry and the media with last year’s academic study of data sets as a cause of algorithmic bias. A subsequent study by Buolamwini and Raji showed significant improvements in some systems, but also that significant discrepancies in system accuracy across demographics persist. The research was panned by Amazon as a misuse of facial analysis software to perform facial recognition, but a group of prominent AI researchers backed the study and called for the company to stop providing facial recognition to police.