Online tool exposes whether face biometrics have been trained with your photos
Exposing.AI is an online tool that helps users track whether their pictures have been used to train AI face biometrics systems. The tool is featured in a recent New York Times article tracing face biometrics technology as we know it back to its earliest data sets, which originated on image-sharing platforms like Flickr.
In a nutshell, Exposing.AI searches the internet for Flickr images users may have posted and compares them to existing face biometrics datasets. This, in turn, allows users to find out whether their images have been used to train facial recognition software.
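The core idea can be illustrated with a short sketch. Because face datasets scraped from Flickr typically record the source photo URLs, a lookup can test whether a user's photo ID appears in a dataset manifest. The function names and manifest format below are illustrative assumptions, not the tool's actual implementation.

```python
def flickr_photo_id(url):
    """Extract the numeric photo ID from a Flickr photo URL.

    Flickr photo URLs look like
    https://www.flickr.com/photos/<user>/<photo_id>/
    """
    parts = [p for p in url.rstrip("/").split("/") if p]
    for part in reversed(parts):
        if part.isdigit():
            return part
    return None


def find_exposures(user_urls, dataset_manifests):
    """Return {dataset_name: [matched photo IDs]} for the user's photos.

    dataset_manifests maps a dataset name to the list of source URLs
    it was built from (a hypothetical manifest format).
    """
    user_ids = {flickr_photo_id(u) for u in user_urls} - {None}
    hits = {}
    for name, manifest_urls in dataset_manifests.items():
        dataset_ids = {flickr_photo_id(u) for u in manifest_urls}
        matched = sorted(user_ids & dataset_ids)
        if matched:
            hits[name] = matched
    return hits
```

For example, if a dataset manifest lists `https://www.flickr.com/photos/alice/123456789/` and the user posted that same photo, `find_exposures` reports the match under that dataset's name. The real service handles far larger manifests and fuzzier matching, but the lookup principle is the same.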
The tool was developed by Surveillance Technology Oversight Project technology director Liz O’Sullivan and artist and researcher Adam Harvey. “People need to realize that some of their most intimate moments have been weaponized,” said O’Sullivan. Used properly, Exposing.AI could help increase transparency and perhaps assist in seeking greater accountability and data protection.
But rather than using face biometrics, Exposing.AI itself relies only on online data to pinpoint users’ posted images and link them to data sets. This restriction is mainly a precautionary measure to prevent the tool from being abused. Platforms such as Flickr were fertile ground for large-scale image collection because many photos there are published under Creative Commons licenses.
The readily accessible images found on Flickr presented the perfect image database and as such became the backbone of much early face biometrics development. These vast data sets allowed scientists to train artificial intelligence to run accurate and efficient face biometrics checks. One prominent example was MegaFace, a previously publicly available image library used by various organizations and businesses.
MegaFace was one of the first publicly available collections of images originally captured from dating sites, social networks, and even surveillance cameras. The writers noted that while most images were captured without explicit consent, the collection practices were perfectly legal at the time. The project, run by researchers at the University of Washington, was originally envisioned to spur open-source research into face biometrics. Yet these vast oceans of data, provided free of charge, were a gold mine for those who knew how to put them to work. And so, the Times writes, several other entities soon took advantage of this vast digital library.
Among MegaFace’s more than 6,000 downloads were scores of private companies and government agencies from around the world. Users include Northrop Grumman, TikTok parent company ByteDance, and Chinese facial recognition unicorn Megvii.
MegaFace has since been taken offline, but this move comes after years of circulation, during which it helped shape global biometrics and surveillance initiatives.