Ever AI under fire for using photo app images to train facial biometrics without adequate transparency
Images uploaded by customers of photo storage app Ever to create albums in the cloud were used to develop the company's Ever AI biometric facial recognition technology, sparking allegations that the company violated user privacy, NBC News reports.
Ever AI says that it draws on a private dataset of 13 billion photos and videos to train its technology. The company has also touted the accuracy of its algorithms in NIST FRVT testing, positioning itself as a leading Western challenger to AI industry leaders in China and Russia.
NBC News notes that Ever AI markets its facial recognition to law enforcement agencies, an area that has generated significant controversy for several other vendors, though its contracts so far are all with private companies.
Ever CEO Doug Aley told NBC News that he joined the company in 2016, and that it shifted its focus to facial recognition shortly before its successful $16 million funding round in 2017.
How the datasets used to train and test AI systems are gathered has come under increasing scrutiny recently, with previous reports suggesting that images used to train facial recognition are often collected without permission, and with public revelations about the human annotation of Alexa recordings.