FTC declares facial recognition surveillance tech dangerous, warns against federal privacy pre-emption
The U.S. Federal Trade Commission has come out swinging against facial recognition, issuing a proposed settlement with Paravision that Law Street reports is the agency's first focused on misuse of the biometric technology, and taking a position against federal privacy law pre-empting existing state legislation.
FTC Commissioner Rohit Chopra declared that “Today’s facial recognition surveillance technologies are discriminatory and dangerous” in a Twitter announcement of the settlement, and his statement on the proposed settlement of a complaint about Paravision’s use of photos uploaded to the Ever app to train its face biometric algorithms says lobbyists in Washington are attempting “to delete state data protection laws.”
Chopra describes the move as “an important course correction,” making reference to past agency settlements with Google and Facebook which allowed them to retain their facial recognition algorithms and other technologies “enhanced by illegally obtained data.”
In stark contrast to the above tech giants, the FTC says it has ordered Paravision to “delete the facial recognition technologies enhanced by any improperly obtained photos,” as well as all photos and videos of Ever users who deactivated their accounts, and derived face data.
The FTC alleges that Everalbum, which operated Ever and owns Paravision, suggested to users in 2018 that a facial recognition tagging feature of its app would not be applied unless they opted in, but kept it on by default until April 2019 for all users other than those in Illinois, Texas, Washington and the EU. Further, the FTC says no opt-out was available to users outside those jurisdictions, which have data privacy laws that differ from those applying throughout the rest of the U.S.
“The FTC Consent Order reflects a change that has already taken place. The Ever service was closed in August 2020 and the company has no plans to run a consumer business moving forward. In September 2020, Paravision released its latest-generation face recognition model which does not use any Ever users’ data. The consent order mirrors the course we had already set and reinforces a mindful tone as we look ahead,” a Paravision representative told Biometric Update in a statement over email.
“Face recognition and computer vision technology have the potential to improve our lives in profound ways and we take the gravity of its impacts extremely seriously. Paravision has been repeatedly recognized by the U.S. Government through NIST as the most accurate provider of face recognition from the U.S., UK, and Europe. We look forward to maintaining this position with our latest generation model, and are deeply committed to the ethical development and use of this technology.”
Under the agreement, Everalbum would be prohibited from misrepresenting its data collection and use practices, on penalty of $43,280 per violation. Otherwise, the company is not required to pay a penalty, which Chopra calls “unfortunate.”
Chopra's statement also refers to facial recognition technology as “fundamentally flawed” and says it reinforces bias.
The commission voted 5 to 0 to issue the proposed settlement and to accept a consent agreement with the company. The agency will publish a description of the agreement for public comment, and after 30 days decide whether to make the proposed consent order final.