Chameleon AI masks faces from scraping while preserving image quality
If the first wave of the current AI deluge created complicated challenges, the second wave has often been about using AI to solve those challenges. ChatGPT and its ripple effect have cast “humanness” into doubt – and now we have World ID to verify who counts as human. Bots begat bot detection; GenAI deepfakes begat deepfake detection.
If it all feels a bit cyclical and cyclonic – if you’d rather just hide behind a mask – lo and behold, AI has the answer. Chameleon is a new AI model that generates a virtual “personalized privacy protection” mask, or P3-Mask, to protect against facial recognition.
“Chameleon learns the facial signature of the user (protectee) to generate a P3-Mask, which can be applied to protect any facial images before sharing them online against unauthorized FR,” says research from the University of Hong Kong and the Georgia Institute of Technology. The paper describes the process as follows:
“First, we use a cross-image optimization to generate one P3-Mask for each user instead of tailoring facial perturbation for each facial image of a user. It enables efficient and instant protection even for users with limited computing resources. Second, we incorporate a perceptibility optimization to preserve the visual quality of the protected facial images. Third, we strengthen the robustness of P3-Mask against unknown FR models by integrating focal diversity-optimized ensemble learning into the mask generation process.”
In effect, Chameleon tricks facial recognition scanners into thinking a photo of someone shows somebody else. Cross-image optimization makes the mask fast to apply and reusable across all of a user’s photos, perceptibility optimization preserves the visual quality of the protected image, and the focal diversity-optimized ensemble makes the mask robust against facial recognition models it has not seen before.
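For readers who want a concrete picture, here is a minimal, hypothetical sketch of how a scheme along these lines could look. It is not the authors’ code: the function names, loss weights, and surrogate-model ensemble are placeholder assumptions, and the paper’s focal diversity-optimized ensemble selection is not reproduced – the sketch simply averages an evasion loss over a plain ensemble while a quality penalty keeps the perturbation visually small.

```python
# Hedged sketch (not the authors' code): one additive perturbation ("mask") is
# optimized across all of a user's photos, pushing embeddings away from the
# user's identity for an ensemble of surrogate FR models, while an MSE penalty
# and a small pixel budget keep the change visually imperceptible.
import torch
import torch.nn.functional as F

def generate_p3_mask(user_images, fr_models, steps=200, lr=0.01,
                     eps=8 / 255, quality_weight=10.0):
    """user_images: (N, 3, H, W) photos of one user, values in [0, 1].
    fr_models: list of frozen surrogate face-recognition encoders."""
    # One mask per user (cross-image), broadcast over all of that user's photos.
    mask = torch.zeros_like(user_images[:1], requires_grad=True)
    opt = torch.optim.Adam([mask], lr=lr)

    # Reference identity embeddings from the clean photos (what we want to evade).
    with torch.no_grad():
        targets = [F.normalize(m(user_images), dim=-1) for m in fr_models]

    for _ in range(steps):
        protected = (user_images + mask).clamp(0, 1)
        evasion_loss = 0.0
        for model, tgt in zip(fr_models, targets):
            emb = F.normalize(model(protected), dim=-1)
            # Minimize cosine similarity with the user's own identity.
            evasion_loss = evasion_loss + (emb * tgt).sum(dim=-1).mean()
        # Perceptibility term: keep protected images close to the originals.
        quality_loss = F.mse_loss(protected, user_images)
        loss = evasion_loss + quality_weight * quality_loss
        opt.zero_grad()
        loss.backward()
        opt.step()
        # Keep the perturbation within a small budget so artifacts stay invisible.
        with torch.no_grad():
            mask.clamp_(-eps, eps)

    return mask.detach()  # reusable: add it to any new photo of the same user
```

Because the mask is optimized once per user rather than per image, applying it to a freshly taken photo is just an addition – which is what makes “instant protection even for users with limited computing resources” plausible.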
And it works: “extensive experiments on two benchmark datasets show that Chameleon outperforms three state-of-the-art methods with instant protection and minimal degradation of image quality.”
Post that perfect selfie to the ’gram without worry
The researchers, who intend to post Chameleon’s code publicly on GitHub, believe the tool can be useful protection against data scraping – harvesting face data from the public Internet for massive training datasets without users’ consent. They explicitly call out Clearview and PimEyes as firms that “have collected billions of online images and can recognize millions of citizens without their consent.”
While masking tools exist, they often leave clear artifacts or other distortions. By preserving image quality, Chameleon makes it possible to put an invisible mask over an image before posting it to social media, for instance, or to apply it to a headshot required for promotional purposes.
Facial authentication firms, however, need not worry: the tool allows users to grant trusted third parties access to their P3-Mask to de-obfuscate the protected image.
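The paper’s summary doesn’t spell out the mechanics of that hand-off, but if the mask is an additive perturbation as in the sketch above, a trusted holder of the mask could approximately reverse the protection by subtracting it back out. The helper names below are hypothetical.

```python
import torch

# Hypothetical helpers building on the sketch above: the mask is assumed to be
# an additive perturbation, so a trusted party holding it can approximately
# undo the protection. Recovery is only approximate wherever the protected
# image was clipped at the [0, 1] pixel-range boundaries.
def protect(image: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    return (image + mask).clamp(0, 1)

def deobfuscate(protected: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    return (protected - mask).clamp(0, 1)
```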
Article Topics
biometrics | Chameleon | data privacy | face biometrics | facial recognition