Fashion and makeup designed to throw off biometric algorithms
Surveillance Technology Oversight Project (STOP) Executive Director Albert Fox Cahn believes creative cosmetics and facial makeup can “throw off many of the (existing facial recognition) algorithms.”
STOP’s mission is “to end discriminatory surveillance” by “challeng(ing) both individual misconduct and broader systemic failures” through the “craft(ing of) policies that balance new technologies and age-old rights.” The group “litigates and advocates for privacy, fighting excessive local and state-level surveillance” and “highlights the discriminatory impact of surveillance on Muslim Americans, immigrants, and communities of color.” Its vision is “to ensure that technological advancements don’t come at the expense of age-old rights,” and “to transform New York City and state into models for the rest of the United States of how to harness novel technologies without adversely impacting marginalized communities.”
STOP is pursuing litigation under New York’s Freedom of Information Law to compel the New York City Transit Authority to produce documents requested about the “apparent installation of facial recognition surveillance in the Times Square/Port Authority Subway Station.” The group “originally requested the documents on April 19th, 2019, following a tweet from New York Times employee Alice Fung, revealing a Wisenet brand display with yellow squares drawn over the faces of transit riders as they entered the station. The MTA publicly denied that the Wisenet brand monitors had facial recognition capability, but they then refused to provide any of the documents that would confirm their claims.”
The litigation is ongoing.
Cahn also says technology can be both “very powerful but also very fragile,” and explained that “(e)ven placing small amounts of make-up could trick an algorithm into finding no match or a match with a different face.”
“This includes blocking out shapes in geometric patterns or obscuring key features such as the eyes or nose-bridge or experimenting with Juggalo clown make-up,” the organization’s website says.
Privacy and civil rights activists and protesters have increasingly been experimenting with various paints, makeup, and prosthetics to see how well they can defeat, or at least confuse, facial recognition algorithms.
This week, for example, the CV Dazzle – short for “computer vision dazzle” – website announced that it anticipates releasing its latest purported biometrics-defeating software at the Haus der elektronischen Künste Basel (HeK, House of Electronic Arts Basel) in April, with assistance from a “privacy grant” from NLnet, a group that provides financial grants “supporting organizations and people that contribute to an open information society” who have “ideas to fix the Internet.” A search of the HeK website found no listing of a grant to Harvey or CV Dazzle, but grants for NLnet’s Privacy & Trust Enhancing Technologies program will not be announced until April 1, according to that group.
The announcement of a new CV Dazzle likely will be made during HeK’s Making FASHION Sense exhibition, which showcases the radical transformation of fashion through technology. The event is described as an “exhibition (that) explores technology as a transformative tool for artists, designers, as well as for the wearers of clothing, generating a reinvention of fashion systems. While hyper-functional materials already monitor our biometric data in everyday life and sports activities, this exhibition showcases artists and designers who develop experimental augmented fashion objects, investigating new perceptions of our environment and human interaction which make us think in new ways. Using new materiality, they create creative fashion processes that stimulate the human senses, perceive the wearers and their surroundings, change our perspectives, and make sense in the current geopolitical context.”
HeK describes itself as a venue “dedicated to digital culture and the new art forms of the information age … a place for creative and critical discourse on the aesthetic, socio-political and economic impact of media technologies.”
Artist turned privacy rights activist Adam Harvey created CV Dazzle as an “open-source anti-facial recognition toolkit” – originally developed as a master’s thesis project at New York University in 2010 – which, according to its website, “explores how fashion can be used as camouflage from face-detection technology, the first step in automated face recognition.”
The site explains that “The name is derived from a type of World War I naval camouflage called Dazzle, which used cubist-inspired designs to break apart the visual continuity of a battleship and conceal its orientation and size. Likewise, CV Dazzle uses avant-garde hairstyling and makeup designs to break apart the continuity of a face. Since facial-recognition algorithms rely on the identification and spatial relationships of key facial features, like symmetry and tonal contours, one can block detection by creating an ‘anti-face.’”
But these movements have also launched a new wave of technological countermeasures, both in the private sector as well as within the U.S. military.
The CV Dazzle site now states that it “will be relaunched in 2020” because “much of the content here was developed for the haar cascade face detection algorithm, which was widely used between 2010-2015, but has now been deprecated as neural networks face detection algorithms have become more widespread.”
According to Harvey’s website, the OpenCV face detector (whose Haar cascade method is based on the Viola-Jones algorithm) had been “one of the most widely used face detectors,” using an “algorithm (that) performs best for frontal face imagery and excels at computational speed. It’s ideal for real-time face detection and is used widely in mobile phone apps, web apps, robotics, and scientific research.”
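Viola-Jones-style detectors of this kind score faces with simple “Haar-like” rectangle-contrast features, computed rapidly via an integral image; it is precisely this reliance on local tonal contrast that high-contrast “anti-face” makeup aims to disrupt. The following is a minimal illustrative sketch of one such feature in Python with NumPy – a simplified teaching example, not OpenCV’s actual implementation:

```python
import numpy as np

def integral_image(img):
    # Cumulative sum over both axes; lets the sum of any rectangle
    # be computed in O(1) from four corner lookups.
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, top, left, h, w):
    # Sum of pixels in the rectangle [top:top+h, left:left+w],
    # read off the integral image with the usual corner corrections.
    total = ii[top + h - 1, left + w - 1]
    if top > 0:
        total -= ii[top - 1, left + w - 1]
    if left > 0:
        total -= ii[top + h - 1, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total

def two_rect_haar_feature(img, top, left, h, w):
    # A horizontal two-rectangle Haar-like feature: the difference
    # between the left and right halves of a window. A large
    # magnitude indicates a strong local contrast edge, the kind of
    # cue dazzle-style makeup tries to inject or erase.
    ii = integral_image(img.astype(np.int64))
    half = w // 2
    left_sum = rect_sum(ii, top, left, h, half)
    right_sum = rect_sum(ii, top, left + half, h, half)
    return left_sum - right_sum

# Toy 4x4 "image": bright left half, dark right half.
img = np.array([
    [9, 9, 1, 1],
    [9, 9, 1, 1],
    [9, 9, 1, 1],
    [9, 9, 1, 1],
])
print(two_rect_haar_feature(img, 0, 0, 4, 4))  # prints 64 (strong contrast edge)
```

A real cascade evaluates thousands of such features at many scales and positions, rejecting non-face windows early; painting unexpected contrast edges across the face shifts these feature values away from what the trained cascade expects.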
Recently, the Ludwigshafen, Germany-based software company trinamiX GmbH, a subsidiary of BASF SE, developed skin-detection technology that uses Beam Profile Analysis for mobile security purposes, intended to defeat the very sorts of novel methods that privacy rights activists, and even hostile governments and criminal organizations, have used with some success, as mentioned above.