
Reflectacles develops eyewear to bypass facial recognition systems


Chicago company Reflectacles believes it has a solution to concerns about the widespread deployment of facial biometrics. The company is developing IRpair and Phantom eyewear, which use special lenses and optical filters to block facial recognition, tracking and infrared facial mapping. “Both block 3D infrared facial mapping during both day and night and block 2D video algorithm-based facial recognition on cameras with infrared for illumination,” according to ChicagoINNO. According to the Kickstarter page, the glasses will ship in April 2020, and the campaign has already doubled its funding goal, raising $34,000.

Founder Scott Urban says international customers have already shown interest in the glasses, hoping to bypass facial recognition algorithms, particularly at political protests.

“[It’s about] continuing the process of how to make the best way to block facial recognition but then also to make it seemingly normal as possible,” Urban said.

“You’re largely buying Reflectacles not for the purpose of committing crime or you don’t want the government to be seeing you. It’s kind of a political speech,” said Rajiv Shah, data scientist at DataRobot and adjunct assistant professor at the University of Illinois-Chicago. “You want to stand up and point the people around you that, ‘Hey, this is something we all need to think about.’”

It’s not just Americans who are working on spy-like gadgets to bypass facial recognition identification. A group of white hat researchers from Lomonosov Moscow State University and Huawei Moscow Research Center came up with a wearable card to confuse the technology, writes Synced. Their technique, called “AdvHat,” is presented in a paper titled “AdvHat: Real-World Adversarial Attack on ArcFace Face ID system.”

The method was tested with full-face photos under different lighting conditions, viewpoints and facial rotations to “change the input to an image classifier so the recognized class will shift from correct to some other class.” The attack consists of a simple color sticker fixed to a hat, which reduces recognition accuracy by creating a raised-eyebrow effect, confirming that machine learning algorithms are prone to error when exposed to adversarial examples.
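The core idea behind such attacks can be illustrated in a few lines. The sketch below is a generic, untargeted adversarial perturbation against a toy linear classifier, not the AdvHat method itself (which optimizes a printable sticker against the ArcFace embedding network under physical constraints); all names and values here are illustrative assumptions.

```python
import numpy as np

# Toy "classifier": two classes, linear scores s = W @ x.
# This stands in for a face recognition model; AdvHat attacks a deep
# network (ArcFace), but the adversarial principle is the same.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 16))     # hypothetical weight matrix
x = rng.normal(size=16)          # hypothetical input (stand-in for an image)

def predict(v):
    return int(np.argmax(W @ v))

orig_class = predict(x)
other = 1 - orig_class

# For a linear model, the gradient of the score margin
# (correct class minus competing class) w.r.t. the input is simply
# the difference of the two weight rows.
grad = W[orig_class] - W[other]

# Fast-gradient-style step: nudge the input against the margin.
# A small epsilon keeps the perturbation subtle, mirroring how a
# sticker changes only a small region of a face image.
eps = 0.5
x_adv = x - eps * np.sign(grad)

print("margin before:", float(grad @ x))
print("margin after: ", float(grad @ x_adv))
```

Because the model is linear, the margin drops by exactly `eps * ||grad||_1`, so even a bounded perturbation can push the input across the decision boundary; deep networks are attacked the same way, just with gradients obtained by backpropagation.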
