
Reflectacles develops eyewear to bypass facial recognition systems

 

Chicago company Reflectacles believes it has the solution to concerns about the widespread deployment of facial biometrics. The company is developing IRpair and Phantom eyewear, which use special lenses and optical filters to block facial recognition, tracking and infrared facial mapping. “Both block 3D infrared facial mapping during both day and night and block 2D video algorithm-based facial recognition on cameras with infrared for illumination,” according to ChicagoINNO. Per the Kickstarter page, the glasses will be delivered in April 2020, and the company has doubled its funding goal, raising $34,000.

Founder Scott Urban says international customers hoping to bypass facial recognition algorithms, especially at political protests, have already shown interest in the glasses.

“[It’s about] continuing the process of how to make the best way to block facial recognition but then also to make it seemingly normal as possible,” Urban said.

“You’re largely buying Reflectacles not for the purpose of committing crime or you don’t want the government to be seeing you. It’s kind of a political speech,” said Rajiv Shah, data scientist at DataRobot and adjunct assistant professor at the University of Illinois-Chicago. “You want to stand up and point the people around you that, ‘Hey, this is something we all need to think about.’”

It’s not just Americans who are working on spy-like gadgets to bypass facial recognition identification. A group of white-hat researchers from Lomonosov Moscow State University and the Huawei Moscow Research Center came up with a wearable sticker to confuse the technology, writes Synced. Their technique, called “AdvHat,” is presented in the paper “AdvHat: Real-World Adversarial Attack on ArcFace Face ID system.”

The method was tested on full-face photos under different lighting conditions, viewpoints and facial rotations to “change the input to an image classifier so the recognized class will shift from correct to some other class.” The attack consists of a simple color sticker fixed to a hat, which reduces recognition accuracy by creating a raised-eyebrow effect, confirming that machine learning algorithms are prone to error when exposed to adversarial examples.
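
For a sense of how this kind of sticker attack works in principle, below is a minimal sketch of the core optimization loop, written in PyTorch under some assumptions: “embedder” stands in for any ArcFace-style face-embedding network, “face” is an aligned 112x112 face crop, and the sticker is modeled as a flat rectangle pasted onto the forehead region. The real AdvHat method also models how the printed sticker bends and sits on a hat; this sketch omits that step and is not the authors’ implementation.

import torch
import torch.nn.functional as F

def attack_patch(embedder, face, patch_box, steps=300, lr=0.01):
    # face: (1, 3, 112, 112) tensor in [0, 1]
    # patch_box: (top, left, height, width) region on the forehead where the sticker goes
    top, left, h, w = patch_box
    patch = torch.rand(1, 3, h, w, requires_grad=True)  # sticker texture to optimize
    target = embedder(face).detach()                     # embedding of the unmodified face
    opt = torch.optim.Adam([patch], lr=lr)

    for _ in range(steps):
        adv = face.clone()
        adv[:, :, top:top + h, left:left + w] = patch    # "paste" the sticker onto the face
        emb = embedder(adv)
        loss = F.cosine_similarity(emb, target).mean()   # similarity to the true identity
        opt.zero_grad()
        loss.backward()                                  # gradient pushes similarity down
        opt.step()
        with torch.no_grad():
            patch.clamp_(0.0, 1.0)                       # keep the texture printable

    return patch.detach()

# Hypothetical usage: a 30x80 pixel sticker region near the top of an aligned face crop.
# sticker = attack_patch(arcface_model, aligned_face, patch_box=(8, 16, 30, 80))

Minimizing the cosine similarity between the stickered face and the wearer’s original embedding is what makes the system stop matching the person; a real-world version would also need regularization so the pattern survives printing and camera capture.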


