Peace sign can be used as a weapon against your identity

 

This is a guest post by Robert Capps, VP of business development at NuData Security

It seems that nothing is sacred, not even the symbol of peace, proving that hackers will wreck just about anything. Last week, a Japanese researcher claimed that he could steal fingerprints from selfies, warning us all that flashing our fingers in photos can imperil our identities.

Isao Echizen, a professor in the Digital Content and Media Sciences Research Division of the National Institute of Informatics, revealed in an interview with the Japanese newspaper Sankei Shimbun that he had successfully lifted fingerprints from photographs of exposed fingers.

This may sound like something out of an episode of CSI, where the zoom-and-enhance technique is frequently used, but modern-day cameras – including those on your mobile device – are capable of capturing shots that could be a goldmine for criminals wanting to steal your identity.

Echizen specifically warned against the popular peace sign pose, as this could easily give any would-be identity thieves the opportunity to match fingerprints with other physical biometric markers, such as a photo of the iris/retina.

Echizen’s right, and we should take heed of his warning. It’s a fact that we shed physical biometric data wherever we go, leaving fingerprints on everything we touch and posting selfies and videos with friends and family on social media. Much of this information can be captured by fraudsters. Fingerprints can be stolen from doorknobs and glass and easily replicated. Rather worryingly, there are even sites on the internet that will give you a step-by-step guide on how to do this.

As Echizen demonstrates with this zoom-and-enhance technique, high-resolution photos can be taken from great distances and still capture enough detail to copy a physical biometric. The threat was also brought to wide-scale attention by Jan “Starbug” Krissler when he used a photo of Angela Merkel to defeat an iris biometric test at a security conference in 2015.

While physical biometrics will always have a place when it comes to in-person user authentication, consumers bear additional risk in using physical biometrics online. The most significant of these risks is that the data, whether it’s a photo, a scan, or even a DNA test, becomes a static identifier that can never be changed. In their digital form, physical biometrics can be stolen, traded, and potentially reused to impersonate the legitimate user.

Once biometric data is stolen and resold on the Dark Web, the risk of inappropriate access to a user’s accounts and identity will persist for that person’s lifetime. As the most stringent authentication processes, such as those used in immigration and banking, deploy physical biometrics, this data will become highly desirable to hackers. We can expect increasingly creative attempts by hackers to capture this information.

So, what happens if this type of biometric theft occurs? If someone steals your password, you can change it to a new one. But you can’t change your biometric identity – so what then?

There is a bright side! We don’t have to live in a world where people can steal our unique biometric data to access our accounts or steal our identity. Fortunately, not all types of biometrics used to authenticate online interactions are the same. A much less invasive, and more consumer-friendly, technique leverages signals generated by the way in which we interact with the online world around us. This is called passive behavioural biometrics.

Think about how you use your smartphone, laptop, or computer to interact with websites and applications. Do you realize that you have a unique way of holding your device that’s different from other people, if only slightly? Does your phone tilt a little to the left? Do you normally hold your phone in portrait or landscape mode? Do you use your index fingers or thumbs to type? How hard do you press on the screen when you hit each key? Passive behavioural biometrics collects this data, plus hundreds of other unique data points and interactions, without storing any personally identifiable data about you.
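
To make that concrete, here is a rough sketch of the kind of low-level interaction signals a passive behavioural biometrics layer might sample in a web browser, using standard Web APIs for device tilt, screen orientation, touch pressure and keystroke timing. This is an illustrative, hypothetical example only; the field names are invented, and it is not a description of NuData’s product.

```typescript
// Hypothetical sketch: the sort of raw interaction signals a passive
// behavioural biometrics layer could sample in the browser. Field names
// are invented for illustration; this is not NuData's implementation.

interface BehaviouralSample {
  tiltBeta: number | null;   // front-to-back device tilt, in degrees
  tiltGamma: number | null;  // left-to-right device tilt, in degrees
  orientation: string;       // e.g. "portrait-primary" or "landscape-primary"
  touchForce: number | null; // pressure of the last touch (0..1, where supported)
  keyDwellMs: number | null; // how long the last key was held down
}

const sample: BehaviouralSample = {
  tiltBeta: null,
  tiltGamma: null,
  orientation: screen.orientation?.type ?? "unknown",
  touchForce: null,
  keyDwellMs: null,
};

// Device tilt: does the phone lean a little to the left?
window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  sample.tiltBeta = e.beta;
  sample.tiltGamma = e.gamma;
});

// Touch pressure: how hard does the user press the screen?
window.addEventListener("touchstart", (e: TouchEvent) => {
  sample.touchForce = e.touches.item(0)?.force ?? null;
});

// Keystroke dwell time: how long each key is held, a classic typing-rhythm signal.
const keyDownAt = new Map<string, number>();
window.addEventListener("keydown", (e: KeyboardEvent) => {
  keyDownAt.set(e.code, performance.now());
});
window.addEventListener("keyup", (e: KeyboardEvent) => {
  const down = keyDownAt.get(e.code);
  if (down !== undefined) sample.keyDwellMs = performance.now() - down;
});
```

In a real deployment these samples would presumably be aggregated over many sessions; the point here is simply that the signals come from ordinary interaction rather than from anything the user has to enroll in.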

Collecting these signals creates a unique profile for each authentic user, which is then analyzed for risk. By understanding how good users behave, organizations can easily identify when the account owner is not the one attempting to authenticate, even if the correct login and password are presented from the authentic account holder’s device.
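
The analysis step can be pictured, in very simplified form, as comparing a new session’s signals against the statistics of the account’s past behaviour and flagging large deviations. The toy example below assumes a hypothetical per-feature profile of means and standard deviations and scores a session by its average deviation; real systems use far richer signals and models.

```typescript
// Toy risk scoring against a stored behavioural profile. The profile format,
// feature names and threshold are assumptions for illustration only.

interface FeatureStats {
  mean: number;   // average value observed in past sessions
  stdDev: number; // typical spread around that average
}

type Profile = Record<string, FeatureStats>;

// Average absolute z-score across the features the profile knows about:
// the higher the score, the less the current session looks like the owner.
function riskScore(profile: Profile, current: Partial<Record<string, number>>): number {
  let total = 0;
  let count = 0;
  for (const [feature, stats] of Object.entries(profile)) {
    const value = current[feature];
    if (value === undefined || stats.stdDev === 0) continue;
    total += Math.abs(value - stats.mean) / stats.stdDev;
    count++;
  }
  return count > 0 ? total / count : 0;
}

// Example: the stored profile says this user taps lightly and holds the phone
// nearly upright, but the current session behaves quite differently - even
// though the login, password and device all appear to be correct.
const profile: Profile = {
  touchForce: { mean: 0.35, stdDev: 0.05 },
  tiltGamma: { mean: -4, stdDev: 3 },
  keyDwellMs: { mean: 95, stdDev: 20 },
};
const currentSession = { touchForce: 0.8, tiltGamma: 25, keyDwellMs: 220 };

const score = riskScore(profile, currentSession);
console.log(score > 3 ? "challenge or step-up authentication" : "allow"); // threshold is arbitrary
```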

Unlike the physical biometric factors mentioned above, the behavioural signals that make up a behavioural biometric profile aren’t stored and can’t be stolen, duplicated or reused – so they have no value to criminals. This kind of data collection is frictionless for the user; they do not have to enter, enroll in or provide any additional information to a website or application to benefit from its protection. They keep doing what they are used to doing: interacting with the sites and services as they always have. A truly seamless experience.

In an era where even our physical biometric data can be stolen, there’s no question that we need more secure authentication methods that are not subject to the scourge of hacking and theft. Physical biometrics will continue to have a place in face-to-face identity verification, and they seem like a good idea for online authentication – until you realize that they can be digitally stolen and re-used fraudulently, leaving the owner of that biometric with no recourse. Fortunately, behavioural biometrics has emerged as a reliable alternative for online user authentication. Data collection is non-invasive and the data cannot be faked, creating an authentication process that reduces risk for both the company and genuine users.

DISCLAIMER: BiometricUpdate.com blogs are submitted content. The views expressed in this blog are those of the author and don’t necessarily reflect the views of BiometricUpdate.com.
