
id3 fingerprint algorithms rank high in NIST tests


id3 Technologies has been ranked second and third in the Ongoing MINEX tests, a result that demonstrates the high performance of id3's fingerprint matching algorithms.

Run under the auspices of the National Institute of Standards and Technology (NIST), the Minutiae Interoperability Exchange (MINEX) test is an ongoing evaluation of the ANSI-INCITS 378 fingerprint template.

ANSI-INCITS 378 is the standard format used to store fingerprint minutiae for automatic recognition and interoperability. Templates are tested for matching accuracy, false acceptance, and false rejection against a large sample of fingerprints.
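To make the error-rate terminology concrete, here is a minimal sketch of how false acceptance rate (FAR) and false rejection rate (FRR) are computed from a matcher's comparison scores at a given decision threshold. The score lists and threshold below are made-up illustrative values, not MINEX data or id3's actual scoring scale.

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """Return (FAR, FRR) at a given decision threshold.

    A comparison is accepted when its score meets the threshold, so an
    impostor score >= threshold is a false acceptance, and a genuine
    score < threshold is a false rejection.
    """
    false_accepts = sum(s >= threshold for s in impostor_scores)
    false_rejects = sum(s < threshold for s in genuine_scores)
    far = false_accepts / len(impostor_scores)
    frr = false_rejects / len(genuine_scores)
    return far, frr

# Hypothetical similarity scores: same-finger vs. different-finger comparisons.
genuine = [0.91, 0.85, 0.78, 0.40, 0.95]
impostor = [0.10, 0.35, 0.58, 0.20, 0.05]

far, frr = far_frr(genuine, impostor, threshold=0.55)
print(f"FAR={far:.2f} FRR={frr:.2f}")  # FAR=0.20 FRR=0.20
```

Raising the threshold trades false acceptances for false rejections, which is why evaluations like MINEX report performance across operating points rather than at a single threshold.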

The test program provides measurements of performance and interoperability of core template encoding and matching capabilities to users, vendors and interested parties, and establishes compliance for template encoders and matchers for the United States Government’s Personal Identity Verification (PIV) program.

The Ongoing MINEX program evaluates template encoding and matching software submitted to NIST in the form of a software development kit (SDK) library.

Jean-Louis Revol, CEO and co-founder of id3 Technologies, said: “id3's ranking in the Ongoing MINEX tests demonstrates the high level of performance delivered by our algorithms. This performance level is the result of our long-standing experience and broad range of biometric applications.”

The high ranking means that id3's algorithms showed a high level of accuracy, along with low false acceptance and false rejection rates. Products using id3 algorithms can readily pass certification for the U.S. market, including FIPS-201 for the federal Personal Identity Verification (PIV) program, and for other markets, since id3 algorithms follow the standardized ISO/IEC 19794-2 interchange format for finger minutiae.


