Sighthound boasts LFW benchmark results for its facial recognition software

December 17, 2015

Sighthound, Inc. announced that its facial recognition system ranks first – with an accuracy of 99.79% – against the Labeled Faces in the Wild (LFW) benchmark database hosted by the University of Massachusetts.

Previous highest accuracy scores include Google at 99.63% and Baidu at 99.77%.

The company says that its facial recognition system stands out by training on less than 2% of the data Google used, and by evaluating only one crop per image. Other systems on the LFW results list use up to 25 crops of the same image, each crop showing a different part of the face, an approach that slows down or rules out real-world applications.
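To see why crop count matters for deployment, the sketch below counts network forward passes under the two evaluation styles. It is a minimal illustration, not Sighthound's code: `embed` is a hypothetical stand-in for a deep network's forward pass, returning a dummy feature vector so the example runs.

```python
import numpy as np

# Track how many (hypothetical) network forward passes each strategy costs.
calls = {"forward_passes": 0}

def embed(crop):
    """Hypothetical stand-in for one deep-network forward pass."""
    calls["forward_passes"] += 1
    # Deterministic dummy feature vector so the sketch is runnable.
    return np.ones(128) * float(np.mean(crop))

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def single_crop_score(img_a, img_b):
    # One forward pass per image: 2 passes per comparison.
    return cosine(embed(img_a), embed(img_b))

def multi_crop_score(crops_a, crops_b):
    # One pass per crop: with 25 crops per image, 50 passes per comparison,
    # before the per-crop features can even be averaged.
    a = np.mean([embed(c) for c in crops_a], axis=0)
    b = np.mean([embed(c) for c in crops_b], axis=0)
    return cosine(a, b)
```

With a real network, each forward pass costs milliseconds on a GPU, so a 25-crop protocol multiplies per-comparison latency by roughly 25x, which is the real-time penalty the article describes.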

The company also claims that its algorithms perform well on both verification tasks, such as LFW, and identification tasks, such as tests against the PubFig200 database, yielding highly accurate software that suits a variety of real-world use cases and runs in real time.
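The distinction between the two tasks can be sketched in a few lines. Verification is a 1:1 question (are these two faces the same person?), while identification is a 1:N search against a gallery of known identities. The embeddings, names, and threshold below are illustrative assumptions, not anything from Sighthound's system.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(emb_a, emb_b, threshold=0.5):
    """1:1 verification (LFW-style): same person or not?"""
    return cosine_similarity(emb_a, emb_b) >= threshold

def identify(query_emb, gallery):
    """1:N identification (PubFig-style): best match in a gallery.

    `gallery` maps identity name -> enrolled embedding.
    """
    best_name, best_score = None, -1.0
    for name, emb in gallery.items():
        score = cosine_similarity(query_emb, emb)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score
```

A system tuned only for pairwise verification can still rank poorly in a large gallery, which is why the article treats strong results on both task types as a separate claim.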

“We have spent 3 years quietly working on the commercial viability of deep learning and computer vision,” said Sighthound CEO, Stephen Neish. “These results are testament to the caliber of PhDs we hire and our focus on real-time, real world uses of our research. In this case we were forced to build more intelligent deeply learned networks because we don’t have access to the amount of data that Google or Baidu have.”

Sighthound is adding the facial recognition system to the set of developer APIs freely available on Sighthound Cloud, joining a range of products built on Sighthound's computer vision algorithms, including face detection, person detection, gender recognition and facial landmark identification.


About Stephen Mayhew

Stephen Mayhew is the publisher and co-founder of Biometrics Research Group, Inc. His experience includes a mix of entrepreneurship, brand development and publishing. Stephen attended Carleton University and lives in Toronto, Canada. Connect with Stephen on LinkedIn.