Biometrics can help smartphones work better and spot neurological disease
Biometric tools are getting a lot more information from a person’s eyes than just iris patterns. New systems can tell how a user is holding their phone and can screen for conditions like ADHD.
Japanese researchers say they have found a way to automatically detect how someone is holding a phone and adjust the position of content on the phone’s screen accordingly. The biometric algorithm shifts the content, making it accessible to the user’s fingers and thumbs.
The pupil becomes a part of a device’s human-machine interface.
The selfie camera captures the reflection of the phone’s screen in the user’s eyes; fingers gripping the device appear as shadows on that rectangular reflection. The researchers say the app, called ReflecTouch, is 85 percent accurate at recognizing which of six hand postures a person is using to hold a phone.
Other than loading the biometric software, nothing needs to be added to the phone for it to work, according to a team of scientists at Tokyo University of Technology, Keio University and Yahoo Japan. Yahoo is sponsoring this week’s Conference on Human Factors in Computing Systems at which the scientists are presenting their work.
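The published details stop at the reflection-capture idea, but the grip-recognition step could be sketched as a standard image-classification problem. The snippet below is purely illustrative: the posture labels, crop size, and nearest-centroid rule are assumptions, not ReflecTouch’s actual model.

```python
import numpy as np

# Hypothetical posture labels -- the paper's six classes are not listed here.
POSTURES = ["left-thumb", "right-thumb", "two-thumbs",
            "left-index", "right-index", "two-hands"]

def classify_grip(crop: np.ndarray, centroids: np.ndarray) -> str:
    """Assign an eye-reflection crop to the nearest posture template.

    crop: small grayscale image of the screen's corneal reflection.
    centroids: one flattened template vector per posture class.
    """
    v = crop.astype(float).ravel()
    dists = np.linalg.norm(centroids - v, axis=1)
    return POSTURES[int(np.argmin(dists))]

# Toy demo with synthetic 8x8 "reflection crops".
rng = np.random.default_rng(0)
centroids = rng.random((6, 64))                   # one template per posture
sample = centroids[2] + rng.normal(0, 0.01, 64)   # near the "two-thumbs" template
print(classify_grip(sample.reshape(8, 8), centroids))  # → two-thumbs
```

A production system would train a proper classifier on labeled reflection images rather than use fixed templates, but the input/output contract — eye crop in, one of six postures out — stays the same.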
Meanwhile, University of California, San Diego, researchers say they have written an app people can use to screen themselves for neurological maladies including ADHD and Alzheimer’s disease. The work was done in the university’s Jacobs School of Engineering.
The scientists presented at the conference as well. They bill their work as at-home pupillometry using a phone’s near-infrared facial-recognition camera and color selfie camera. Together, the two cameras track absolute pupil dilation with sub-millimeter accuracy.
The mean error rate for tracking pupil dilation was 3.52 percent, according to the researchers’ paper. Pupil dilation is a standard measure of cognitive processing and is also used in psychological research.
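The paper does not spell out the error formula here, but a mean percentage error like the reported figure is conventionally computed by comparing per-frame measurements against a ground-truth pupillometer. A minimal sketch, with made-up diameters in millimeters:

```python
def mean_error_pct(measured, ground_truth):
    """Mean absolute error of measured pupil diameters, expressed as a
    percentage of the ground-truth diameter (all values in millimeters)."""
    errs = [abs(m - g) / g * 100.0 for m, g in zip(measured, ground_truth)]
    return sum(errs) / len(errs)

# Hypothetical session: phone-measured vs. reference pupil diameters (mm).
measured = [3.9, 4.1, 4.25, 3.8]
truth    = [4.0, 4.0, 4.20, 4.0]
print(round(mean_error_pct(measured, truth), 2))  # → 2.8
```

On a real dataset the same calculation, averaged across many frames and participants, would yield a single headline number comparable to the 3.52 percent reported.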
biometrics | human-machine interface | mobile app | research and development | smartphones