Researchers develop authentication method based on lip motion
Researchers at Hong Kong Baptist University have developed a “lip motion password”, an authentication method that uses the motion of an individual’s lips as a password, according to a report by TechRadar.
The technology, which was granted a US patent in 2015, takes into account a person’s lip shape, texture, movement and sound to determine their identity.
“The same password spoken by two persons is different and a learning system can distinguish them,” Cheung Yiu-ming, who led the research, said.
In their study, Cheung and his team collected and examined lip-sequence samples to train the computational models and to determine the threshold for accepting or rejecting a spoken password.
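The report does not describe the team’s actual models, but the accept/reject threshold idea is standard in biometric verification: a login attempt is scored against an enrolled template, and the score must clear a tuned threshold. The sketch below is purely illustrative, using cosine similarity over toy feature vectors standing in for lip-motion descriptors; the function names and threshold value are assumptions, not the researchers’ method.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two feature vectors, in [-1, 1].
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(enrolled, attempt, threshold=0.9):
    # Accept only if the attempt is similar enough to the enrolled
    # template. Raising the threshold reduces false accepts at the
    # cost of more false rejects, and vice versa.
    return cosine_similarity(enrolled, attempt) >= threshold

# Toy vectors; a real system would extract these from lip video/audio.
enrolled = [0.8, 0.1, 0.5, 0.2]
genuine  = [0.79, 0.12, 0.5, 0.21]
impostor = [0.1, 0.9, 0.2, 0.7]

print(verify(enrolled, genuine))   # True: close to the template
print(verify(enrolled, impostor))  # False: falls below the threshold
```

In practice the threshold would be chosen from the collected samples, typically by balancing false-accept and false-reject rates on held-out genuine and impostor attempts.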
A lip password offers several benefits over conventional security measures: it is harder to duplicate because it combines motion and content, is less vulnerable to background noise than voice authentication, can be rapidly changed or reset, and works regardless of the speaker’s language.
According to Cheung, potential applications include authenticating financial transactions on mobile devices, ATM transactions and credit card use.
The authentication method could also be used to reinforce security access systems on private property, especially if combined with other techniques, such as face recognition.