Google Pixel 4 to feature 3D facial biometrics for unlocking, payment and apps

The forthcoming flagship smartphone from Google, the Pixel 4, will feature 3D biometric facial recognition unlocking, the company has announced.

Like Face ID, Google’s facial recognition system features a dot projector and a flood illuminator, along with dual infrared cameras for face unlocking. The Pixel 4 will also feature radar-based gesture recognition, or “Motion Sense,” enabled by technology from Project Soli. Motion Sense will not be available in some of the countries where the Pixel 4 is sold.

In a blog post and video which cap months of leaks and speculation, Google provides details of the phone’s biometric system and other key features. The Soli technology allows the phone to turn on its face unlock sensors as the user picks it up, and to then unlock “all in one motion,” according to the post. 3D facial biometrics can also be used for user authentication for payments and app logins, as supported by Android Q. The biometric feature also works in any orientation, including upside down.

One of the IR cameras that flank the top bezel set-up captures 460 pixels, while the other captures 505, and together they will likely be used to gather parallax-based depth data. User facial data will be stored on-device in the Titan M security chip.
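The parallax-based approach described above relies on the standard stereo triangulation relation: a feature that appears shifted between the two cameras yields a disparity, and depth falls out as focal length times camera baseline divided by that disparity. A minimal sketch of the idea, where every numeric value is an illustrative assumption rather than an actual Pixel 4 specification:

```python
# Illustrative sketch of parallax-based depth estimation from two
# horizontally offset cameras. All parameter values are assumptions
# for demonstration only -- not actual Pixel 4 hardware figures.

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Triangulate depth (in meters) from stereo disparity (in pixels).

    A point seen at horizontal pixel x_left in one camera and x_right
    in the other has disparity d = x_left - x_right; its depth is then
    z = f * B / d, where f is the focal length in pixels and B is the
    distance between the two cameras.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Example: hypothetical IR cameras with a 1000 px focal length,
# mounted 25 mm apart, observing a facial feature with 50 px of disparity.
z = depth_from_disparity(50, 1000.0, 0.025)
print(round(z, 3))  # 0.5 (meters)
```

Repeating this calculation for every matched feature across the two images produces the kind of depth map the patent application discussed below describes.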

The U.S. Patent and Trademark Office published a patent application from Google for 3D facial recognition last month, Patently Apple reports. The patent describes the use of reflected electromagnetic radiation at a certain wavelength, such as near-infrared light, captured in stereo by multiple sensors and combined into a depth map. The patent was originally filed in December 2017.

An early leak of Android Q code also suggested the OS would natively support hardware for 3D facial biometrics.

Gesture recognition can be used to skip songs, silence phone calls, and perform other actions, with more capabilities to be added over time.

Search Engine Journal notes that a Google patent filed several years ago described a system for applying user emotion as an input to search engine rankings. A new patent application for “Graphical image retrieval based on emotional state of a user of a computing device” has now been filed with the USPTO by the company, describing the biometric analysis of users’ facial features to determine their emotional state.

“Examples of emotion classification tags include, for instance, eye shape, mouth opening size, nostril shape, eyebrow position, and other facial features and attributes that might change based on human emotion,” the filing says. “The computing device may then identify one or more graphical images with an emotional classification (also referred to herein as “a human emotion”) that is associated with the one or more emotion classification tags.”

The feature could be added to Pixel phones, Search Engine Journal suggests, although it is not mentioned in the brief blog post from Google.

The Pixel 4 is expected to be launched commercially in the fall.
