Telling your phone you want the rating on this restaurant
A trio of researchers has proposed using a smartphone user’s head as a cursor to sharpen often spotty voice queries. Their software, WorldGaze, combines voice, facial and object recognition, tied into an iPhone’s front and rear cameras.
In a research paper, two researchers from Carnegie Mellon University and one from Apple Inc. describe how spoken searches are typically no more precise than queries typed into a search engine.
Saying, “Does El Toro Taqueria deliver?” into a phone, even if the user is standing in front of the restaurant, usually elicits only general information from the internet.
That is because, despite some engineering effort and much more marketing messaging, even GPS-enabled mobile applications have too little context to give highly relevant responses.
Sven Mayer and Chris Harrison of Carnegie Mellon’s Human-Computer Interaction Institute, and Gierad Laput of Apple, have devised a way to draw an imaginary line from between a phone user’s eyes, as seen by the front camera, out into the world captured by the rear lens.
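The paper does not publish code, but the geometry is simple enough to sketch. On iOS 13 and later, ARKit can run rear-camera world tracking and front-camera face tracking at the same time, which places the front camera’s face anchor in the same coordinate space as the scene behind the phone. A rough Swift sketch of turning the head pose into a gaze ray could look like this; the class and method names here are illustrative, not the researchers’:

```swift
import ARKit
import simd

// Rough sketch, not the researchers' code. Assumes iOS 13+, where ARKit can
// run rear-camera world tracking and front-camera face tracking together.
final class HeadGazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsUserFaceTracking else { return }
        let config = ARWorldTrackingConfiguration()
        config.userFaceTrackingEnabled = true  // front camera tracks the head
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // The face anchor is expressed in world coordinates, so the axis
            // pointing out of the face (+Z of its transform) already defines
            // a ray into the scene the rear camera sees.
            let t = face.transform
            let origin = SIMD3<Float>(t.columns.3.x, t.columns.3.y, t.columns.3.z)
            let forward = simd_normalize(SIMD3<Float>(t.columns.2.x,
                                                      t.columns.2.y,
                                                      t.columns.2.z))
            handleGazeRay(origin: origin, direction: forward)
        }
    }

    func handleGazeRay(origin: SIMD3<Float>, direction: SIMD3<Float>) {
        // Hand the ray to object matching and the on-screen overlay.
    }
}
```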
WorldGaze, which its inventors say could run on any phone whose front and rear cameras can operate simultaneously, hears the question, notes where the user is looking and makes the connection. As a reference, the app paints a red ray on the screen so the user can confirm it is focused where they are looking.
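Matching that ray to what the rear camera recognizes, and ending the red reference ray on the right object, is then a projection problem. A hedged sketch of one way that matching could work, where `DetectedObject` is a hypothetical stand-in for whatever the object recognizer actually reports:

```swift
import ARKit
import UIKit

// Sketch only: `DetectedObject` is a stand-in for the output of a rear-camera
// object recognizer (e.g. labeled bounding boxes in view coordinates).
struct DetectedObject {
    let label: String
    let screenBox: CGRect
}

/// Projects a point along the head-gaze ray into the rear camera's view and
/// returns the recognized object, if any, that the ray lands on.
func objectUnderGaze(frame: ARFrame,
                     origin: SIMD3<Float>,
                     direction: SIMD3<Float>,
                     objects: [DetectedObject],
                     viewportSize: CGSize) -> DetectedObject? {
    // Sample a point a few meters out along the ray and project it to 2D.
    let worldPoint = origin + direction * 3
    let screenPoint = frame.camera.projectPoint(worldPoint,
                                                orientation: .portrait,
                                                viewportSize: viewportSize)
    // The red on-screen ray would be drawn toward `screenPoint`; here we
    // simply pick the bounding box the ray terminates in.
    return objects.first { $0.screenBox.contains(screenPoint) }
}
```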
The app can work indoors as well.
A video produced by the team shows a woman in a digitally connected conference room adjusting the lighting and thermostat and navigating a streaming service’s menu displayed on a screen. It also demonstrates how the software might work when comparison shopping, in this case for furniture.
Apple is among several companies that have recently filed or published patent applications for gaze-detection systems running on consumer electronics.