Microsoft mimics hand motion and sensation for virtual reality systems
Microsoft researchers are looking at a number of ways in which technology can start to recognize detailed hand motion.
The software giant will demonstrate an example of its new 3-D hand tracking research at SIGGRAPH 2016, an international conference that focuses on computer graphics and interactive techniques, later this summer.
Titled “Handpose”, the gesture-based control system is designed to let people interact “easily and intuitively” with virtual reality systems. Microsoft’s research team is working to create a precise and accurate system that uses minimal processing power to mimic tactile experiences in VR environments.
At the SIGGRAPH event, researchers will discuss their findings on virtual controls thin enough that users can touch their fingers together, creating the sensation of touching something solid. They will also talk about sensory experiences in which users push against something soft and pliant rather than hard and unforgiving, which appears to feel more authentic.
In a Microsoft blog post, Jamie Shotton, a principal researcher in computer vision at Microsoft’s research lab in Cambridge, said: “How do we interact with things in the real world? Well, we pick them up, we touch them with our fingers, we manipulate them. We should be able to do exactly the same thing with virtual objects. We should be able to reach out and touch them.”
The firm says one of its key long-term goals is to deliver more personal computing experiences by creating technology that adapts to how people move, speak and see, rather than asking people to adapt to how computers work. Handpose will be a key component of that effort.