New Microsoft hand tracking technology could change how designers work

Microsoft's new Handpose platform, currently in its demo phase, would enable designers to work digitally through hand gestures.

Microsoft researchers are engineering a gestural user interface technology that may enable designers to create everyday designs through simple hand motions.

The new demo of the platform, called Handpose, recognises detailed hand movements and allows users to manipulate technology through gestures. Microsoft says the tracking system is more accurate than its predecessors and requires minimal processing power, giving it potential for use in computers, tablets and VR headsets.

“If we can make vision work reliably, speech work reliably and gesture work reliably, then people designing things like TVs, coffee machines or any of the Internet of Things gadgets will have a range of interaction possibilities,” says Andrew Fitzgibbon, a principal researcher with the computer vision group at the UK lab.

The system offers a solution to problems such as occlusion and noisy data, which have hampered hand-tracking devices since 1993. The computer scientists and engineers behind the project claim they are close to advancing hand and gesture recognition tools enough for mainstream use. Its widespread usability would signal a turning point for interactive and responsive design.

Visible hands

“How do we interact with things in the real world? Well, we pick them up, we touch them with our fingers, we manipulate them,” says Jamie Shotton, a principal researcher in computer vision at Microsoft’s Cambridge research lab. “We should be able to do exactly the same thing with virtual objects. We should be able to reach out and touch them.”

Still in its research phase, the system tracks subtle hand motions and allows the user to perform tasks such as poking a stuffed bunny, pressing buttons or moving a slider while observing their movements through a virtual reality headset or monitor. Handpose lets users see what their hands are doing, addressing a common disconnect that occurs when VR users can't see their own hands, which often results in nausea.

For years researchers have faced difficulties in tracking hands because they are small and complex. Gestures such as rotating the wrist or forming a fist can cause the subject's fingers to disappear from view and confuse the software.

The solution

The new system creates a smooth surface model of the hand and fits it to the data points captured by the camera. It then uses an algorithm developed in the 1940s to calculate where the tracking points should move next, producing a fluid image of the gesture.
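
To give a flavour of that kind of model fitting, here is a minimal sketch, not Microsoft's code. The article does not name the 1940s algorithm; Levenberg's damped least-squares method of 1944, the basis of today's Levenberg-Marquardt solvers, fits the description, so this sketch assumes that style of solver and fits a toy point model to noisy observations. The real system fits a full smooth hand-surface model to depth-camera data, and every function and variable name below is illustrative.

import numpy as np

def transform(params, template):
    """Apply pose parameters (angle, tx, ty) to the template points."""
    theta, tx, ty = params
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return template @ rot.T + np.array([tx, ty])

def residuals(params, template, observed):
    """Per-point error between the posed model and the observed data."""
    return (transform(params, template) - observed).ravel()

def fit_pose(params, template, observed, iters=50, lam=1e-3):
    """Levenberg-style damped least squares: a blend of Newton and gradient steps."""
    for _ in range(iters):
        r = residuals(params, template, observed)
        # Numerical Jacobian of the residuals w.r.t. the pose parameters.
        jac = np.empty((r.size, params.size))
        eps = 1e-6
        for j in range(params.size):
            p = params.copy()
            p[j] += eps
            jac[:, j] = (residuals(p, template, observed) - r) / eps
        # Damped normal equations: (J^T J + lam * I) step = -J^T r.
        step = np.linalg.solve(jac.T @ jac + lam * np.eye(params.size),
                               -jac.T @ r)
        candidate = params + step
        if np.sum(residuals(candidate, template, observed) ** 2) < np.sum(r ** 2):
            params, lam = candidate, lam * 0.5  # good step: trust the model more
        else:
            lam *= 2.0                          # bad step: damp harder and retry
    return params

# Toy data: a point "template" observed under an unknown pose, plus sensor noise.
rng = np.random.default_rng(0)
template = rng.normal(size=(20, 2))
true_pose = np.array([0.7, 1.5, -0.5])
observed = transform(true_pose, template) + rng.normal(scale=0.01, size=(20, 2))
print(fit_pose(np.zeros(3), template, observed))  # converges to roughly true_pose

The damping factor is what keeps the fit stable: when a step improves the match, the solver trusts its update and reduces the damping; when it doesn't, it raises the damping and retries, so the estimated pose never jumps wildly between iterations.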

The team of researchers behind Handpose were surprised to find that a lack of haptics – technology that mimics the sensation of touching a real object – didn’t vastly affect the usability of the system.

That is due, in part, to the company’s design of the virtual world. The researchers created virtual controls that are thin enough for users to touch their fingers together to imitate the sensation of touching something physical. They also developed sensory experiences that varied in resistance and authenticity.
