Interacting with gestures

In the last two chapters, we explored the air-tap and manipulation gestures, including tracking the user's hands with InteractionManager. In this section, we will introduce the navigation gesture, revisit the manipulation gesture to give the user full control over the robot hand via the inverse kinematics target, and revisit hand tracking to allow the user to "push" the robot arm around.

Like the manipulation gesture, the navigation gesture becomes active when a hold state is detected (when the user performs an air-tap but keeps their finger down); it differs in that it can be locked to a specific axis or axes. Its result is normalized between -1.0 and 1.0, based on the offset of the user's hand from the position that ...
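To make the normalization concrete, the following is a minimal sketch of how a per-axis navigation value could be derived from the hand's displacement since the hold began. The function name, the `extent` parameter (the maximum hand travel mapped to 1.0), and the `axes` locking tuple are all hypothetical illustrations, not the platform's actual API, which reports the normalized offset for you.

```python
def navigation_offset(start, current, extent=0.25, axes=(True, True, True)):
    """Map the hand's displacement from the hold-start position into
    [-1.0, 1.0] per axis; axes disabled in `axes` are locked to 0.0.

    `extent` is a hypothetical maximum travel distance (in meters)
    that maps to a full-scale reading of 1.0.
    """
    result = []
    for s, c, enabled in zip(start, current, axes):
        if not enabled:
            # Locked axis: always report zero regardless of movement.
            result.append(0.0)
            continue
        # Normalize the displacement, then clamp to [-1.0, 1.0].
        v = (c - s) / extent
        result.append(max(-1.0, min(1.0, v)))
    return tuple(result)
```

For example, locking navigation to the x-axis means vertical hand movement is ignored: `navigation_offset((0, 0, 0), (0.125, 0.5, 0), axes=(True, False, False))` yields `(0.5, 0.0, 0.0)`.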
