Chapter 8. Robotics and Natural User Interfaces
This chapter is all about interacting with machines. Whether it’s your own PC or an autonomous robot, the Kinect is well suited to interpreting your gestures and applying them to a particular interface. We’ll touch on controlling a Mac by dragging and exposing windows, using a wide variety of gestures to control everything from web browsing to picture viewing on your Windows PC, and controlling your mouse cursor with the Kinect under Ubuntu.
One of the most compelling and futuristic Kinect applications is using it to control autonomous robots. Practical uses for this include being able to control the movement of the robot’s arms and legs from a remote location. Imagine being able to send a robot into a small area and control it remotely by walking on a treadmill and moving your arms.
This is all possible thanks to Taylor Veltrop’s amazing work with the Kinect along with his NAO robot avatar. In this chapter, you’ll be introduced to some coding examples applied to robotics using OpenNI and NITE along with the Kinematics and Dynamics Library (KDL) and the Robot Operating System (ROS).
Control a Robotic Arm
This hack explains how to control a robotic arm with four degrees of freedom through the Kinect. It requires OpenNI and NITE; NITE provides skeleton tracking of the Kinect user. It’s really just a simple matter of plugging NITE’s skeleton API into the joint control API of your robot, with a little bit of math in the middle. To do the necessary ...
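The “math in the middle” mostly amounts to converting the 3D joint positions reported by the skeleton tracker into joint angles the robot’s servos understand. The sketch below, in Python, shows one common approach: computing the angle at a joint (for example, the elbow) from the positions of the three surrounding skeleton joints. The function name and the example coordinates are illustrative, not part of the NITE API; in a real program the positions would come from NITE’s skeleton-tracking calls.

```python
import math

def joint_angle(a, b, c):
    """Return the angle (in radians) at joint b, formed by the segments
    b->a and b->c. Each argument is an (x, y, z) position, e.g. the
    shoulder, elbow, and hand joints reported by a skeleton tracker."""
    v1 = tuple(ai - bi for ai, bi in zip(a, b))
    v2 = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to guard against floating-point drift just outside [-1, 1].
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.acos(cos_angle)

# A right-angle elbow bend: upper arm along x, forearm along y.
shoulder, elbow, hand = (0, 0, 0), (1, 0, 0), (1, 1, 0)
print(math.degrees(joint_angle(shoulder, elbow, hand)))  # 90.0
```

The resulting angle (here the elbow flexion) can then be mapped, after any offset and scaling your servos require, onto one of the arm’s four degrees of freedom; the remaining degrees of freedom (shoulder pitch, roll, and yaw) can be derived the same way from other joint triples.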