In this chapter we discuss two important problems in robotic manipulation. The first
problem deals with manipulation in an uncalibrated environment. The second problem deals
with motion planning with multisensor fusion. A third related problem that is not the main
emphasis of this chapter is servoing. Of course, in each of these problems, "vision" plays an
important role. This chapter emphasizes that in many instances, vision alone is not sufficient,
and one has to combine visual information with one or more additional sensory inputs. This
leads to many multisensor fusion-based algorithms, which are discussed in this chapter.
Before we elaborate on these algorithms, we first make a few background remarks, some of them historical.
Control of robot manipulators with vision in the feedback loop has an exciting history
starting with the pioneering work of Hill and Park [1] and Weiss, Sanderson, and Neuman
[2]. Subsequent work in this area has focused on visual servoing, wherein the emphasis is on
visually locating the position and orientation of a part and controlling a robot manipulator
to grasp and manipulate the part. If the part is not stationary, then the process of locating
the part and repositioning the robot must be performed by utilizing feedback control, which
has been studied in [3-7]. Using vision in the feedback loop has many advantages over the
more direct "look and go" approach. Some of the advantages are that a visually guided robot
is more flexible and robust and has the potential to perform satisfactorily even under
structural uncertainty. This is evidenced by the "controlled active vision" scheme introduced
by Papanikolopoulos et al. [8], where the goal is to accomplish a task in spite of
environmental and target-related unknown and possibly changing factors. Other instances of
visual guidance are evidenced by the work of Allen et al. [3], where the objective is to
grasp a toy train undergoing planar circular motion. The position of the train is observed
visually, and the orientation is automatically specified by the position.
The concept of multisensor fusion is to combine data from multiple sensors to obtain
inferences that may not be possible from a single sensor alone [9]. Without detailing
why a single sensor, for example the visual sensor, cannot be used reliably for all of the
different tasks that we propose to perform, we note that the main purpose of using
multisensor fusion here is to compensate for the limitations of individual sensors in
computational speed and accuracy. The vision system we use is neither fast nor accurate;
hence the need for "sensor fusion."
There are many other multisensor fusion schemes in the literature [10-21]. For example,
Allen and Bajcsy [17] used stereo edges to match objects to a fixed world model and then
adopted a tactile sensor to investigate the occluded parts. Flynn [18] combined a sonar
sensor and an infrared proximity sensor in order to reduce errors inherent in both sensor
domains. The authors of [19] presented a method for combining data from intensity and range
sensors. Algorithms based on fusing static thermal and visual images obtained from outdoor
scenes have been reported by Nandhakumar and Aggarwal [20] and by Mitiche and Aggarwal [21].
As opposed to multisensor fusion-based servoing, where the sensory information automatically
generates the feedback control, in this chapter we propose multisensor fusion-based planning,
where the sensory information automatically feeds the planner. The motion planning schedule
is generated autonomously as a result, and the robot controller simply follows the motion
plan. This simplifies the problem of controller synthesis while relieving the computational
burden on the controller. On the other hand, the planner acquires additional structure,
since it now receives (multi)sensory input. Because "planning" is not performed in real
time, it suffices to use a planner with somewhat slower computational capability.
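The data flow just described (sensors feed the planner; the controller merely tracks the resulting plan) can be sketched as follows. This is a toy illustration under assumed names; the planner here produces only a straight-line waypoint sequence and ignores its sensor input, whereas a real planner would use the fused estimates to avoid obstacles.

```python
def plan_path(fused_obstacle_estimates, start, goal, steps=5):
    """Toy planner: computes a waypoint plan once, offline.

    Illustrates the architecture sensors -> fusion -> planner -> controller;
    the fused estimates would normally shape the path, but are unused here.
    """
    return [
        (start[0] + (goal[0] - start[0]) * i / steps,
         start[1] + (goal[1] - start[1]) * i / steps)
        for i in range(steps + 1)
    ]

def follow_plan(plan):
    """Toy controller: simply visits each waypoint of the precomputed plan."""
    return [waypoint for waypoint in plan]

plan = plan_path(fused_obstacle_estimates=[], start=(0.0, 0.0), goal=(1.0, 2.0))
trajectory = follow_plan(plan)
```

The point of the sketch is the division of labor: all sensory reasoning lives in `plan_path`, so `follow_plan` (the controller) needs no sensor models at all, which is exactly the simplification of controller synthesis claimed above.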
It has long been recognized that sensor-based control is an important issue in robotics.