FIGURE 12.20
[Plot: value (0 to 16000) versus iteration (0 to 500).]
$x_1^T W_1 x_1 + x_2^T W_2 x_2$. The reduction of this value represents the reduction of bias. The parametric update algorithm is applied in such a way that the two ellipsoids intersect. The following are the governing equations for updating the parameters:

$$P = \frac{1}{2}\left(x_1^T W_1 x_1 + x_2^T W_2 x_2\right) \qquad (12.35)$$

$$J = \frac{\partial P}{\partial \theta} \qquad (12.36)$$

$$\hat{\theta}(k+1) = \hat{\theta}(k) - J^{+}(P - P^{*}) \qquad (12.37)$$

If the value of $x_1^T W_1 x_1 + x_2^T W_2 x_2$ is less than or equal to 2, then we assume that there is
no bias effect and no update method is applied. Otherwise, an update method is applied,
because we can assume that there are possible error sources other than noise.
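A minimal sketch of this update loop, assuming linear residuals $x_i(\theta) = A_i\theta - b_i$ and a numerical Jacobian; the linear residual model and function names are illustrative assumptions, not the chapter's implementation:

```python
import numpy as np

def residual_value(theta, A1, b1, W1, A2, b2, W2):
    """x1' W1 x1 + x2' W2 x2 for (assumed) linear residuals x_i = A_i theta - b_i."""
    x1 = A1 @ theta - b1
    x2 = A2 @ theta - b2
    return x1 @ W1 @ x1 + x2 @ W2 @ x2

def parametric_update(theta, A1, b1, W1, A2, b2, W2,
                      P_star=0.0, max_iter=200, eps=1e-6):
    """Iterate theta(k+1) = theta(k) - J^+ (P - P*), with J = dP/dtheta."""
    for _ in range(max_iter):
        v = residual_value(theta, A1, b1, W1, A2, b2, W2)
        if v <= 2.0:              # value <= 2: assume no bias effect, stop updating
            break
        P = 0.5 * v               # Eq. (12.35)
        # Numerical (forward-difference) Jacobian J = dP/dtheta, Eq. (12.36)
        J = np.zeros_like(theta)
        for i in range(theta.size):
            d = np.zeros_like(theta)
            d[i] = eps
            J[i] = (0.5 * residual_value(theta + d, A1, b1, W1, A2, b2, W2) - P) / eps
        # Pseudoinverse update, Eq. (12.37)
        theta = theta - np.linalg.pinv(J[None, :]) @ np.array([P - P_star])
    return theta
```

Because $P$ is scalar, $J$ is a row vector and $J^{+}$ reduces to $J^T/(JJ^T)$, so each step is a gradient step scaled to drive $P$ toward $P^{*}$.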
The parametric update method is applied, and the results are shown in Figures 12.19 and
12.20. As shown in Figure 12.19, the error is reduced and the biased parameters converge
to the true values under the parametric update method.
7 EXPERIMENTATION
We applied the perception net-based self-calibration method to the automatic calibration of
the stereo camera mounted on the base, which provides 3-D data for the Mars sampling
manipulator. More specifically, we intend to remove the biases involved, in particular, in the
orientation of the stereo camera with reference to the base frame.

CHAPTER 12 / ROBOTICS WITH PERCEPTION AND ACTION NETS

FIGURE 12.21
Picture of the Lightweight Survivable Rover (LSR-1) and the rover-mounted manipulator, MicroArm-1. A camera is shown in the bottom right corner.

It is known that the 3-D data from the stereo camera are very sensitive to the precise
setting of the camera orientation in terms of the base frame. The capability of a system
to self-calibrate such biases should allow
the system performance to be very robust. Figure 12.15 illustrates the perception net
configured for the self-calibration of stereo camera pose in terms of the base frame. A
sequence of 3-D positions of a single feature point on the manipulator is measured by the
stereo camera as well as by the encoders while the manipulator is in motion.
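One standard way to turn these paired measurements into an orientation correction is a least-squares rotation fit between the encoder-derived and camera-measured point sets; the SVD-based (Kabsch) sketch below is an illustration of that idea, not the perception net's actual update rule:

```python
import numpy as np

def orientation_bias(p_encoder, p_camera):
    """Best-fit rotation R aligning camera-measured feature positions with
    encoder-predicted ones in the least-squares sense (R @ pc_i ~ pe_i).
    Rows of each (N, 3) array are 3-D points collected while the arm moves."""
    pe = p_encoder - p_encoder.mean(axis=0)   # center both point sets
    pc = p_camera - p_camera.mean(axis=0)
    U, _, Vt = np.linalg.svd(pc.T @ pe)       # cross-covariance SVD
    d = np.sign(np.linalg.det(U @ Vt))        # guard against reflections
    D = np.diag([1.0, 1.0, d])
    return (U @ D @ Vt).T                     # R = V D U^T
```

A full correction would also estimate a translation offset; centering both point sets, as above, removes translation from the rotation fit.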
Our experimental platform consisted of the National Aeronautics and Space Administra-
tion (NASA)-Jet Propulsion Laboratory (JPL) Lightweight Survivable Rover (LSR-1) and
rover-mounted manipulator MicroArm-1. Link parameters are shown in Table 12.4. LSR-1
is a six-wheeled, skid-steered, rocker-bogie design, having approximately half the mass (7 kg)
and twice the deployed volume of the Sojourner rover used in the Mars Pathfinder mission.
MicroArm-1 is a 1.5 kg, all-composite five-degrees-of-freedom manipulator arm, 0.7 m at full
extent, driven by piezoelectric ultrasonic motors, possessing a multifunction powered end
effector [25]. A picture of the system is shown in Figure 12.21.
A black-and-white stereo CCD camera pair with 512 × 486 resolution, a 10-cm baseline, and
130° field-of-view lenses (camera parameters shown in Table 12.3), as shown in Figure 12.21,
was mounted on LSR-1 directly (4 cm) beneath MicroArm-1. This camera pair
was calibrated using a least-squares calibration technique, producing a camera model that
attempts to compensate for radial lens distortion. A black calibration grid having a 16 × 13
array of 5-mm-diameter white calibration circles with 1-cm center spacing was presented to
the cameras in both the horizontal and vertical configurations to provide calibration points
for the least-squares camera model estimation [26].
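The least-squares camera model estimation can be illustrated with the classic direct linear transform (DLT), which fits a 3 × 4 projection matrix to 3-D grid points and their pixel projections; this sketch omits the radial-distortion compensation that the actual model includes:

```python
import numpy as np

def dlt_projection_matrix(X, u):
    """Least-squares 3x4 projection matrix P with s*[u, v, 1]^T = P*[X, Y, Z, 1]^T.
    X: (N, 3) calibration-grid points; u: (N, 2) pixel coordinates; N >= 6."""
    rows = []
    for (x, y, z), (px, py) in zip(X, u):
        rows.append([x, y, z, 1, 0, 0, 0, 0, -px*x, -px*y, -px*z, -px])
        rows.append([0, 0, 0, 0, x, y, z, 1, -py*x, -py*y, -py*z, -py])
    A = np.asarray(rows)
    _, _, Vt = np.linalg.svd(A)   # right singular vector of smallest singular value
    return Vt[-1].reshape(3, 4)

def project(P, X):
    """Project (N, 3) world points through P to (N, 2) pixel coordinates."""
    Xh = np.hstack([X, np.ones((len(X), 1))])
    uh = (P @ Xh.T).T
    return uh[:, :2] / uh[:, 2:3]
```

The homogeneous solution is defined only up to scale, which is harmless because the projection divides out the scale factor.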
A VME chassis containing one 68040 processor running VxWorks and two Galil motion
control boards was used to control the LSR-1 rover and the MicroArm-1 robotic manipulator.
A Sun Sparc SLC was used as a terminal to connect to the VME chassis. The Sparc SLC
