June 2018
484 pages
A real robot can carry several sensors to perceive the world, and many nodes can subscribe to that data and process it, but the navigation stack is designed to use only a planar range sensor. The sensor node must therefore publish its data using one of these message types: sensor_msgs/LaserScan or sensor_msgs/PointCloud2.
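To make the expected message layout concrete, here is a minimal sketch of the fields a sensor_msgs/LaserScan carries. This is a plain-Python stand-in (no ROS installation required); the field names mirror the real message definition, while the helper function, its parameters, and the example values are illustrative:

```python
# Sketch of the fields the navigation stack reads from a
# sensor_msgs/LaserScan message (plain-Python stand-in, no ROS needed).
import math

def make_laser_scan(frame_id, angle_min, angle_max, n_readings, range_max):
    """Build a dict with the same layout as sensor_msgs/LaserScan."""
    # Angular step between consecutive beams of the sweep.
    angle_increment = (angle_max - angle_min) / (n_readings - 1)
    return {
        "header": {"frame_id": frame_id},    # TF frame the readings refer to
        "angle_min": angle_min,              # start angle of the sweep [rad]
        "angle_max": angle_max,              # end angle of the sweep [rad]
        "angle_increment": angle_increment,  # step between readings [rad]
        "range_min": 0.10,                   # readings below this are invalid
        "range_max": range_max,              # readings above this are invalid
        "ranges": [range_max] * n_readings,  # one distance per beam [m]
    }

# Example: a 180-degree sweep with a 1-degree step (181 beams).
scan = make_laser_scan("hokuyo_link", -math.pi / 2, math.pi / 2, 181, 30.0)
```

In a real node, the same fields would be filled into a `sensor_msgs/LaserScan` message and published with a `rospy` or `roscpp` publisher.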
We will use the laser mounted on the front of the simulated mobile robot to navigate the Gazebo world. This laser is simulated in Gazebo, and it publishes its data in the hokuyo_link reference frame on the /robot/laser/scan topic. Here, we do not have to configure anything for the navigation stack to use the laser: its TF frame is already set up in the .urdf file, and the laser is publishing data in the correct ...
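For reference, a URDF fragment of the kind that defines the laser's frame looks like the sketch below. The hokuyo_link name matches the frame used above; the joint name, parent link, and origin values are illustrative, and the actual file in the example package will differ:

```xml
<!-- Illustrative URDF fragment: attaches the laser frame to the robot.
     The fixed joint is what lets robot_state_publisher broadcast the
     base_link -> hokuyo_link transform that the navigation stack needs. -->
<link name="hokuyo_link">
  <!-- visual, collision, and inertial elements omitted here -->
</link>
<joint name="laser_joint" type="fixed">
  <!-- Example pose of the laser on the robot body -->
  <origin xyz="0.2 0 0.1" rpy="0 0 0"/>
  <parent link="base_link"/>
  <child link="hokuyo_link"/>
</joint>
```

Because the joint is `fixed`, the transform is static and published automatically, which is why no extra TF configuration is needed for the laser.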