Planetary Rover SLAM
Wang Zhiwei and Yan Yihui
SLAM (Simultaneous Localization and Mapping) is central to a range of indoor, outdoor, aerial, and underwater applications for both manned and autonomous vehicles. Because our rover platform is still a work in progress, our project aims to implement odometry on the rover, a crucial building block of SLAM. A sufficiently accurate odometry implementation will lay the groundwork for the mapping and navigation work to follow.
The system hardware includes a LiDAR sensor, motors, and a main computer. We mount a RoboSense RS-LiDAR-16 on top of the rover. We use brushless DC motors from maxon motor. A Raspberry Pi 3B+ serves as the robot's brain: a small single-board computer with a 1.2 GHz 64-bit quad-core processor, on-board 802.11n Wi-Fi, Bluetooth, and USB boot capability. It communicates with the motor controllers over a Universal Serial Bus (USB) port and runs the SLAM algorithms. We use maxon EPOS as the motor controller. EPOS is a modular, digital positioning controller by maxon; its wide range of operating modes and various command interfaces make it versatile for use in many different drive systems in the field of automation technology.
The rover has 10 motors: 6 drive motors in profile velocity mode, whose velocity setpoints keep the rover moving at a constant speed, and 4 corner motors in profile position mode, whose position setpoints turn the wheels to the target angle. We chose maxon, a Swiss manufacturer and supplier of high-precision drive systems, as the rover's motor supplier. Using the command library of the EPOS positioning controllers, we set each wheel's velocity and steering angle according to the Ackermann steering geometry.
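As a sketch of that Ackermann geometry, the following computes per-wheel setpoints for the four corner wheels during a turn. The dimensions and the six-wheel layout (corner wheels offset half a wheelbase from the middle axle) are hypothetical, not our rover's actual measurements:

```python
import math

def ackermann_setpoints(radius, v_center, wheelbase=0.6, track=0.5):
    """Corner-wheel steering angles (rad) and drive speeds (m/s) for a
    turn of the given center radius (m) at the given center speed.
    Positive radius means the turn center is to the inner side.
    wheelbase and track are hypothetical rover dimensions."""
    half_t = track / 2.0      # lateral offset of each wheel row
    half_l = wheelbase / 2.0  # longitudinal offset of the corner wheels
    # Each corner wheel steers so that its axle line passes through the
    # common turn center, which lies on the middle axle's extension.
    inner = math.atan2(half_l, radius - half_t)
    outer = math.atan2(half_l, radius + half_t)
    # Wheel speed scales with the wheel's distance from the turn center,
    # so all wheels sweep the same angular rate about that center.
    v_inner = v_center * (radius - half_t) / radius
    v_outer = v_center * (radius + half_t) / radius
    return inner, outer, v_inner, v_outer
```

The inner wheels steer more sharply and drive more slowly than the outer ones, which is the defining property of Ackermann steering.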
After obtaining the motor states, namely the angles of the corner motors and the traveled distances of the drive motors, we can calculate the displacement and rotation of the rover. The rover moves in two ways, driving straight and turning, and as shown in the picture above we handle the two cases separately.
First, we calculate the turning radius r of the rover's center. Next, we determine how many degrees the rover rotates (Δθ). From r and Δθ we obtain the rover's translation (Δx, Δy). We thus have the rover's transform (Δx, Δy, Δθ) for each time slot Δt, from which the odometry follows directly.
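This per-slot update can be sketched as follows. It is a minimal version that assumes the distance d traveled by the rover's center and the heading change Δθ have already been extracted from the encoders; the straight-line case is simply the Δθ → 0 limit:

```python
import math

def integrate(pose, d, dtheta):
    """One odometry step. pose = (x, y, theta) in the world frame;
    d is the distance traveled by the rover center over the time slot
    (from the drive-motor encoders); dtheta is the heading change
    (from the turn radius r, via dtheta = d / r)."""
    x, y, theta = pose
    if abs(dtheta) < 1e-9:
        # Straight segment: displacement is d along the heading.
        dx, dy = d, 0.0
    else:
        # Arc segment: chord components in the rover's body frame.
        r = d / dtheta
        dx = r * math.sin(dtheta)
        dy = r * (1.0 - math.cos(dtheta))
    # Rotate the body-frame displacement into the world frame.
    xw = x + dx * math.cos(theta) - dy * math.sin(theta)
    yw = y + dx * math.sin(theta) + dy * math.cos(theta)
    return (xw, yw, theta + dtheta)
```

For example, driving a quarter circle of radius 1 m from the origin (d = π/2, Δθ = π/2) ends at (1, 1) facing 90 degrees to the left.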
Hector SLAM uses the hector_mapping node to learn a map of the environment while simultaneously estimating the platform's 2D pose at the laser scanner's frame rate, without requiring odometry. We feed the laser_scan topic published by the pointcloud_to_laserscan node into hector_mapping and enable 'pub_map_odom_transform', which publishes the current estimate of the robot's pose within the map frame. Hector SLAM then gives us the estimated 2D pose of the rover. Although the system provides no explicit loop-closing ability, Hector SLAM is sufficiently accurate for many real-world scenarios, so we treat its estimated 2D pose as the ground truth.
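A minimal launch configuration along these lines might look as follows. This is a sketch based on the standard pointcloud_to_laserscan and hector_mapping packages; the topic and frame names here are assumptions, not our exact files:

```xml
<launch>
  <!-- Flatten the RS-LiDAR-16 point cloud into a 2D scan
       (input topic name assumed from the RoboSense driver) -->
  <node pkg="pointcloud_to_laserscan" type="pointcloud_to_laserscan_node"
        name="pointcloud_to_laserscan">
    <remap from="cloud_in" to="/rslidar_points"/>
    <remap from="scan" to="/laser_scan"/>
  </node>

  <!-- hector_mapping: 2D pose estimation and mapping without odometry -->
  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping">
    <param name="scan_topic" value="/laser_scan"/>
    <param name="map_frame"  value="map"/>
    <param name="base_frame" value="base_link"/>
    <param name="odom_frame" value="odom"/>
    <param name="pub_map_odom_transform" value="true"/>
  </node>
</launch>
```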
The figures above show the trajectory from our odometry and the 2D pose estimated by Hector SLAM. The green trajectory is generated by the odometry and the yellow one by Hector SLAM. The two figures show that our odometry succeeds in determining the position and orientation of the rover. However, some deviation exists between the odometry trajectory and the Hector SLAM estimate, partly due to the zero-point offset of the turning motors. The figure below compares the complete trajectories.
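The deviation between the two trajectories can be quantified with a short script. This is a sketch that assumes both trajectories are already available as equal-length lists of (x, y) points sampled at the same timestamps; the resampling step is omitted:

```python
import math

def trajectory_error(odom, truth):
    """Per-point Euclidean error and overall RMSE between an odometry
    trajectory and a reference trajectory, given as equal-length lists
    of (x, y) points sampled at the same timestamps."""
    assert len(odom) == len(truth) and odom
    errors = [math.hypot(xo - xt, yo - yt)
              for (xo, yo), (xt, yt) in zip(odom, truth)]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return errors, rmse
```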
The figure below shows the 2D map generated by the rover with hector_slam.
The video demos of the rover and Hector SLAM are shown below.
Rover control demo:
HectorSLAM and our Odometry:
Our project is mainly about the odometry of the rover, a crucial building block for mapping and navigation. We obtain the motor states with maxon's command library and then implement odometry on the rover from those states. We compare the trajectory generated by our odometry with the trajectory published by hector_slam. The results show that our odometry is accurate enough, although some errors remain, caused by the accuracy of the motor states and the initial position of the corner wheels: the motor states converted from the encoders are not exactly equal to the true values, and the initial positions of the corner wheels are biased from the straight-ahead direction.