Robotics 2017: Pointcloud with colors

A project of the Robotics 2017 class of the School of Information Science and Technology (SIST) of ShanghaiTech University. Course Instructor: Prof. Sören Schwertfeger.

Chen Hongyu

Robots are typically equipped with multiple sensors, which require calibration in order to present sensed information in a common coordinate system. To increase accuracy and robustness in state estimation for robotics, a growing number of applications rely on data from multiple complementary sensors. For the best performance in sensor fusion, these different sensors must be spatially and temporally registered with respect to each other. Most methods for state estimation that fuse data from multiple sensors assume and require that the timestamps of all measurements are accurately known with respect to a single clock. Consequently, the time synchronization of sensors is a crucial aspect of building a robotic system. Our mapping robot is equipped with many kinds of sensors, such as cameras, Velodyne laser sensors, 2D laser sensors, and an IMU. In order to use the data from these sensors, the first step is to calibrate all the sensors mounted on the robot.

System description

In this project we have to calibrate all the sensors on our mapping robot. This includes calibrating the two cameras at the front of our Jackal robot, the stereo camera on top of the Jackal robot, and the relationship between the two Velodyne laser sensors. We also have to calibrate the relationship between the Velodyne sensor and the camera on top of the Jackal robot. Another task is to learn how to calibrate the 2D lidar sensor, as well as the relationship between the 2D lidar sensor and the camera. To summarize, we have to calibrate all the sensors on the Jackal robot.


Fig 1: Mapping robot

Calibrate the stereo camera


Fig 2: Calibration result of the camera 7 and camera 8.
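Once the stereo pair is calibrated, its focal length and baseline let us recover depth from disparity via the pinhole stereo model Z = f·B/d. A minimal sketch; the focal length and baseline below are illustrative placeholders, not the actual calibration results for cameras 7 and 8:

```python
import numpy as np

# Hypothetical calibration results; the real values come from the
# stereo calibration of cameras 7 and 8 shown in Fig 2.
f = 700.0   # focal length in pixels
B = 0.12    # baseline between the two cameras in meters

def depth_from_disparity(disparity_px):
    """Pinhole stereo model: Z = f * B / d (disparity in pixels, depth in meters)."""
    d = np.asarray(disparity_px, dtype=float)
    return f * B / d

# Larger disparity means the point is closer to the cameras.
print(depth_from_disparity([10.0, 20.0, 40.0]))
```

This also shows why calibration accuracy matters: any error in f or B scales directly into the recovered depth.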

Calibrate the velodyne and camera


Fig 3: Pointcloud with colors. For contrast, all the green points lie within the camera's field of view, while all the red points lie outside it.


Fig 4: Lidar points projected onto the image
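The coloring in Figs 3 and 4 follows from the Velodyne-camera calibration: each lidar point is transformed into the camera frame with the extrinsics, projected through the intrinsics, and, if it lands inside the image, takes that pixel's color. A sketch of this pipeline, assuming hypothetical intrinsic matrix K and extrinsics R, t (the real matrices are the calibration results):

```python
import numpy as np

# Hypothetical intrinsics and Velodyne-to-camera extrinsics; the real
# values come from the calibration described in this section.
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 0.0])
W, H = 640, 480  # image size in pixels

def project_to_image(points_velo):
    """Project Nx3 lidar points to pixels; return (uv, in-view mask)."""
    pts_cam = points_velo @ R.T + t           # Velodyne frame -> camera frame
    in_front = pts_cam[:, 2] > 0              # keep points in front of the camera
    uv = pts_cam @ K.T
    uv = uv[:, :2] / uv[:, 2:3]               # perspective division
    in_image = in_front & (uv[:, 0] >= 0) & (uv[:, 0] < W) \
                        & (uv[:, 1] >= 0) & (uv[:, 1] < H)
    return uv, in_image

def colorize(points_velo, image):
    """Color each lidar point from the image; points outside the view get red."""
    uv, mask = project_to_image(points_velo)
    colors = np.tile([255, 0, 0], (len(points_velo), 1))  # red = not visible
    u = uv[mask, 0].astype(int)
    v = uv[mask, 1].astype(int)
    colors[mask] = image[v, u]                # sample the pixel under each point
    return colors
```

The visibility mask is exactly the green/red split in Fig 3: points that pass both the depth check and the image-bounds check are colored from the camera, the rest stay red.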

Calibrate the two velodyne laser sensors


Fig 5: Calibration result of the two velodynes
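The result of this calibration is a rigid transform between the two Velodyne frames, which lets us merge both pointclouds into one frame. A minimal sketch with a made-up 4x4 transform T_12 (the real one is the calibration result shown in Fig 5):

```python
import numpy as np

# Hypothetical rigid transform taking points from the frame of Velodyne 2
# into the frame of Velodyne 1: a 30-degree yaw plus a small offset.
theta = np.deg2rad(30.0)
T_12 = np.array([[np.cos(theta), -np.sin(theta), 0.0, 0.10],
                 [np.sin(theta),  np.cos(theta), 0.0, 0.00],
                 [0.0,            0.0,           1.0, 0.25],
                 [0.0,            0.0,           0.0, 1.00]])

def merge_clouds(cloud1, cloud2):
    """Express both Nx3 clouds in the frame of Velodyne 1 and stack them."""
    homo = np.hstack([cloud2, np.ones((len(cloud2), 1))])  # homogeneous coords
    cloud2_in_1 = (homo @ T_12.T)[:, :3]
    return np.vstack([cloud1, cloud2_in_1])
```

With an accurate T_12 the overlapping parts of the two scans align, which is exactly how the quality of the calibration in Fig 5 can be judged visually.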

Calibrate the 2D laser sensor and the camera

 
Fig 6: Calibration result of the 2D laser 6 and camera 8, with the laser points projected onto the image
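A 2D laser scan is a list of range readings at known angles; before the scan can be projected into the image as in Fig 6, the readings are converted to Cartesian points in the laser frame (the laser-to-camera extrinsics then map them into the camera). A sketch of that conversion; the angular limits below are illustrative, not the actual sensor specification:

```python
import numpy as np

# Illustrative field of view; the real values come from the scanner's spec.
angle_min, angle_max = -np.pi / 2, np.pi / 2

def scan_to_points(ranges):
    """Turn N range readings into Nx3 points (z = 0 in the scan plane)."""
    ranges = np.asarray(ranges, dtype=float)
    angles = np.linspace(angle_min, angle_max, len(ranges))
    x = ranges * np.cos(angles)
    y = ranges * np.sin(angles)
    return np.stack([x, y, np.zeros_like(x)], axis=1)
```

From here the projection step is the same as for the Velodyne: apply the laser-to-camera extrinsics and the camera intrinsics to each point.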

Conclusions

If we want to fuse all the data, the first step is to calibrate all the sensors on the robot, so the main goal of this project is to calibrate all the sensors mounted on the Jackal robot. In this project I have calibrated most of the sensors; using the same approach, we can calibrate all the sensors on the mapping robot. This is a very important step for fusing all the sensors' data on the mapping robot. We can also apply optimization methods to refine the relative poses between pairs of sensors. There is still a lot of work to do: for example, all the calibrations were performed offline, which takes a lot of time when calibrating every sensor.