How to Set Up Sensors in a 3D-Lidar SLAM-Friendly Way
Optimizing a SLAM system for the best performance isn’t easy. It requires an in-depth understanding of the algorithm, selecting well-suited sensors for the intended use case, tuning the SLAM parameters, and so on.
We have written in-depth tutorials and articles on each of these topics for 3D-Lidar SLAM, and we’ll leave the links at the end of the article if you’re interested in knowing more. However, even if you do all of those steps correctly, improperly set-up sensors will create unnecessary challenges for SLAM and degrade its performance.
In this article, we want to shed some light on common mistakes in SLAM sensor setups, what’s wrong with them, and how to fix them. Please pay close attention, because we’ll tell you everything you need to maximize 3D-Lidar SLAM performance.
Lidar tilting can challenge 3D-Lidar SLAM accuracy
Generally, the more you tilt the lidar, the more challenging it gets to run the SLAM system accurately (especially in outdoor scenarios).
Some applications do require you to tilt the lidar:
- Road maintenance mapping applications benefit from lidar tilting, as it helps capture more points from the road surface, as shown in figure 2.
- Indoor applications where the roof also needs to be detected benefit from a 90-degree tilt.
- However, loop closure gets more difficult in these scenarios, and overall robustness deteriorates.
The critical point is understanding the trade-off between lidar tilting and SLAM performance. We also have a demo video showing our 3D-Lidar SLAM working with a 45-degree tilted lidar (link at the end). However, as shown in figure 3, there is a wide difference in how well the lidar captures surrounding objects and structures between these mounting options, which ultimately affects 3D-Lidar SLAM performance.
It’s best to understand the specific use case, analyze the trade-off and pick what works best for you.
What you need to know about the positioning between the IMU and the lidar
Multi-sensor fusion is a versatile and valuable approach for SLAM since it helps us achieve high precision. However, when you fuse additional sensors with the 3D-Lidar, the extrinsic calibration between the sensors must be accurate.
The sensors are mounted at different locations, so there is a translational and rotational relationship between them. The rule of thumb is to keep the sensors, especially the lidar and the IMU, as close together as possible: the closer they are, the easier it is to measure an accurate extrinsic, and the smaller the risk of significant extrinsic errors.
For example, if the IMU is a few centimeters from the lidar, you’ll have an error on the order of millimeters. If the IMU is several meters from the lidar, you’ll likely have an error on the order of centimeters. Ideally, you’d want to keep your IMU or INS just under your 3D-Lidar.
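As a rough illustration of why the lever arm matters, here is a back-of-the-envelope sketch (the helper name and numbers are our own illustration, not a calibration tool): an angular calibration error produces a position error roughly proportional to the distance between the sensors.

```python
import numpy as np

def position_error_from_angular_error(lever_arm_m, angle_error_deg):
    """Back-of-the-envelope worst-case position error induced at the lidar
    when the IMU-to-lidar rotation is off by angle_error_deg over a lever
    arm of lever_arm_m (small-angle approximation: error ~ r * theta)."""
    return lever_arm_m * np.deg2rad(angle_error_deg)

# The same 0.5-degree rotation error is harmless at 5 cm but not at 3 m:
print(position_error_from_angular_error(0.05, 0.5))  # ~0.0004 m (sub-millimeter)
print(position_error_from_angular_error(3.0, 0.5))   # ~0.026 m (centimeters)
```

The same scaling applies to how a measurement error in the mounting distance itself propagates, which is why a short, rigid lever arm is the safest choice.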
The relative position of the lidar and the GNSS antenna can affect performance negatively
If the GNSS antenna is in the lidar’s field of view, it partially blocks the lidar from capturing the surroundings. Similarly, the lidar can block the GNSS antenna’s view of the sky when it is mounted above the antenna’s sky dome, making it harder for the GNSS receiver to see enough satellites for reliable signals. For these reasons, mounting the two pieces of equipment too close together is not recommended.
On the other hand, as we saw earlier with the case of IMU and lidar, achieving an excellent extrinsic calibration is easier when they are kept close. Our challenge is maintaining a good balance such that the interference is minimized while the extrinsic calibration error is acceptable.
Time synchronization between multiple sensor data
We’ve seen a lack of time synchronization cause deteriorated SLAM performance many times.
The data from multiple sensors should carry timestamps on the same clock, i.e., the t=0 timestamp of each sensor’s data should refer to precisely the same instant. Please pay extra attention to this factor in your setup, as it may lead to a non-functional SLAM system if not addressed correctly.
To demonstrate the effect further, let’s look at an internal dataset we have at Kudan. We detected a 3.5s time difference between the lidar and the GNSS data, indicating the two sensors were out of sync. Figure 4 shows the point cloud you get when this happens. The green line is the GNSS trajectory, the purple is lidar-only SLAM, and the red lines indicate the differences. Double edges and blurry lines can easily be noticed in the point cloud.
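To make the idea concrete, here is a minimal sketch (our own illustration, not Kudan’s actual tooling) of how a constant time offset between two streams could be estimated: shift one stream’s timestamps over a search grid, interpolate, and keep the shift that minimizes the disagreement. A real diagnosis would compare full 3D trajectories; we use a 1D signal to keep the example short.

```python
import numpy as np

def estimate_time_offset(t_a, x_a, t_b, x_b, search_s=5.0, step_s=0.05):
    """Estimate a constant offset dt such that stream B's clock reads
    stream A's clock plus dt, by grid search over candidate offsets."""
    best_dt, best_err = 0.0, np.inf
    for dt in np.arange(-search_s, search_s + step_s, step_s):
        # Only compare where the shifted query times overlap stream B
        mask = (t_a + dt >= t_b[0]) & (t_a + dt <= t_b[-1])
        if mask.sum() < 10:
            continue
        xb = np.interp(t_a[mask] + dt, t_b, x_b)
        err = np.mean((x_a[mask] - xb) ** 2)
        if err < best_err:
            best_dt, best_err = dt, err
    return best_dt

# Synthetic check: stream B carries the same signal, but its clock
# runs 3.5 s ahead of stream A's clock
t = np.arange(0.0, 60.0, 0.1)
x = np.sin(0.3 * t)
dt = estimate_time_offset(t, x, t + 3.5, x)
print(dt)  # close to 3.5
```

In practice you would prevent the problem at the source (hardware triggering, PPS, or PTP) rather than recover the offset after the fact.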
Per-point time stamp: Things you need to understand
When a 3D-Lidar emits lasers, each point is emitted at a slightly different time, even if it’s in the same scan cycle.
Let’s take an example to make this concrete. In a 360-degree scanning lidar, a single scan takes 100ms in the 10Hz mode. This means there is approximately a 100ms time difference between the first and the last point of the scan. Assuming these two timestamps are the same is incorrect and is a known source of error.
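Assuming a constant rotation rate, the per-point timestamp of a spinning lidar can be approximated from the point’s azimuth. This is a simplified sketch of our own; real drivers typically use per-firing timing tables from the manufacturer.

```python
def point_timestamp(scan_start_s, azimuth_deg, scan_period_s=0.1):
    """Approximate per-point timestamp for a 360-degree spinning lidar,
    assuming the scan starts at azimuth 0 and rotates at constant speed.
    scan_period_s = 0.1 corresponds to the 10 Hz mode discussed above."""
    return scan_start_s + (azimuth_deg % 360.0) / 360.0 * scan_period_s

print(point_timestamp(100.0, 0.0))    # 100.0 (first point of the scan)
print(point_timestamp(100.0, 359.0))  # last point, roughly 100 ms later
```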
In cases of dynamic movement, the sensor motion between lidar frames is generally non-linear. In these scenarios, modeling individual timestamps can be challenging without measuring the movement itself. IMUs are extremely useful for this purpose.
The overarching idea is that when you set up sensors and collect data, it’s vital to ensure that a timestamp is obtained for each lidar point.
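For illustration, here is what motion compensation (“deskewing”) with per-point timestamps could look like under a constant-velocity assumption. This is a simplification of our own; with an IMU you would replace the linear model with the integrated, non-linear motion, and also compensate rotation.

```python
import numpy as np

def deskew_points(points_xyz, point_times, t0, t1, vel_xyz):
    """Express every point of one lidar scan in the start-of-scan frame,
    assuming the sensor moved at constant velocity vel_xyz (m/s) from t0
    to t1. points_xyz: (N, 3) points in the sensor frame at capture time,
    point_times: (N,) per-point timestamps."""
    dt = np.clip(point_times - t0, 0.0, t1 - t0)[:, None]
    # Add back the motion the sensor made between t0 and each point's
    # capture time, so all points share the t0 sensor frame
    return points_xyz + dt * np.asarray(vel_xyz)[None, :]

# A point seen 5 m ahead at the end of a 0.1 s scan, while the sensor
# moves forward at 10 m/s, sits 6 m ahead in the start-of-scan frame
pts = np.array([[5.0, 0.0, 0.0]])
out = deskew_points(pts, np.array([100.1]), 100.0, 100.1, [10.0, 0.0, 0.0])
print(out)  # [[6. 0. 0.]]
```

Without per-point timestamps, this correction is impossible, which is why recording them at collection time matters so much.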
At Kudan, through our blog, we’ve aimed to distill information on all things SLAM and present it as simply as possible.
Here are some of our previous articles on 3D Lidar SLAM:
- 3D Lidar SLAM: The Basics
- How to Select the Best 3D Lidar for SLAM
- How to Tune 3D-Lidar SLAM Parameters
In today’s article, we went a step further and presented a few tips on how you can set up the sensors in a 3D-Lidar SLAM-friendly way and the trade-offs involved with setting them up otherwise.
We hope these tips help you avoid common blunders and squeeze the best performance out of your SLAM system. If you’d like to know more, or have questions specific to your use case, don’t hesitate to say hi to us; they’ll be our questions to answer.