<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>Kudan 3D-Lidar SLAM | Kudan global</title>
	<atom:link href="https://www.kudan.io/blog/tag/kudan-3d-lidar-slam/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.kudan.io</link>
	<description>Kudan has been providing proprietary Artificial Perception technologies based on SLAM to enable use cases with significant market potential and impact on our lives such as autonomous driving, robotics, AR/VR and smart cities</description>
	<lastBuildDate>Sun, 22 Dec 2024 13:14:06 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.8.13</generator>

<image>
	<url>https://i0.wp.com/www.kudan.io/wp-content/uploads/2020/05/cropped-NoImage.png?fit=32%2C32&#038;ssl=1</url>
	<title>Kudan 3D-Lidar SLAM | Kudan global</title>
	<link>https://www.kudan.io</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">179852210</site>	<item>
		<title>Squad Robotics will release its autonomous floor-cleaning robots with Kudan’s 3D-Lidar SLAM technology integrated</title>
		<link>https://www.kudan.io/blog/squad-robotics-will-release-its-autonomous-floor-cleaning-robots-with-kdlidar-integrated/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=squad-robotics-will-release-its-autonomous-floor-cleaning-robots-with-kdlidar-integrated</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Wed, 24 Jul 2024 03:00:51 +0000</pubDate>
				<category><![CDATA[Press Release]]></category>
		<category><![CDATA[autonomous floor-cleaning robots]]></category>
		<category><![CDATA[Autonomous Mobile Robot]]></category>
		<category><![CDATA[KdLidar]]></category>
		<category><![CDATA[Kudan]]></category>
		<category><![CDATA[Kudan 3D-Lidar SLAM]]></category>
		<category><![CDATA[Simultaneous Localization and Mapping]]></category>
		<category><![CDATA[SLAM]]></category>
		<category><![CDATA[Squad Robotics]]></category>
		<guid isPermaLink="false">https://www.kudan.io/?p=1872</guid>

					<description><![CDATA[<p>Kudan is pleased to announce that SIA SQR (hereafter “Squad Robotics”), known for its innovative approach to industrial cleaning robot solutions with its headquarters in Riga, Latvia, will commercially release its autonomous floor-cleaning robots, integrating Kudan’s advanced 3D Lidar SLAM algorithm, “KdLidar,” as the core engine for high-precision positioning. With a vision to revolutionize the [&#8230;]</p>
<p>The post <a href="https://www.kudan.io/blog/squad-robotics-will-release-its-autonomous-floor-cleaning-robots-with-kdlidar-integrated/">Squad Robotics will release its autonomous floor-cleaning robots with Kudan’s 3D-Lidar SLAM technology integrated</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>Kudan is pleased to announce that SIA SQR (hereafter “Squad Robotics”), known for its innovative approach to industrial cleaning robot solutions with its headquarters in Riga, Latvia, will commercially release its autonomous floor-cleaning robots, integrating Kudan’s advanced 3D Lidar SLAM algorithm, “KdLidar,” as the core engine for high-precision positioning.</p>
<p>With a vision to revolutionize the massive cleaning industry by transitioning from human-operated to AI-operated autonomous machines, Squad Robotics offers its autonomous floor-cleaning robots and related services tailored to both cleaning machine manufacturers and cleaning service companies. The company has partnered with major European manufacturers such as Stolzenberg and others, making their robots available across Europe.</p>
<div id="attachment_1874" style="width: 874px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1874" loading="lazy" class="wp-image-1874 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2024/07/Pic1-1.png?resize=864%2C323&#038;ssl=1" alt="" width="864" height="323" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2024/07/Pic1-1.png?w=864&amp;ssl=1 864w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2024/07/Pic1-1.png?resize=300%2C112&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2024/07/Pic1-1.png?resize=768%2C287&amp;ssl=1 768w" sizes="(max-width: 864px) 100vw, 864px" data-recalc-dims="1" /><p id="caption-attachment-1874" class="wp-caption-text">Autonomous floor-cleaning robots by Squad Robotics</p></div>
<p>To meet the growing market demands, Squad Robotics sought to enhance the operating environments for its autonomous floor-cleaning robots. This need led to a critical product development collaboration between Squad Robotics and Kudan, focused on integrating a more robust SLAM processing engine.</p>
<p>By incorporating Kudan&#8217;s advanced 3D-Lidar SLAM technology with a sophisticated multi-sensor fusion algorithm, Squad Robotics’ floor-cleaning robots achieve enhanced situational awareness and the capability to navigate in complex spaces with high precision and stability. This collaboration ensures that autonomous systems can operate effectively in dynamic and cluttered environments, significantly boosting the operational efficiency and productivity of cleaning robots.</p>
<p>Kudan&#8217;s 3D-Lidar SLAM is now integrated into Squad Robotics’ key products, including the SQR SW1, an autonomous sweeping robot, and the SQR SD1, an autonomous scrubber-drier. These advancements underscore the commitment of both companies to deliver cutting-edge solutions that address the rigorous demands of the industrial cleaning sector.</p>
<p>Comment from both companies:</p>
<p>“We are pleased to collaborate with Squad Robotics to integrate our 3D-Lidar SLAM technology into their autonomous floor-cleaning robots. This partnership underscores our commitment to enhancing the precision and reliability of autonomous systems in industrial applications. By combining our localization technology with Squad Robotics&#8217; innovative cleaning solutions, we aim to contribute to greater efficiency and productivity in the cleaning industry.&#8221;</p>
<p>― Daiu Ko, Chief Executive Officer at Kudan</p>
<p>“This product partnership with Kudan allows us to enhance the precision and performance of our robots, ensuring they operate seamlessly in diverse and complex environments. We are excited about the potential this collaboration brings to revolutionize the cleaning industry, making autonomous operations more accessible and efficient for our clients.&#8221;</p>
<p>― Matīss Brunavs, Chief Executive Officer at Squad Robotics</p>
<p><strong>About Squad Robotics</strong><br />
Squad Robotics specializes in developing autonomous floor-cleaning robots designed to enhance the efficiency and productivity of industrial cleaning processes. With a mission to revolutionize the cleaning industry, Squad Robotics integrates cutting-edge technologies to deliver innovative and reliable cleaning solutions. For more information, visit <a href="https://www.squad-robotics.com" target="_blank" rel="noopener">www.squad-robotics.com</a>.</p>
<p><strong>About Kudan Inc.</strong><br />
Kudan is a deep tech research and development company specializing in algorithms for artificial perception (AP). As a complement to artificial intelligence (AI), AP functions allow machines to develop autonomy. Currently, Kudan is licensing its technology for next-generation solution areas such as digital twin, robotics and autonomous driving.<br />
For more information, please refer to Kudan’s website at <a href="https://www.kudan.io/" target="_blank" rel="noopener">https://www.kudan.io/</a>.</p>
<p>■Company Details<br />
Name: Kudan Inc.<br />
Securities Code: 4425 (TSE Growth)<br />
Representative: CEO Daiu Ko</p>
<p>■For more details, please contact us from <a href="https://www.kudan.io/contact" target="_blank" rel="noopener">here</a>.</p><p>The post <a href="https://www.kudan.io/blog/squad-robotics-will-release-its-autonomous-floor-cleaning-robots-with-kdlidar-integrated/">Squad Robotics will release its autonomous floor-cleaning robots with Kudan’s 3D-Lidar SLAM technology integrated</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1872</post-id>	</item>
		<item>
		<title>Avestec announces commercial launch of the SKYRON powered by Kudan’s 3D Lidar SLAM engine for digitization</title>
		<link>https://www.kudan.io/blog/avestec-announces-commercial-launch-of-the-skyron-powered-by-kdlidar/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=avestec-announces-commercial-launch-of-the-skyron-powered-by-kdlidar</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Thu, 09 May 2024 02:00:07 +0000</pubDate>
				<category><![CDATA[Press Release]]></category>
		<category><![CDATA[algorithm]]></category>
		<category><![CDATA[Artificial Perception]]></category>
		<category><![CDATA[Avestec]]></category>
		<category><![CDATA[digitization]]></category>
		<category><![CDATA[flying robots]]></category>
		<category><![CDATA[KdLidar]]></category>
		<category><![CDATA[Kudan]]></category>
		<category><![CDATA[Kudan 3D-Lidar SLAM]]></category>
		<category><![CDATA[partnership]]></category>
		<category><![CDATA[Simultaneous Localization and Mapping]]></category>
		<category><![CDATA[SKYRON]]></category>
		<category><![CDATA[SLAM]]></category>
		<guid isPermaLink="false">https://www.kudan.io/?p=1800</guid>

					<description><![CDATA[<p>Kudan Inc. (headquarters in Shibuya-ku, Tokyo; CEO Daiu Ko), a global leader in advanced SLAM (Simultaneous Localization and Mapping) technology, proudly announces that Avestec Technologies Inc. (hereafter “Avestec”), a leading provider of flying robots for asset inspections across various industries with its headquarters in British Columbia, Canada, is commercially releasing its flying robot, SKYRON, in which Kudan&#8217;s [&#8230;]</p>
<p>The post <a href="https://www.kudan.io/blog/avestec-announces-commercial-launch-of-the-skyron-powered-by-kdlidar/">Avestec announces commercial launch of the SKYRON powered by Kudan’s 3D Lidar SLAM engine for digitization</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>Kudan Inc. (headquarters in Shibuya-ku, Tokyo; CEO Daiu Ko), a global leader in advanced SLAM (Simultaneous Localization and Mapping) technology, proudly announces that Avestec Technologies Inc. (hereafter “Avestec”), a leading provider of flying robots for asset inspections across various industries with its headquarters in British Columbia, Canada, is commercially releasing its flying robot, SKYRON, in which Kudan&#8217;s cutting-edge 3D Lidar SLAM algorithm “KdLidar” is adopted as the engine to generate high-quality point clouds as well as high-precision position estimates.</p>
<div id="attachment_1808" style="width: 688px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1808" loading="lazy" class="wp-image-1808 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2024/05/Pic2.png?resize=678%2C255&#038;ssl=1" alt="" width="678" height="255" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2024/05/Pic2.png?w=678&amp;ssl=1 678w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2024/05/Pic2.png?resize=300%2C113&amp;ssl=1 300w" sizes="(max-width: 678px) 100vw, 678px" data-recalc-dims="1" /><p id="caption-attachment-1808" class="wp-caption-text">Avestec SKYRON and contact-based measurement solutions</p></div>
<p>Avestec SKYRON is a flying robot with advanced technology capable of performing ultrasonic thickness measurements and visual inspections. While SKYRON was initially developed for contact-based UAV (Unmanned Aerial Vehicle, i.e. drone) remote sensing, Avestec identified growing market demand to expand SKYRON’s capabilities to support a variety of remote infrastructure inspections for the oil and gas industry. Meeting these diverse market demands required a more sophisticated and robust SLAM processing engine, setting the stage for a pivotal product development collaboration between Avestec and Kudan.</p>
<p>Leveraging KdLidar’s proprietary multi-sensor fusion algorithm, along with its adeptness in managing varied motion models during sensing, the new generation SKYRON is capable of producing very high-quality point clouds across a diverse range of sensing modalities. The outputs are proven to be accurate with higher resolution and precision under various dynamic mapping environments.</p>
<p>“This partnership brings together leaders in hardware and software capabilities,” said Juan Wee, CEO of Kudan USA, “to create and launch a powerful and precise end-to-end 3D mapping solution for a wide range of industries and applications, setting a new standard in the market.”</p>
<p>Reza Tavakoli, CEO of Avestec commented, &#8220;Innovation in sensor-fusion technology is bringing ever more possibilities to mapping, inspection, autonomous navigation and robotics. This released solution with Kudan is the essential step towards further expansion in the global market.”</p>
<p>This new generation of SKYRON, with Kudan&#8217;s SLAM technology, is set to disrupt the oil and gas industry by enabling complete inspections faster, safer, and at lower cost in the more than 30 countries served by Avestec.</p>
<p>Details about the commercial release (Q2 2024) of SKYRON with Kudan’s KdLidar and other enhanced features will be available on Avestec&#8217;s website (<a href="https://www.avestec.com/" target="_blank" rel="noopener">https://www.avestec.com/</a>). Watch out for more official launch activities in the coming months.</p>
<p><strong>About Avestec</strong><br />
Based in British Columbia, Canada, Avestec is at the forefront of technological innovation, specializing in the development and deployment of state-of-the-art flying robots dedicated to asset inspection. With a commitment to excellence and ingenuity, Avestec leverages the latest advances in unmanned aerial vehicle (UAV) technology to provide unique solutions that revolutionize the way asset inspections are conducted. By leveraging cutting-edge engineering and expertise, Avestec provides industry-leading aerial platforms that enable unparalleled efficiency, accuracy, and safety in the inspection of a wide range of assets in a variety of sectors.</p>
<p><strong>About Kudan Inc.</strong><br />
Kudan is a deep tech research and development company specializing in algorithms for artificial perception (AP). As a complement to artificial intelligence (AI), AP functions allow machines to develop autonomy. Currently, Kudan is licensing its technology for next-generation solution areas such as digital twin, robotics and autonomous driving.<br />
For more information, please refer to Kudan’s website at <a href="https://www.kudan.io/" target="_blank" rel="noopener">https://www.kudan.io/</a>.</p>
<p>■Company Details<br />
Name: Kudan Inc.<br />
Securities Code: 4425 (TSE Growth)<br />
Representative: CEO Daiu Ko</p>
<p>■For more details, please contact us from <a href="https://www.kudan.io/contact" target="_blank" rel="noopener">here</a>.</p><p>The post <a href="https://www.kudan.io/blog/avestec-announces-commercial-launch-of-the-skyron-powered-by-kdlidar/">Avestec announces commercial launch of the SKYRON powered by Kudan’s 3D Lidar SLAM engine for digitization</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1800</post-id>	</item>
		<item>
		<title>Kudan’s strategic partner Movel AI starts commercial offering for integrated solution for AMR with Kudan SLAM</title>
		<link>https://www.kudan.io/blog/kudan-movel-partnership-commercial-milestone-2023/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=kudan-movel-partnership-commercial-milestone-2023</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Thu, 16 Mar 2023 08:00:47 +0000</pubDate>
				<category><![CDATA[Press Release]]></category>
		<category><![CDATA[all-in-one solution]]></category>
		<category><![CDATA[AMR]]></category>
		<category><![CDATA[Autonomous Mobile Robots]]></category>
		<category><![CDATA[commercial offering]]></category>
		<category><![CDATA[Fleet Management System (FMS)]]></category>
		<category><![CDATA[fleet management]]></category>
		<category><![CDATA[KdLidar]]></category>
		<category><![CDATA[KdVisual]]></category>
		<category><![CDATA[Kudan]]></category>
		<category><![CDATA[Kudan 3D-Lidar SLAM]]></category>
		<category><![CDATA[Kudan SLAM]]></category>
		<category><![CDATA[Kudan Visual SLAM]]></category>
		<category><![CDATA[Lidar SLAM]]></category>
		<category><![CDATA[localization]]></category>
		<category><![CDATA[Movel AI]]></category>
		<category><![CDATA[Navigation]]></category>
		<category><![CDATA[Robotic Navigation System (RNS)]]></category>
		<category><![CDATA[Seirios]]></category>
		<category><![CDATA[Simultaneous Localization and Mapping]]></category>
		<category><![CDATA[SLAM]]></category>
		<category><![CDATA[Visual SLAM]]></category>
		<guid isPermaLink="false">https://www.kudan.io/?p=1599</guid>

					<description><![CDATA[<p>Kudan Inc. (headquarters in Shibuya-ku, Tokyo; CEO Daiu Ko, hereafter “Kudan”) is happy to announce that our strategic partner, Movel AI, has wholly integrated Kudan’s 3D Lidar ‘KdLidar’ and visual ‘KdVisual’ Simultaneous Localization and Mapping (SLAM) software into its all-in-one commercial software solution ‘Seirios’ for autonomous mobile robots (AMR). This Kudan SLAM integrated solution is [&#8230;]</p>
<p>The post <a href="https://www.kudan.io/blog/kudan-movel-partnership-commercial-milestone-2023/">Kudan’s strategic partner Movel AI starts commercial offering for integrated solution for AMR with Kudan SLAM</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>Kudan Inc. (headquarters in Shibuya-ku, Tokyo; CEO Daiu Ko, hereafter “Kudan”) is happy to announce that our strategic partner, <a href="https://movel.ai/" target="_blank" rel="noopener">Movel AI</a>, has wholly integrated Kudan’s 3D Lidar ‘KdLidar’ and visual ‘KdVisual’ Simultaneous Localization and Mapping (SLAM) software into its all-in-one commercial software solution ‘Seirios’ for autonomous mobile robots (AMR). This Kudan SLAM integrated solution is now available for the global market and is expected to have multiple commercial deployments on customer sites in the coming months.</p>
<p><img loading="lazy" class="aligncenter wp-image-1601 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic1_Movel-AI.png?resize=939%2C527&#038;ssl=1" alt="" width="939" height="527" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic1_Movel-AI.png?w=939&amp;ssl=1 939w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic1_Movel-AI.png?resize=300%2C168&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic1_Movel-AI.png?resize=768%2C431&amp;ssl=1 768w" sizes="(max-width: 939px) 100vw, 939px" data-recalc-dims="1" /></p>
<p>Movel AI, a Singapore-based robotics software company, offers its clients this all-in-one localization, navigation and fleet management software through a seamless user interface, which benefits from Kudan’s robust and highly accurate positioning software.</p>
<p>Kudan and Movel AI began talks in 2022, and the strong synergy between the two companies led to a collaboration on product development and business expansion in the AMR market.</p>
<p>This partnership enabled Movel AI to integrate Kudan’s 3D Lidar SLAM and Visual SLAM into its Seirios <a href="https://movel.ai/seirios-rns" target="_blank" rel="noopener">Robotic Navigation System (RNS)</a> and <a href="https://movel.ai/seirios-fms" target="_blank" rel="noopener">Fleet Management System (FMS)</a> to offer greater localization and mapping capabilities to its clients. Both new and existing clients can experience localization accuracy of up to 2 cm, so that robots can position themselves and navigate in the environment more precisely. Kudan SLAM also allows robots to operate in dynamic and complex environments with excellent robustness, reliability and operational efficiency compared to open-source solutions.</p>
<div id="attachment_1602" style="width: 949px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1602" loading="lazy" class="wp-image-1602 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic2_Movel-AI.png?resize=939%2C550&#038;ssl=1" alt="" width="939" height="550" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic2_Movel-AI.png?w=939&amp;ssl=1 939w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic2_Movel-AI.png?resize=300%2C176&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic2_Movel-AI.png?resize=768%2C450&amp;ssl=1 768w" sizes="(max-width: 939px) 100vw, 939px" data-recalc-dims="1" /><p id="caption-attachment-1602" class="wp-caption-text">Figure 1: Seiros’ Lane Navigation Feature (Image Source: Movel AI)</p></div>
<p>&nbsp;</p>
<div id="attachment_1603" style="width: 949px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1603" loading="lazy" class="wp-image-1603 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic3_Movel-AI.png?resize=939%2C658&#038;ssl=1" alt="" width="939" height="658" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic3_Movel-AI.png?w=939&amp;ssl=1 939w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic3_Movel-AI.png?resize=300%2C210&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic3_Movel-AI.png?resize=768%2C538&amp;ssl=1 768w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic3_Movel-AI.png?resize=600%2C419&amp;ssl=1 600w" sizes="(max-width: 939px) 100vw, 939px" data-recalc-dims="1" /><p id="caption-attachment-1603" class="wp-caption-text">Figure 2: Seirios FMS Manage Map Feature (Image source: Movel AI)</p></div>
<p>For Kudan, this collaboration and release of all-in-one solution allows us to offer customers navigation and fleet management solutions on top of our SLAM software. This will benefit customers who are looking to accelerate their go-to-market plan or to scale their robotic fleet aggressively.</p>
<div id="attachment_1604" style="width: 949px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1604" loading="lazy" class="wp-image-1604 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic4_Movel-AI.png?resize=939%2C458&#038;ssl=1" alt="" width="939" height="458" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic4_Movel-AI.png?w=939&amp;ssl=1 939w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic4_Movel-AI.png?resize=300%2C146&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic4_Movel-AI.png?resize=768%2C375&amp;ssl=1 768w" sizes="(max-width: 939px) 100vw, 939px" data-recalc-dims="1" /><p id="caption-attachment-1604" class="wp-caption-text">Figure 3: Kudan’s 3D-Lidar SLAM (Reference video : <a href="https://youtu.be/JVXS6q2KoGE" target="_blank" rel="noopener">https://youtu.be/JVXS6q2KoGE</a>)</p></div>
<div id="attachment_1605" style="width: 949px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1605" loading="lazy" class="wp-image-1605 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic5_Movel-AI.png?resize=939%2C427&#038;ssl=1" alt="" width="939" height="427" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic5_Movel-AI.png?w=939&amp;ssl=1 939w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic5_Movel-AI.png?resize=300%2C136&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2023/03/Pic5_Movel-AI.png?resize=768%2C349&amp;ssl=1 768w" sizes="(max-width: 939px) 100vw, 939px" data-recalc-dims="1" /><p id="caption-attachment-1605" class="wp-caption-text">Figure 4: Kudan’s Visual SLAM (Reference video : <a href="https://youtu.be/wm4M_jW4zyc" target="_blank" rel="noopener">https://youtu.be/wm4M_jW4zyc</a>)</p></div>
<p>&#8220;We are thrilled to partner with Movel AI and witness the commercial release of this integrated solution leveraging our market-leading SLAM technologies.&#8221; said Daiu Ko, CEO of Kudan. &#8220;With our collaboration, we look forward to leveraging the business footprint of both companies in different regions and helping AMR vendors in the global market grow their deployment base with enhanced performance and operational efficiency enabled by our joint solution.”</p>
<p>&#8220;At Movel AI, we aim to speed up robot adoption across industries. Our partnership with Kudan is a crucial step towards achieving this vision, while simultaneously advancing robotics companies to leverage enhanced navigation experiences.” said Abhishek Gupta, CEO of Movel AI. &#8220;Through collaborating with Kudan, we are confident in delivering a greater value by leveraging our expertise to offer innovative navigation solutions to the robotics market. This partnership is a significant milestone for both companies, and we look forward to the opportunities it presents.”</p>
<p>The joint solution has already generated interest from multiple robotics companies, which are currently testing it in different phases and progressing towards commercial site deployment. Seirios’ technology will enable everyone, even non-technical users, to enjoy its performance capabilities and experience seamless robot navigation. Furthermore, it can be scaled efficiently across robotic fleets, allowing fleet managers to run and delegate tasks to up to 100 robots with a single system.</p>
<p>To find out more about this all-in-one solution or to try a 15-day trial, please reach out to us via <a href="https://www.kudan.io/contact" target="_blank" rel="noopener">Kudan</a> or <a href="https://movel.ai/" target="_blank" rel="noopener">Movel AI</a>.</p>
<p><span style="text-decoration: underline;"><strong>About Movel AI</strong></span><br />
Movel AI is a deep tech robotics software startup based in Singapore. Movel AI delivers human-like precision and movement to robots, combining sensor fusion, vision, and machine learning &amp; artificial intelligence technologies. Movel AI’s solutions are tailored to customers across different industries, applications and needs: from AGVs (automated guided vehicles) traversing factory grounds for logistical movements, to inspection robots scanning structural defects within multi-storey buildings, to enabling automated navigation in personal mobility devices. For more information, please refer to Movel AI’s website at <a href="http://www.movel.ai" target="_blank" rel="noopener">www.movel.ai</a>.</p>
<p><span style="text-decoration: underline;"><strong>About Kudan Inc.</strong></span><br />
Kudan is a deep tech research and development company specializing in algorithms for artificial perception (AP). As a complement to artificial intelligence (AI), AP functions allow machines to develop autonomy. Currently, Kudan is using its high-level technical innovation to explore business areas based on its own milestone models established for deep tech which provide wide-ranging impact on several major industrial fields.<br />
For more information, please refer to Kudan’s website at <a href="https://www.kudan.io/" target="_blank" rel="noopener noreferrer">https://www.kudan.io/</a>.</p>
<p>■Company Details<br />
Name: Kudan Inc.<br />
Securities Code: 4425 (TSE Growth)<br />
Representative: CEO Daiu Ko</p>
<p>■For more details, please contact us from <a href="https://www.kudan.io/contact" target="_blank" rel="noopener noreferrer">here</a>.</p><p>The post <a href="https://www.kudan.io/blog/kudan-movel-partnership-commercial-milestone-2023/">Kudan’s strategic partner Movel AI starts commercial offering for integrated solution for AMR with Kudan SLAM</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1599</post-id>	</item>
		<item>
		<title>Kudan 3D-Lidar SLAM (KdLidar) in Action: Map Streaming from the Cloud</title>
		<link>https://www.kudan.io/blog/kdlidar-in-action-map-streaming-from-the-cloud/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=kdlidar-in-action-map-streaming-from-the-cloud</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Wed, 30 Nov 2022 09:00:08 +0000</pubDate>
				<category><![CDATA[Tech Blog]]></category>
		<category><![CDATA[cloud]]></category>
		<category><![CDATA[KdLidar]]></category>
		<category><![CDATA[Kudan]]></category>
		<category><![CDATA[Kudan 3D-Lidar SLAM]]></category>
		<category><![CDATA[map streaming]]></category>
		<category><![CDATA[tech blog]]></category>
		<guid isPermaLink="false">https://www.kudan.io/?p=1571</guid>

					<description><![CDATA[<p>In this post, let us share the “map streaming from the cloud” feature in Kudan 3D-Lidar SLAM. This feature is particularly powerful for autonomous driving and autonomous mobile robot applications. Users can now combine this feature with cloud infrastructure: a single large map is stored in the cloud, and only the necessary part of the map is streamed down to [&#8230;]</p>
<p>The post <a href="https://www.kudan.io/blog/kdlidar-in-action-map-streaming-from-the-cloud/">Kudan 3D-Lidar SLAM (KdLidar) in Action: Map Streaming from the Cloud</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>In this post, let us share the “map streaming from the cloud” feature in Kudan 3D-Lidar SLAM. This feature is particularly powerful for autonomous driving and autonomous mobile robot applications.<br />
Users can now combine this feature with cloud infrastructure: a single large map is stored in the cloud, and only the necessary part of the map is streamed down to multiple vehicles and robots. The major benefits are:</p>
<ul>
<li>Users can update the single map, and vehicles/robots can easily retrieve the up-to-date map (very efficient map updates)</li>
<li>No need to store a large map on each vehicle, which significantly reduces the required on-board disk and memory size</li>
</ul>
<p><img loading="lazy" class="aligncenter wp-image-1572 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/11/Pic1_blog.png?resize=939%2C635&#038;ssl=1" alt="" width="939" height="635" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/11/Pic1_blog.png?w=939&amp;ssl=1 939w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/11/Pic1_blog.png?resize=300%2C203&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/11/Pic1_blog.png?resize=768%2C519&amp;ssl=1 768w" sizes="(max-width: 939px) 100vw, 939px" data-recalc-dims="1" /></p>
<p>Here is the demo video &#8211;<strong> Kudan 3D-Lidar SLAM in Action: Map Streaming from the Cloud</strong></p>
<p><iframe loading="lazy" title="Kudan 3D-Lidar SLAM (KdLidar) in Action: Map Streaming from the Cloud" width="500" height="281" src="https://www.youtube.com/embed/Y8ov4yHvL4Q?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
<p>The demo video uses the KITTI dataset and shows two different vehicles continuously localizing on a map streamed down from the cloud. Each vehicle loads only a portion of the original map based on its location to minimize memory usage. Because our map is compressed to be several hundred times smaller than the original raw data, an ordinary 3G/4G network is sufficient for stable map loading.</p>
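<p>As a conceptual illustration of how location-based streaming can work, the sketch below tiles a global map and keeps only the tiles near the vehicle in memory. This is not the KdLidar implementation or API; the tile size, radius, and class names here are hypothetical.</p>

```python
import math

TILE_SIZE = 50.0  # meters per square map tile (hypothetical value)

def tiles_needed(x, y, radius=100.0):
    """Tile indices within `radius` meters of the vehicle at (x, y):
    the only portion of the global map that must be streamed down."""
    r = int(math.ceil(radius / TILE_SIZE))
    cx, cy = int(x // TILE_SIZE), int(y // TILE_SIZE)
    return {(cx + i, cy + j)
            for i in range(-r, r + 1)
            for j in range(-r, r + 1)
            if math.hypot(i, j) * TILE_SIZE <= radius + TILE_SIZE}

class MapStreamer:
    """Keeps nearby tiles cached, fetching new ones from the cloud
    store and evicting distant ones as the vehicle moves."""

    def __init__(self, fetch_tile):
        self.fetch_tile = fetch_tile  # callable: (ix, iy) -> tile data
        self.cache = {}

    def update(self, x, y):
        needed = tiles_needed(x, y)
        for idx in needed - self.cache.keys():
            self.cache[idx] = self.fetch_tile(idx)  # stream down
        for idx in set(self.cache) - needed:
            del self.cache[idx]  # free on-board memory
        return self.cache
```

<p>With a scheme like this, each vehicle&#8217;s memory footprint is bounded by the streaming radius, no matter how large the global map grows.</p>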
<p>Please stay tuned for further updates!</p>
<p>■For more details, please contact us <a href="https://www.kudan.io/contact" target="_blank" rel="noopener noreferrer">here</a>.</p><p>The post <a href="https://www.kudan.io/blog/kdlidar-in-action-map-streaming-from-the-cloud/">Kudan 3D-Lidar SLAM (KdLidar) in Action: Map Streaming from the Cloud</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1571</post-id>	</item>
		<item>
		<title>Kudan 3D-Lidar SLAM (KdLidar) in Action: Vehicle-Based Mapping in an Urban area</title>
		<link>https://www.kudan.io/blog/kdlidar-in-action-vehicle-based-mapping-in-an-urban-area/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=kdlidar-in-action-vehicle-based-mapping-in-an-urban-area</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Fri, 28 Oct 2022 09:00:58 +0000</pubDate>
				<category><![CDATA[Tech Blog]]></category>
		<category><![CDATA[KdLidar]]></category>
		<category><![CDATA[Kudan]]></category>
		<category><![CDATA[Kudan 3D-Lidar SLAM]]></category>
		<category><![CDATA[Lidar]]></category>
		<category><![CDATA[mapping]]></category>
		<category><![CDATA[Simultaneous Localization and Mapping]]></category>
		<category><![CDATA[SLAM]]></category>
		<category><![CDATA[tech blog]]></category>
		<guid isPermaLink="false">https://www.kudan.io/?p=1514</guid>

					<description><![CDATA[<p>It’s been a while since our last “Kudan 3D-Lidar SLAM (KdLidar) in action” series. Of course, we have been working on fascinating projects and powerful features (and just came back from InterGeo in Germany!) This time, our demo showcases the result in one of the most typical environments and setups. Recorded in an urban area [&#8230;]</p>
<p>The post <a href="https://www.kudan.io/blog/kdlidar-in-action-vehicle-based-mapping-in-an-urban-area/">Kudan 3D-Lidar SLAM (KdLidar) in Action: Vehicle-Based Mapping in an Urban area</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>It’s been a while since our last “Kudan 3D-Lidar SLAM (KdLidar) in action” series. Of course, we have been working on fascinating projects and powerful features (and we just came back from InterGeo in Germany!).</p>
<p><img loading="lazy" class="aligncenter wp-image-1515 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/10/Blog-pic.png?resize=939%2C325&#038;ssl=1" alt="" width="939" height="325" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/10/Blog-pic.png?w=939&amp;ssl=1 939w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/10/Blog-pic.png?resize=300%2C104&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/10/Blog-pic.png?resize=768%2C266&amp;ssl=1 768w" sizes="(max-width: 939px) 100vw, 939px" data-recalc-dims="1" /></p>
<p>This time, our demo showcases the result in one of the most typical environments and setups.</p>
<ul>
<li>Recorded in an urban area (the station square of a large train station in Japan) with buildings and roofed areas around, which blocks good GNSS signals</li>
<li>Recorded with a vehicle-based sensor setup: a tilted 3D-Lidar with an INS</li>
</ul>
<p>Here is the demo video &#8211; <strong>Kudan 3D-Lidar SLAM in Action: Vehicle Mobile Mapping in an Urban Area</strong></p>
<p><iframe loading="lazy" title="Kudan 3D-Lidar SLAM in Action: Vehicle Mobile Mapping in an Urban Area" width="500" height="281" src="https://www.youtube.com/embed/nuBxLaVuaKA?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
<p>The video is quite straightforward and largely explains itself. The main benefits you get from KdLidar are:</p>
<ol>
<li><strong>Up to &lt;1cm accuracy in various environments</strong>: Tested and proven with multiple geospatial partners to perform with up to &lt;1cm accuracy both indoors and outdoors</li>
<li><strong>Flexible sensor choices</strong>: Various sensors (3D-Lidar, IMU, and INS) tested and supported</li>
<li><strong>Faster time-to-market</strong>: Much faster time-to-market than internal development or other SLAM software (we have a team of 30 SLAM engineers)</li>
</ol>
<p>Please stay tuned for further updates!</p>
<p>■For more details, please contact us <a href="https://www.kudan.io/contact" target="_blank" rel="noopener noreferrer">here</a>.</p><p>The post <a href="https://www.kudan.io/blog/kdlidar-in-action-vehicle-based-mapping-in-an-urban-area/">Kudan 3D-Lidar SLAM (KdLidar) in Action: Vehicle-Based Mapping in an Urban area</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1514</post-id>	</item>
		<item>
		<title>Kudan to exhibit at INTERGEO 2022</title>
		<link>https://www.kudan.io/blog/kudan-to-exhibit-at-intergeo-2022/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=kudan-to-exhibit-at-intergeo-2022</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Fri, 07 Oct 2022 09:00:45 +0000</pubDate>
				<category><![CDATA[Press Release]]></category>
		<category><![CDATA[3D-Lidar SLAM]]></category>
		<category><![CDATA[exhibition]]></category>
		<category><![CDATA[INTERGEO 2022]]></category>
		<category><![CDATA[KdLidar]]></category>
		<category><![CDATA[Kudan]]></category>
		<category><![CDATA[Kudan 3D-Lidar SLAM]]></category>
		<category><![CDATA[Simultaneous Localization and Mapping]]></category>
		<category><![CDATA[SLAM]]></category>
		<guid isPermaLink="false">https://www.kudan.io/?p=1477</guid>

					<description><![CDATA[<p>Kudan Inc. (headquartered in Shibuya-ku, Tokyo; CEO Daiu Ko, hereafter “Kudan”), a leading provider of Artificial Perception / SLAM technology across a variety of applications, is excited to announce that Kudan will be showcasing its Simultaneous Localization and Mapping (SLAM) solutions during the INTERGEO conference that will take place 18-20 October at the Trade Fair [&#8230;]</p>
<p>The post <a href="https://www.kudan.io/blog/kudan-to-exhibit-at-intergeo-2022/">Kudan to exhibit at INTERGEO 2022</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></description>
										<content:encoded><![CDATA[<p><img loading="lazy" class="aligncenter wp-image-1478 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/10/pic1-1.png?resize=872%2C493&#038;ssl=1" alt="" width="872" height="493" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/10/pic1-1.png?w=872&amp;ssl=1 872w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/10/pic1-1.png?resize=300%2C170&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/10/pic1-1.png?resize=768%2C434&amp;ssl=1 768w" sizes="(max-width: 872px) 100vw, 872px" data-recalc-dims="1" /></p>
<p>Kudan Inc. (headquartered in Shibuya-ku, Tokyo; CEO Daiu Ko, hereafter “Kudan”), a leading provider of Artificial Perception / SLAM technology across a variety of applications, is excited to announce that Kudan will be showcasing its Simultaneous Localization and Mapping (SLAM) solutions during the INTERGEO conference that will take place 18-20 October at the Trade Fair Centre Messe in Essen, Germany.</p>
<p>INTERGEO’s 2022 focus will be on showcasing developments in digital twins: 3D city models, BIM in surveying, and digital urban planning. In line with this focus, Kudan will be showcasing its 3D-Lidar SLAM, ‘KdLidar’, a solution which can be used for inspection, surveying, progress monitoring and more.</p>
<p>Visitors at the Kudan booth will be able to view the different vehicle-based and handheld mapping kits running on our KdLidar software. These mapping kits have been built to provide customers with a solution that is easy to use and cost-effective while delivering high performance.</p>
<p>Conference participants will be able to experience a real-time demonstration of our KdLidar mapping kits and their capabilities. We look forward to meeting you there!</p>
<p><span style="text-decoration: underline;"><strong>Event details</strong></span></p>
<ul>
<li>Event: INTERGEO</li>
<li>Date: Tuesday, 18 October to Thursday, 20 October, 2022</li>
<li>Location: Trade Fair Center Messe Essen, Germany</li>
<li>Booth: Hall 2, C2.002</li>
</ul>
<p><strong>About Kudan Inc.</strong><br />
Kudan (Tokyo Stock Exchange securities code: 4425) is a deep tech research and development company specializing in algorithms for artificial perception (AP). As a complement to artificial intelligence (AI), AP functions allow machines to develop autonomy. Currently, Kudan is using its high-level technical innovation to explore business areas based on its own milestone models established for deep tech which provide wide-ranging impact on several major industrial fields.<br />
For more information, please refer to Kudan’s website at <a href="https://www.kudan.io/" target="_blank" rel="noopener noreferrer">https://www.kudan.io/</a>.</p>
<p>■Company Details<br />
Name: Kudan Inc.<br />
Securities Code: 4425<br />
Representative: CEO Daiu Ko</p>
<p>■For more details, please contact us <a href="https://www.kudan.io/contact" target="_blank" rel="noopener noreferrer">here</a>.</p><p>The post <a href="https://www.kudan.io/blog/kudan-to-exhibit-at-intergeo-2022/">Kudan to exhibit at INTERGEO 2022</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1477</post-id>	</item>
		<item>
		<title>How to Set up Sensors In a 3D-Lidar SLAM-Friendly Way</title>
		<link>https://www.kudan.io/blog/how-to-set-up-sensors-for-3d-lidar-slam/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=how-to-set-up-sensors-for-3d-lidar-slam</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Wed, 24 Aug 2022 08:30:55 +0000</pubDate>
				<category><![CDATA[Tech Blog]]></category>
		<category><![CDATA[3D-Lidar SLAM]]></category>
		<category><![CDATA[accuracy]]></category>
		<category><![CDATA[KdLidar]]></category>
		<category><![CDATA[Kudan]]></category>
		<category><![CDATA[Kudan 3D-Lidar SLAM]]></category>
		<category><![CDATA[Kudan SLAM]]></category>
		<category><![CDATA[performance]]></category>
		<category><![CDATA[Simultaneous Localization and Mapping]]></category>
		<category><![CDATA[SLAM]]></category>
		<guid isPermaLink="false">https://www.kudan.io/?p=1384</guid>

					<description><![CDATA[<p>Optimizing the SLAM system for the best performance isn&#8217;t easy. It requires an in-depth understanding of the algorithm, selecting the well-suited sensors based on the intended use case, tuning the SLAM parameters for optimal performance, and so on. We have written in-depth tutorials and articles on the topics above for 3D-Lidar SLAM, and we&#8217;ll leave [&#8230;]</p>
<p>The post <a href="https://www.kudan.io/blog/how-to-set-up-sensors-for-3d-lidar-slam/">How to Set up Sensors In a 3D-Lidar SLAM-Friendly Way</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></description>
										<content:encoded><![CDATA[<p><img loading="lazy" class="aligncenter wp-image-1385 size-large" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/kudan-slam-friendly-1024x683.png?resize=1024%2C683&#038;ssl=1" alt="" width="1024" height="683" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/kudan-slam-friendly.png?resize=1024%2C683&amp;ssl=1 1024w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/kudan-slam-friendly.png?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/kudan-slam-friendly.png?resize=768%2C512&amp;ssl=1 768w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/kudan-slam-friendly.png?w=1500&amp;ssl=1 1500w" sizes="(max-width: 1000px) 100vw, 1000px" data-recalc-dims="1" /></p>
<p>Optimizing a SLAM system for the best performance isn&#8217;t easy. It requires an in-depth understanding of the algorithm, selecting well-suited sensors for the intended use case, tuning the SLAM parameters for optimal performance, and so on.</p>
<p>We have written in-depth tutorials and articles on the topics above for 3D-Lidar SLAM, and we&#8217;ll leave the links to each at the end of this article if you&#8217;re interested in knowing more. However, even if you do all of these steps correctly, an improperly arranged sensor setup will likely create unnecessary challenges for SLAM and degrade its performance.</p>
<p>In this article, we want to shed some light on common mistakes in SLAM sensor setups, what&#8217;s wrong with them, and how to fix them. Please pay close attention, because we&#8217;ll tell you everything you need to maximize 3D-Lidar SLAM performance.</p>
<hr />
<h2><strong>Lidar tilting can be challenging to the 3D-Lidar SLAM accuracy</strong></h2>
<p>Generally, the more you tilt the lidar, the more challenging it gets to run the SLAM system accurately (especially in outdoor scenarios).</p>
<p>Some applications do require you to tilt the lidar:</p>
<ul>
<li>Road maintenance mapping applications benefit from lidar tilting, as it helps to obtain more points from the road surface, as shown in figure 2.</li>
<li>Indoor applications where the ceiling also needs to be detected benefit from a 90-degree tilt.</li>
</ul>
<p>However, loop closure becomes more difficult in these scenarios, and the overall robustness deteriorates.</p>
<div id="attachment_1387" style="width: 1034px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1387" loading="lazy" class="wp-image-1387 size-large" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Screen-Shot-2022-08-03-at-23.31.23-1024x299.png?resize=1024%2C299&#038;ssl=1" alt="" width="1024" height="299" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Screen-Shot-2022-08-03-at-23.31.23.png?resize=1024%2C299&amp;ssl=1 1024w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Screen-Shot-2022-08-03-at-23.31.23.png?resize=300%2C88&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Screen-Shot-2022-08-03-at-23.31.23.png?resize=768%2C224&amp;ssl=1 768w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Screen-Shot-2022-08-03-at-23.31.23.png?w=1500&amp;ssl=1 1500w" sizes="(max-width: 1000px) 100vw, 1000px" data-recalc-dims="1" /><p id="caption-attachment-1387" class="wp-caption-text"><em>Figure 2: Illustration of horizontal and tilted lidar</em></p></div>
<p>The critical point is understanding the trade-off between lidar tilting and SLAM performance. We also have a demo video showing our 3D-Lidar SLAM working with a 45-degree tilted lidar (link at the end). As shown in figure 3, there is a wide difference in how well the lidar can capture surrounding objects and structures between these different mountings, which ultimately affects 3D-Lidar SLAM performance.</p>
<div id="attachment_1386" style="width: 1034px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1386" loading="lazy" class="wp-image-1386 size-large" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Lidar-view-1024x292.png?resize=1024%2C292&#038;ssl=1" alt="" width="1024" height="292" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Lidar-view.png?resize=1024%2C292&amp;ssl=1 1024w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Lidar-view.png?resize=300%2C86&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Lidar-view.png?resize=768%2C219&amp;ssl=1 768w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Lidar-view.png?resize=1536%2C439&amp;ssl=1 1536w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/Lidar-view.png?w=1845&amp;ssl=1 1845w" sizes="(max-width: 1000px) 100vw, 1000px" data-recalc-dims="1" /><p id="caption-attachment-1386" class="wp-caption-text"><em>Figure 3: Comparison of output between horizontal, tilted, and vertical lidar</em></p></div>
<p>It&#8217;s best to understand the specific use case, analyze the trade-off and pick what works best for you.</p>
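<p>To make the trade-off concrete, here is a small geometric sketch (the beam angles, mounting height, and tilt are hypothetical, not tied to any specific lidar) of how tilting changes the number of beams that reach the road surface:</p>

```python
import math

def ground_hit_distance(height_m, beam_elev_deg, tilt_deg):
    """Distance ahead at which one beam hits flat ground, for a lidar
    mounted at `height_m` and pitched forward-down by `tilt_deg`.
    Beams at or above the horizon never hit the ground (None)."""
    angle = math.radians(beam_elev_deg - tilt_deg)  # effective elevation
    if angle >= 0:
        return None
    return height_m / math.tan(-angle)

# A 16-beam lidar with a +/-15 degree vertical FOV at 2 m height:
beams = [-15 + 2 * i for i in range(16)]
flat = sum(ground_hit_distance(2.0, b, 0) is not None for b in beams)
tilted = sum(ground_hit_distance(2.0, b, 45) is not None for b in beams)
# Horizontal mounting: only the downward half of the beams reach the
# road; with a 45-degree tilt, every beam does, at shorter ranges.
```

<p>The same calculation also hints at the cost: the tilted beams land on a much narrower forward strip, so fewer points fall on the distant structures that loop closure relies on.</p>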
<h2><strong>What you need to know about the relative positioning of the IMU and the lidar</strong></h2>
<p>Multi-sensor fusion is a versatile and valuable approach for SLAM, since it helps achieve high precision [1]. However, when you fuse additional sensors with the 3D-Lidar, the extrinsic calibration between the sensors must be accurate.</p>
<p>Sensors mounted at different positions have a translational and rotational relationship to each other (the extrinsics). The rule of thumb is to keep the sensors, especially the lidar and the IMU, as close together as possible: accurate extrinsics are easier to measure, which reduces the risk of significant extrinsic errors.</p>
<p>For example, if the IMU is a few centimeters from the lidar, you&#8217;ll have an error on the order of millimeters. If the IMU is several meters from the lidar, you&#8217;ll likely have an error on the order of centimeters. Ideally, you&#8217;d want to keep your IMU or INS just under your 3D-Lidar.</p>
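<p>The intuition behind these numbers can be sketched in a couple of lines: a rotational calibration error sweeps the IMU-to-lidar lever arm through an arc, so the induced position error grows linearly with the distance between the sensors. (The 0.5-degree error below is just an illustrative value.)</p>

```python
import math

def extrinsic_position_error(lever_arm_m, angle_error_deg):
    """Worst-case position error (meters) induced by a rotational
    calibration error acting over the IMU-to-lidar lever arm:
    the chord length 2 * d * sin(theta / 2)."""
    theta = math.radians(angle_error_deg)
    return 2.0 * lever_arm_m * math.sin(theta / 2.0)

# 0.5 degree rotation error, IMU 5 cm from the lidar: sub-millimeter.
close = extrinsic_position_error(0.05, 0.5)
# The same angular error with a 3 m lever arm: centimeter scale.
far = extrinsic_position_error(3.0, 0.5)
```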
<h2><strong>The relative position of the lidar and the GNSS antenna can negatively affect performance</strong></h2>
<p>If the GNSS antenna is in the lidar&#8217;s field of view, it partially blocks the lidar from capturing the surroundings. Similarly, the lidar can block the GNSS antenna&#8217;s view of the sky when it sits above the antenna&#8217;s sky dome, making it harder for the receiver to see enough satellites to get reliable signals. Hence, to minimize interference, keeping these two pieces of equipment close together is not recommended.</p>
<p>On the other hand, as we saw earlier with the IMU and the lidar, achieving an excellent extrinsic calibration is easier when the sensors are kept close. The challenge is maintaining a good balance such that interference is minimized while the extrinsic calibration error remains acceptable.</p>
<h2><strong>Time synchronization between multiple sensor data</strong></h2>
<p>We&#8217;ve often seen a lack of time synchronization as the source of deteriorated SLAM performance.</p>
<p>The data from multiple sensors should carry timestamps on the same clock, i.e., t=0 in each sensor&#8217;s data should refer to precisely the same instant. Please pay extra attention to this factor in your setup, as it may lead to a non-functional SLAM system if not addressed correctly.</p>
<p>To demonstrate the effect further, let’s look at an internal dataset we have at Kudan. We detected a 3.5s time difference between the lidar and the GNSS data, indicating the two sensors were out of sync. Figure 4 shows the point cloud you’d get when such instances occur. The green line is the GNSS trajectory, the purple is lidar-only SLAM, and the red lines indicate the differences. Double edges and blurry lines can easily be noticed in the point cloud.</p>
<div id="attachment_1388" style="width: 893px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1388" loading="lazy" class="wp-image-1388 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/pic1.png?resize=883%2C502&#038;ssl=1" alt="" width="883" height="502" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/pic1.png?w=883&amp;ssl=1 883w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/pic1.png?resize=300%2C171&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/08/pic1.png?resize=768%2C437&amp;ssl=1 768w" sizes="(max-width: 883px) 100vw, 883px" data-recalc-dims="1" /><p id="caption-attachment-1388" class="wp-caption-text"><em>Figure 4: Example of point cloud obtained with sensors data when unsynchronized </em></p></div>
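<p>How can such an offset be detected in the first place? One common sanity check, sketched below with synthetic data, is to derive a motion signal independently from each sensor (for example, speed from GNSS positions versus speed from lidar odometry), resample both onto a uniform timeline, and find the cross-correlation peak. The signals and the 3.5s shift below are simulated, not the actual Kudan dataset.</p>

```python
import numpy as np

def estimate_time_offset(sig_a, sig_b, dt):
    """Estimate by how many seconds sig_b is delayed relative to
    sig_a, via the peak of the normalized cross-correlation of two
    uniformly sampled motion signals (e.g., speed profiles)."""
    a = (sig_a - sig_a.mean()) / (sig_a.std() + 1e-12)
    b = (sig_b - sig_b.mean()) / (sig_b.std() + 1e-12)
    corr = np.correlate(a, b, mode="full")
    # index k in `corr` corresponds to shifting `a` by k - (len(b) - 1)
    lag = np.argmax(corr) - (len(b) - 1)
    return -lag * dt

# Synthetic speed profiles sampled at 10 Hz with a 3.5 s offset:
dt = 0.1
t = np.arange(0.0, 60.0, dt)
speed_gnss = np.sin(0.3 * t) + 0.5 * np.sin(1.1 * t)
speed_lidar = np.roll(speed_gnss, round(3.5 / dt))  # delayed copy
offset = estimate_time_offset(speed_gnss, speed_lidar, dt)
# `offset` recovers roughly the simulated 3.5 s delay.
```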
<h2><strong>Per-point timestamps: things you need to understand</strong></h2>
<p>When a 3D-Lidar emits lasers, each point is emitted at a slightly different time, even if it&#8217;s in the same scan cycle.</p>
<p>Let&#8217;s take an example to make this concrete. In a 360-degree scanning lidar running in 10Hz mode, a single scan takes 100ms. This means there is approximately a 100ms time difference between the first and the last point of the scan. Assuming that these two timestamps are the same is incorrect and is a known source of error.</p>
<p>Under dynamic movement, the sensor motion within a lidar frame is generally non-linear. In these scenarios, modeling individual point timestamps can be challenging without measuring the movement itself; IMUs are extremely useful for this purpose [2].</p>
<p>The overarching idea is that when you set up sensors and collect data, it&#8217;s vital to ensure that a timestamp is obtained for each lidar point.</p>
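<p>For a spinning lidar whose driver does not already provide per-point times, a reasonable first-order fix is to interpolate each point&#8217;s timestamp from its azimuth within the scan. The function below is an illustrative sketch assuming a uniform sweep starting at azimuth 0, which real devices may not exactly follow.</p>

```python
import numpy as np

def per_point_timestamps(scan_start_t, azimuths_deg, scan_period=0.1):
    """Approximate a timestamp for every point of one 360-degree scan
    by linear interpolation over azimuth, assuming the scanner sweeps
    uniformly over `scan_period` (0.1 s for a 10 Hz lidar)."""
    az = np.asarray(azimuths_deg) % 360.0
    return scan_start_t + (az / 360.0) * scan_period

# Points at the start, middle, and end of a scan beginning at t=100 s:
ts = per_point_timestamps(100.0, [0.0, 180.0, 359.9])
# ts spans roughly 100 ms, matching the 10 Hz scan period.
```

<p>Fusing an IMU on top of this, as noted above, then lets the motion between those per-point times be modeled non-linearly rather than assumed constant.</p>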
<hr />
<h2><strong>Final words</strong></h2>
<p>At Kudan, through our blog, we&#8217;ve aimed to distill information on all things related to SLAM and present it as simply as possible.</p>
<p>Here are some of our previous articles on 3D Lidar SLAM:</p>
<ul>
<li style="font-weight: 400;" aria-level="1"><a href="https://www.kudan.io/blog/3d-lidar-slam-the-basics/" target="_blank" rel="noopener">3D Lidar SLAM: The Basics</a></li>
<li style="font-weight: 400;" aria-level="1"><a href="https://www.kudan.io/blog/how-to-select-the-best-3d-lidar-for-slam/" target="_blank" rel="noopener">How to Select the Best 3D Lidar for SLAM</a></li>
<li style="font-weight: 400;" aria-level="1"><a href="https://www.kudan.io/blog/how-to-tune-3d-lidar-slam-parameters/" target="_blank" rel="noopener">How to Tune 3D-Lidar SLAM Parameters</a></li>
</ul>
<p>In today&#8217;s article, we went a step further and presented a few tips on how you can set up your sensors in a 3D-Lidar SLAM-friendly way, along with the trade-offs involved in setting them up otherwise.</p>
<p>We hope these tips help you avoid common blunders and squeeze the best performance out of your SLAM system. If you&#8217;re interested in knowing more or have questions specific to your use case, don&#8217;t hesitate to <a href="https://www.kudan.io/contact/">say hi to us</a>; your questions will be ours to answer.</p>
<hr />
<h2><strong>References</strong></h2>
<p>[1] Du, T., Zeng, Y. H., Yang, J., Tian, C. Z. and Bai, P. F. (2020), Multi-sensor fusion SLAM approach for the mobile robot with a bio-inspired polarised skylight sensor. IET Radar Sonar Navig., 14: 1950–1957. [<a href="https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/iet-rsn.2020.0260">PDF</a>]</p>
<p>[2] Karam, S., Lehtola, V. and Vosselman, G. (2020), Strategies to integrate IMU and lidar SLAM for indoor mapping. ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences, V-1-2020: 223–230. [<a href="https://www.researchgate.net/publication/343402716_STRATEGIES_TO_INTEGRATE_IMU_AND_LIDAR_SLAM_FOR_INDOOR_MAPPING/fulltext/5f284683a6fdcccc43a87aac/STRATEGIES-TO-INTEGRATE-IMU-AND-LIDAR-SLAM-FOR-INDOOR-MAPPING.pdf">PDF</a>]</p><p>The post <a href="https://www.kudan.io/blog/how-to-set-up-sensors-for-3d-lidar-slam/">How to Set up Sensors In a 3D-Lidar SLAM-Friendly Way</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1384</post-id>	</item>
		<item>
		<title>Kudan releases its latest update on 3D-Lidar SLAM powered by the NVIDIA Jetson platform with up to 120% acceleration result</title>
		<link>https://www.kudan.io/blog/kudan-3d-lidar-slam-accelerated-on-nvidia-jetson-platform/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=kudan-3d-lidar-slam-accelerated-on-nvidia-jetson-platform</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Thu, 28 Jul 2022 06:00:40 +0000</pubDate>
				<category><![CDATA[Press Release]]></category>
		<category><![CDATA[3D-Lidar SLAM]]></category>
		<category><![CDATA[acceleration]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[Kudan]]></category>
		<category><![CDATA[Kudan 3D-Lidar SLAM]]></category>
		<category><![CDATA[NVIDIA]]></category>
		<category><![CDATA[NVIDIA Jetson platform]]></category>
		<category><![CDATA[NVIDIA Partner Network]]></category>
		<guid isPermaLink="false">https://www.kudan.io/?p=1345</guid>

					<description><![CDATA[<p>Kudan Inc. (headquartered in Shibuya-ku, Tokyo; CEO Daiu Ko, hereafter “Kudan”) is pleased to announce that Kudan is releasing a new version of 3D-Lidar SLAM that is optimized for the NVIDIA Jetson edge AI and robotics platform. Kudan is a Preferred Partner in the NVIDIA Partner Network, working on autonomous mobile robot (AMR) applications with [&#8230;]</p>
<p>The post <a href="https://www.kudan.io/blog/kudan-3d-lidar-slam-accelerated-on-nvidia-jetson-platform/">Kudan releases its latest update on 3D-Lidar SLAM powered by the NVIDIA Jetson platform with up to 120% acceleration result</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></description>
										<content:encoded><![CDATA[<p><img loading="lazy" class="aligncenter wp-image-1346 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/NVIDIA_pic1.png?resize=885%2C335&#038;ssl=1" alt="" width="885" height="335" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/NVIDIA_pic1.png?w=885&amp;ssl=1 885w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/NVIDIA_pic1.png?resize=300%2C114&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/NVIDIA_pic1.png?resize=768%2C291&amp;ssl=1 768w" sizes="(max-width: 885px) 100vw, 885px" data-recalc-dims="1" /></p>
<p>Kudan Inc. (headquartered in Shibuya-ku, Tokyo; CEO Daiu Ko, hereafter “Kudan”) is pleased to announce that Kudan is releasing a new version of 3D-Lidar SLAM that is optimized for the NVIDIA Jetson edge AI and robotics platform.</p>
<p>Kudan is a Preferred Partner in the NVIDIA Partner Network, working on autonomous mobile robot (AMR) applications with both Visual SLAM and 3D-Lidar SLAM. While a number of companies are developing their autonomous mobile machines using Kudan’s SLAM software and the NVIDIA Jetson platform, this update is designed to help customers take full advantage of the Jetson platform’s hardware capabilities, and Kudan’s advanced lidar SLAM software.</p>
<p>Kudan’s 3D-Lidar SLAM accelerated on NVIDIA Jetson brings significant value to AMR applications by freeing up processing capacity for other resource-intensive tasks and increasing overall stability. Thanks to the continuing 3D-Lidar price reductions and overall performance improvements, more and more robotics OEMs are evaluating 3D-Lidar SLAM for their localization requirements. However, given that some of the sensors generate more than 1 million points per second, processing SLAM with this amount of data in real-time can be a challenge. For academic and research purposes, it is possible to use a high-power computer to process 3D-Lidar SLAM but the cost can be prohibitive for large-scale production and adoption.</p>
<p>Kudan leveraged NVIDIA’s math functions optimized for its CUDA architecture in the 3D-Lidar SLAM’s pose estimation module to process data more efficiently on the GPU. This not only accelerates the overall processing time but also offloads a considerable portion of the processing from the CPU. Kudan tested the performance on the NVIDIA Jetson AGX Xavier module using the KITTI datasets, one of the most well-known open datasets for testing 3D-Lidar SLAM, and observed a 120% speed-up compared to using only CPUs, and an 80% speed-up on a desktop system.</p>
<p><img loading="lazy" class="aligncenter wp-image-1347 size-full" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/NVIDIA_pic2.png?resize=885%2C318&#038;ssl=1" alt="" width="885" height="318" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/NVIDIA_pic2.png?w=885&amp;ssl=1 885w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/NVIDIA_pic2.png?resize=300%2C108&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/NVIDIA_pic2.png?resize=768%2C276&amp;ssl=1 768w" sizes="(max-width: 885px) 100vw, 885px" data-recalc-dims="1" /></p>
<p>Kudan sees more acceleration opportunities for 3D-Lidar SLAM and also plans to implement similar acceleration for Visual SLAM on the NVIDIA CUDA architecture. This series of efforts will benefit many robotics OEMs and users. In addition, digital twins and accurate 3D point clouds for geospatial and general mapping applications are in increasingly high demand, an area where NVIDIA GPU-accelerated computing and Kudan 3D-Lidar SLAM shine together through NVIDIA’s processing capability and Kudan’s SLAM performance.</p>
<p>Please access <a href="https://www.kudan.io/blog/kudan-joins-nvidia-partner-network-to-expand-slam-opportunities/" target="_blank" rel="noopener">this link</a> for our previous release about Kudan’s relationship with NVIDIA.</p>
<p><strong>About Kudan Inc.</strong><br />
Kudan (Tokyo Stock Exchange securities code: 4425) is a deep tech research and development company specializing in algorithms for artificial perception (AP). As a complement to artificial intelligence (AI), AP functions allow machines to develop autonomy. Currently, Kudan is using its high-level technical innovation to explore business areas based on its own milestone models established for deep tech which provide wide-ranging impact on several major industrial fields.<br />
For more information, please refer to Kudan’s website at <a href="https://www.kudan.io/" target="_blank" rel="noopener noreferrer">https://www.kudan.io/</a>.</p>
<p>■Company Details<br />
Name: Kudan Inc.<br />
Securities Code: 4425<br />
Representative: CEO Daiu Ko</p>
<p>■For more details, please contact us <a href="https://www.kudan.io/contact" target="_blank" rel="noopener noreferrer">here</a>.</p><p>The post <a href="https://www.kudan.io/blog/kudan-3d-lidar-slam-accelerated-on-nvidia-jetson-platform/">Kudan releases its latest update on 3D-Lidar SLAM powered by the NVIDIA Jetson platform with up to 120% acceleration result</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1345</post-id>	</item>
		<item>
		<title>How to Tune 3D-Lidar SLAM Parameters</title>
		<link>https://www.kudan.io/blog/how-to-tune-3d-lidar-slam-parameters/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=how-to-tune-3d-lidar-slam-parameters</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Wed, 13 Jul 2022 07:00:16 +0000</pubDate>
				<category><![CDATA[Tech Blog]]></category>
		<category><![CDATA[3D-Lidar SLAM]]></category>
		<category><![CDATA[Kudan]]></category>
		<category><![CDATA[Kudan 3D-Lidar SLAM]]></category>
		<category><![CDATA[KudanSLAM]]></category>
		<category><![CDATA[mapping]]></category>
		<category><![CDATA[robotics]]></category>
		<category><![CDATA[Simultaneous Localization and Mapping]]></category>
		<category><![CDATA[voxel]]></category>
		<guid isPermaLink="false">https://www.kudan.io/?p=1330</guid>

					<description><![CDATA[<p>Do you want to implement 3D-Lidar SLAM for your use case successfully? Have you heard of the 3D-Lidar SLAM but are unsure how to maximize its performance? Do you find tweaking the 3D-Lidar SLAM parameters complex and confusing? If you answered yes to any of these questions, you would find a lot of value in [&#8230;]</p>
<p>The post <a href="https://www.kudan.io/blog/how-to-tune-3d-lidar-slam-parameters/">How to Tune 3D-Lidar SLAM Parameters</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></description>
										<content:encoded><![CDATA[<p><img loading="lazy" class="aligncenter wp-image-1334 size-large" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Picture1-1024x473.png?resize=1024%2C473&#038;ssl=1" alt="" width="1024" height="473" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Picture1.png?resize=1024%2C473&amp;ssl=1 1024w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Picture1.png?resize=300%2C139&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Picture1.png?resize=768%2C355&amp;ssl=1 768w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Picture1.png?resize=1536%2C710&amp;ssl=1 1536w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Picture1.png?w=1595&amp;ssl=1 1595w" sizes="(max-width: 1000px) 100vw, 1000px" data-recalc-dims="1" /></p>
<p>Do you want to implement 3D-Lidar SLAM for your use case successfully?</p>
<p>Have you heard of 3D-Lidar SLAM but are unsure how to maximize its performance?</p>
<p>Do you find tweaking the 3D-Lidar SLAM parameters complex and confusing?</p>
<p>If you answered yes to any of these questions, you’ll find a lot of value in this article. Once you’ve decided to use 3D-Lidar SLAM for your business use case, you will need to tick off these three checkboxes:</p>
<ol>
<li>Pick an appropriate 3D-Lidar sensor unit</li>
<li>Decide on a proper 3D-Lidar SLAM approach</li>
<li>Choose the relevant parameters for the SLAM system</li>
</ol>
<p>If you are unsure about the first two checkboxes, we’ve already written about 3D-Lidar SLAM approaches and how to select the appropriate sensor for your use case. We’ll leave the links at the end of this article so you can delve deeper if you haven’t already.</p>
<p>In this article, we’ll share descriptions, typical values, and high-level guidelines for the four main parameters commonly included in 3D-Lidar SLAM approaches. We’ll also discuss how these parameters may impact the overall performance of the system.</p>
<hr />
<h2>Voxel sizes and why they matter</h2>
<p><strong>Smaller voxel size improves robustness and accuracy but has a price to pay</strong></p>
<p>For a SLAM system, we treat the 3D space as a group of small 3D spaces known as <a href="https://en.wikipedia.org/wiki/Voxel" target="_blank" rel="noopener">voxels</a>. <strong>Voxel size</strong> indicates how granularly or coarsely you want to dissect the 3D space. In other words, what dimensions should a voxel have?</p>
<p>In theory, a smaller voxel size means improved robustness and accuracy, although the benefit saturates at some point. This is because a small voxel size dissects the 3D space into many more small regions, allowing the SLAM system to extract more points for tracking and mapping. Having more points to track against when moving results in greater robustness and accuracy.</p>
<p>However, when the system has more points to process for tracking and mapping, the system is inevitably slower, introducing the processing time vs. accuracy trade-off.</p>
<p>So when should you lower the voxel size? We recommend using a smaller voxel size when operating in a small indoor space, because the points concentrate in a small 3D volume and a large voxel size cannot capture a meaningful number of points. We also recommend doing so for mapping applications where real-time processing isn’t required.</p>
<p>Here’s what typical voxel sizes in meters may look like:</p>
<div id="attachment_1335" style="width: 1034px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1335" loading="lazy" class="wp-image-1335 size-large" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Voxel-size-1024x492.png?resize=1024%2C492&#038;ssl=1" alt="" width="1024" height="492" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Voxel-size.png?resize=1024%2C492&amp;ssl=1 1024w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Voxel-size.png?resize=300%2C144&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Voxel-size.png?resize=768%2C369&amp;ssl=1 768w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Voxel-size.png?w=1206&amp;ssl=1 1206w" sizes="(max-width: 1000px) 100vw, 1000px" data-recalc-dims="1" /><p id="caption-attachment-1335" class="wp-caption-text"><em>Figure 2: Typical values for voxel size (in meter)</em></p></div>
<p>As a rule of thumb, we would start from 1.0m for outdoor robotics, 0.5m for large indoor robotics, and 0.3m for small indoor (e.g., office space) robotics. As you may have noticed, the values are generally lower for mapping, sacrificing processing speed to gain more accuracy.</p>
<p>However, if your SLAM system doesn’t track well with the suggested voxel size, we recommend trying a smaller voxel size. Conversely, if your SLAM system cannot keep up with the real-time feed, try a larger voxel size to reduce the number of points used.</p>
<p>The key is understanding the impact of the voxel size parameter so you can adjust the value as required, which you now can.</p>
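<p>To make the granularity trade-off concrete, here is a minimal sketch of voxel downsampling (the function name and NumPy-based implementation are our illustrative assumptions, not Kudan’s actual code): each point is assigned to the voxel containing it, and one centroid is kept per occupied voxel, so a smaller voxel size leaves more points for tracking and mapping.</p>

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Keep one representative point (the centroid) per occupied voxel.

    points: (N, 3) array of x/y/z coordinates in meters.
    voxel_size: edge length of each cubic voxel in meters.
    """
    # Map each point to the integer index of the voxel that contains it.
    indices = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel; average each group into a single centroid.
    _, inverse, counts = np.unique(
        indices, axis=0, return_inverse=True, return_counts=True
    )
    inverse = inverse.reshape(-1)
    centroids = np.zeros((counts.size, 3))
    np.add.at(centroids, inverse, points)
    return centroids / counts[:, None]
```

<p>Running this on the same cloud with 0.3m and 1.0m voxels shows directly why the smaller setting yields more points to match, at the cost of more processing per frame.</p>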
<hr />
<h2><strong>Understand the maximum distance for matching the current frame to the map</strong></h2>
<p>This parameter indicates how far away an existing keyframe can be while still being used in the Iterative Closest Point (ICP) [1] process to determine the current pose, instead of creating a new keyframe.</p>
<p>In other words, this parameter determines how frequently the system creates a new keyframe: the longer the distance you set, the less frequently new keyframes are created.</p>
<p>Some typical values (in meters) are given in the image below:</p>
<div id="attachment_1331" style="width: 1034px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1331" loading="lazy" class="wp-image-1331 size-large" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Maximum-distance-to-match-1024x427.png?resize=1024%2C427&#038;ssl=1" alt="" width="1024" height="427" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Maximum-distance-to-match.png?resize=1024%2C427&amp;ssl=1 1024w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Maximum-distance-to-match.png?resize=300%2C125&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Maximum-distance-to-match.png?resize=768%2C320&amp;ssl=1 768w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Maximum-distance-to-match.png?w=1389&amp;ssl=1 1389w" sizes="(max-width: 1000px) 100vw, 1000px" data-recalc-dims="1" /><p id="caption-attachment-1331" class="wp-caption-text"><em>Figure 3: Typical values for maximum distance for matching frames to map (in meters)</em></p></div>
<p>For indoor applications, we set smaller values, since we want more frequent keyframes than in our outdoor use cases in order to detect movements more precisely.</p>
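<p>As a hypothetical sketch of this rule (the helper name and the purely distance-based check are our simplifying assumptions; a real system may also consider rotation or elapsed time), the keyframe decision can be expressed as:</p>

```python
import numpy as np

def should_create_keyframe(current_xyz, keyframe_xyz, max_match_distance):
    """Return True once the sensor has traveled farther from the last
    keyframe than the maximum matching distance allows, meaning matching
    against that keyframe is no longer reliable and a new one is needed."""
    travelled = np.linalg.norm(
        np.asarray(current_xyz, dtype=float) - np.asarray(keyframe_xyz, dtype=float)
    )
    return bool(travelled > max_match_distance)
```

<p>With a smaller <code>max_match_distance</code> (as suggested for indoor use), this check fires sooner, producing more frequent keyframes.</p>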
<hr />
<h2><strong>Pay attention to the voxel size when deciding on the minimum matched points to track</strong></h2>
<p>The parameter “minimum number of matched points to track” sets the threshold to decide whether or not the tracking has failed.</p>
<p>When the number of matched points between the keyframes and the current frame is below this threshold value, the SLAM system is said to be ‘lost’.</p>
<p>Intuitively, the fewer matched points required to track, the easier it is for the SLAM system to continue tracking without getting lost. However, setting the number too small introduces false tracking and accumulates larger drift.</p>
<p>It’s essential to consider the voxel size you have already set when selecting the appropriate value for this parameter. If you’ve set a larger voxel size, you’ll have fewer points available for matching, so the minimum number of matched points needs to be set lower.</p>
<p>The numbers below are examples from our SLAM system, which uses absolute values, but other systems may use relative numbers or percentages.</p>
<div id="attachment_1333" style="width: 1034px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1333" loading="lazy" class="wp-image-1333 size-large" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Minimum-to-track-1024x375.png?resize=1024%2C375&#038;ssl=1" alt="" width="1024" height="375" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Minimum-to-track.png?resize=1024%2C375&amp;ssl=1 1024w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Minimum-to-track.png?resize=300%2C110&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Minimum-to-track.png?resize=768%2C281&amp;ssl=1 768w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Minimum-to-track.png?w=1187&amp;ssl=1 1187w" sizes="(max-width: 1000px) 100vw, 1000px" data-recalc-dims="1" /><p id="caption-attachment-1333" class="wp-caption-text"><em>Figure 4: Typical values for the minimum number of points to track</em></p></div>
<p>If the environment is structurally rich, set a higher value; reduce it when there are fewer objects around the lidar (e.g., a fairly open field with a few trees). Thus, indoor environments typically use higher values than outdoor environments.</p>
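<p>The tracking-lost check itself reduces to a simple threshold comparison. Here is an illustrative sketch (function and parameter names are our own, not Kudan’s API):</p>

```python
def tracking_status(num_matched_points: int, min_matched_to_track: int) -> str:
    """Declare tracking lost when the number of points matched between
    the current frame and the keyframes falls below the threshold."""
    return "tracking" if num_matched_points >= min_matched_to_track else "lost"
```

<p>Lowering <code>min_matched_to_track</code> makes the system harder to lose, but as noted above, too low a value risks false tracking and drift.</p>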
<hr />
<h2><strong>Setting the minimum number of match points to relocalize</strong></h2>
<p>The parameter “minimum number of points to relocalize” sets a threshold to decide whether the initial position on the map has been identified.</p>
<p>When the number of matched points between the keyframes and the current frame is above the given threshold value, the SLAM system considers itself “relocalized.” Here’s an example from our SLAM system. Again, we’re using absolute values here, but relative numbers or percentages could be used as well.</p>
<div id="attachment_1332" style="width: 1034px" class="wp-caption aligncenter"><img aria-describedby="caption-attachment-1332" loading="lazy" class="wp-image-1332 size-large" src="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Minimum-to-relocalize-1024x358.png?resize=1024%2C358&#038;ssl=1" alt="" width="1024" height="358" srcset="https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Minimum-to-relocalize.png?resize=1024%2C358&amp;ssl=1 1024w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Minimum-to-relocalize.png?resize=300%2C105&amp;ssl=1 300w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Minimum-to-relocalize.png?resize=768%2C269&amp;ssl=1 768w, https://i0.wp.com/www.kudan.io/wp-content/uploads/2022/07/Minimum-to-relocalize.png?w=1243&amp;ssl=1 1243w" sizes="(max-width: 1000px) 100vw, 1000px" data-recalc-dims="1" /><p id="caption-attachment-1332" class="wp-caption-text"><em>Figure 5: Values for “minimum number of points to relocalize”</em></p></div>
<p>When you set this parameter to a smaller value, the SLAM system finds it easier to relocalize. However, setting it too small introduces false relocalizations [2].</p>
<p>A rule of thumb is to start at 1500 and then, depending on how the SLAM system behaves (e.g., too few loop closures being performed, or frequent false relocalizations), reduce or increase the value accordingly.</p>
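<p>The relocalization decision mirrors the tracking check. Below is an illustrative sketch using the 1500-point starting value from the rule of thumb above (the function name and signature are our assumptions for illustration):</p>

```python
def is_relocalized(num_matched_points: int, min_points_to_relocalize: int = 1500) -> bool:
    """Consider the system relocalized once the number of points matched
    between the current frame and the map's keyframes reaches the
    threshold (1500 is the article's suggested starting point)."""
    return num_matched_points >= min_points_to_relocalize
```

<p>From there, the threshold would be tuned downward if relocalization rarely succeeds, or upward if false relocalizations appear.</p>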
<hr />
<h2><strong>Final words</strong></h2>
<p>At Kudan, through our blog, we’ve aimed to distill the information and present it as simply as possible. Here are some of our previous articles on 3D Lidar SLAM:</p>
<ul>
<li><a href="https://www.kudan.io/blog/3d-lidar-slam-the-basics/" target="_blank" rel="noopener">3D lidar SLAM: The Basics</a></li>
<li><a href="https://www.kudan.io/blog/how-to-select-the-best-3d-lidar-for-slam/" target="_blank" rel="noopener">How to Select the Best 3D Lidar for SLAM</a></li>
</ul>
<p>In this article, we’ve presented the goal of tuning 3D-Lidar SLAM parameters, general guidelines, typical values, and their impact on the system&#8217;s performance. The list of parameters to tweak and the additional tricks for improving SLAM accuracy and robustness extend well beyond what we’ve been able to cover here. If you’re interested in learning more, or if you have a question in mind, we invite you to <a href="https://www.kudan.io/contact/" target="_blank" rel="noopener">say hi to us</a> about your specific needs around the technology.</p>
<hr />
<h2><strong>References</strong></h2>
<p>[1] Chen, Yang &amp; Medioni, Gerard. (1992). “Object modeling by registration of multiple range images,” Image and Vision Computing, vol. 10, no. 3, pp. 145–155 [<a href="https://www.sciencedirect.com/science/article/abs/pii/026288569290066C" target="_blank" rel="noopener">PDF</a>].<br />
[2] Vysotska, Olga &amp; Stachniss, Cyrill. (2017). “Relocalization under Substantial Appearance Changes using Hashing” [<a href="https://www.researchgate.net/profile/Cyrill_Stachniss/publication/344351938_Relocalization_under_Substantial_Appearance_Changes_using_Hashing/links/5f6b4252458515b7cf4747c2/Relocalization-under-Substantial-Appearance-Changes-using-Hashing.pdf" target="_blank" rel="noopener">PDF</a>].</p>
<hr />
<p>■For more details, please contact us from <a href="https://www.kudan.io/contact" target="_blank" rel="noopener noreferrer">here</a>.</p><p>The post <a href="https://www.kudan.io/blog/how-to-tune-3d-lidar-slam-parameters/">How to Tune 3D-Lidar SLAM Parameters</a> first appeared on <a href="https://www.kudan.io">Kudan global</a>.</p>]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1330</post-id>	</item>
	</channel>
</rss>
