{"id":347,"date":"2019-06-25T00:00:26","date_gmt":"2019-06-25T00:00:26","guid":{"rendered":"https:\/\/www.kudan.io\/jp\/?p=347"},"modified":"2020-07-28T22:24:24","modified_gmt":"2020-07-28T22:24:24","slug":"slam%e3%81%ae%e3%81%9f%e3%82%81%e3%81%ae%e3%82%ab%e3%83%a1%e3%83%a9%e3%82%ad%e3%83%a3%e3%83%aa%e3%83%96%e3%83%ac%e3%83%bc%e3%82%b7%e3%83%a7%e3%83%b3%ef%bc%881-3%ef%bc%89","status":"publish","type":"post","link":"https:\/\/www.kudan.io\/jp\/archives\/347","title":{"rendered":"SLAM\u306e\u305f\u3081\u306e\u30ab\u30e1\u30e9\u30ad\u30e3\u30ea\u30d6\u30ec\u30fc\u30b7\u30e7\u30f3\uff081\/3\uff09"},"content":{"rendered":"<p>Kudan\u304c\u63d0\u4f9b\u3059\u308bKudanSLAM\u306b\u4ee3\u8868\u3055\u308c\u308bSLAM (Simultaneous Localisation and Mapping)\u6280\u8853\u306e\u305f\u3081\u306e\u30ab\u30e1\u30e9\u30ad\u30e3\u30ea\u30d6\u30ec\u30fc\u30b7\u30e7\u30f3\u306b\u3064\u3044\u3066\u6982\u8aac\u3057\u307e\u3059\u3002<\/p>\n<hr \/>\n<p>As we have covered before, SLAM systems use one or more cameras embedded on a device to simultaneously localise the device\u2019s position and orientation whilst also mapping the environment. Before we deploy a SLAM system, it is crucial that we calibrate the camera to account for its internal properties and other external factors that can affect the images generated. In order to understand our camera calibration process at Kudan, we must first go over the properties of a camera.<\/p>\n<p><strong>Properties of a camera<\/strong><\/p>\n<p>Two different cameras at the same position and orientation can generate two different images. This is because the images generated depend on various properties of the camera we refer to as \u2018intrinsic\u2019 parameters. These include properties inherent to the camera such as the size of the camera\/aperture, the focal length of its lens, the distortion caused by its lens and so on. 
Due to this, for single-camera setups we must know and account for the camera\u2019s intrinsic parameters in order for our SLAM algorithms to work.<\/p>\n<div id=\"attachment_332\" style=\"width: 466px\" class=\"wp-caption aligncenter\"><img aria-describedby=\"caption-attachment-332\" loading=\"lazy\" class=\"wp-image-332 size-full\" src=\"https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/07\/calibration-pic-1.PNG.png\" alt=\"\" width=\"456\" height=\"339\" srcset=\"https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/07\/calibration-pic-1.PNG.png 456w, https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/07\/calibration-pic-1.PNG-300x223.png 300w\" sizes=\"(max-width: 456px) 100vw, 456px\" \/><p id=\"caption-attachment-332\" class=\"wp-caption-text\">Stereo camera rig<\/p><\/div>\n<p>For setups that include two cameras, we need to know not only each camera\u2019s internal properties but also their \u2018extrinsic\u2019 parameters. Extrinsic parameters describe the relative position and orientation of the two cameras: how far apart they are and at what angle they face relative to each other.<\/p>\n<p><strong>The calibration process<\/strong><\/p>\n<p>Not all of these parameters can be measured by physically examining the camera; intrinsic parameters such as distortion, in particular, cannot. Software libraries are therefore often used to calculate them by analysing the camera\u2019s video feed.
There are many software libraries for calibrating cameras; some are listed at the end of this article.<\/p>\n<p><img loading=\"lazy\" class=\"alignnone size-full wp-image-333\" src=\"https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/07\/Screenshot-2019-06-25-at-11.58.43.png\" alt=\"\" width=\"668\" height=\"444\" srcset=\"https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/07\/Screenshot-2019-06-25-at-11.58.43.png 668w, https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/07\/Screenshot-2019-06-25-at-11.58.43-300x199.png 300w\" sizes=\"(max-width: 668px) 100vw, 668px\" \/><\/p>\n<p>We begin the calibration process by showing the camera a known object whose physical properties are stored in the software, such as a chessboard (in that case, its dimensions and the number of black and white squares). We then move the object in front of the camera. Because the exact physical properties of the object are known, the software can automatically calculate the intrinsic and extrinsic parameters of the cameras by observing how the object appears in the images as it is moved.<\/p>\n<p><img loading=\"lazy\" class=\"aligncenter wp-image-330 size-full\" src=\"https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/07\/calibration-pic-3.PNG.png\" alt=\"\" width=\"604\" height=\"224\" srcset=\"https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/07\/calibration-pic-3.PNG.png 604w, https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/07\/calibration-pic-3.PNG-300x111.png 300w, https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/07\/calibration-pic-3.PNG-600x224.png 600w\" sizes=\"(max-width: 604px) 100vw, 604px\" \/><\/p>\n<p class=\"wp-caption-text\">Camera 1 (left) and Camera 2 (right)<\/p>\n<p>As we can see, images can be severely distorted by the camera, and straight lines can appear curved. It is therefore very important to calibrate cameras before deployment. With a robust calibration process, we can ensure that our algorithms always perform at their best across different camera setups.<\/p>\n<p><em>Next\uff1a<a href=\"https:\/\/www.kudan.io\/jp\/archives\/344\">SLAM\u306e\u305f\u3081\u306e\u30ab\u30e1\u30e9\u30ad\u30e3\u30ea\u30d6\u30ec\u30fc\u30b7\u30e7\u30f3\uff082\/3\uff09<\/a><\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Kudan\u304c\u63d0\u4f9b\u3059\u308bKudanSLAM\u306b\u4ee3\u8868\u3055\u308c\u308bSLAM (Simultaneous Localisation and Mapping)\u6280\u8853\u306e\u305f\u3081\u306e\u30ab\u30e1\u30e9\u30ad\u30e3\u30ea\u30d6\u30ec\u30fc\u30b7\u30e7\u30f3\u306b\u3064\u3044\u3066\u6982\u8aac\u3057\u307e\u3059\u3002 As we have c
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":242,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"spay_email":""},"categories":[3],"tags":[],"acf":[],"aioseo_notices":[],"jetpack_featured_media_url":"https:\/\/www.kudan.io\/jp\/wp-content\/uploads\/sites\/3\/2020\/06\/blog.jpg","_links":{"self":[{"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/posts\/347"}],"collection":[{"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/comments?post=347"}],"version-history":[{"count":3,"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/posts\/347\/revisions"}],"predecessor-version":[{"id":422,"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/posts\/347\/revisions\/422"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/media\/242"}],"wp:attachment":[{"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/media?parent=347"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/categories?post=347"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.kudan.io\/jp\/wp-json\/wp\/v2\/tags?post=347"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}