TY - GEN
T1 - Autonomous Mobile Robot Navigation Using 2D LiDAR and Inclined Laser Rangefinder to Avoid a Lower Object
AU - Ren Yee, Phang Darren
AU - Pinrath, Nattawat
AU - Matsuhira, Nobuto
N1 - Publisher Copyright:
© 2020 The Society of Instrument and Control Engineers - SICE.
PY - 2020/9/23
Y1 - 2020/9/23
N2 - In this study, we propose a collaborative autonomous navigation stack that combines two sensors: a laser rangefinder (LRF) and a 2D light detection and ranging (LiDAR) sensor. The difference between an LRF and a LiDAR is that a LiDAR uses the same laser technology but rotates around its axis, offering 360° all-round visibility. The system can navigate in more complex environments, such as a convenience store or a supermarket. Through the collaboration between inclined LRFs and 2D LiDARs, a mobile robot can avoid objects below the scanning plane of a 2D LiDAR, for example, a small box, a short display rack, shelf legs, and the lower body of a shopping cart. Thus, the proposed system leverages the navigation stack's capability to use multiple observation sources, increasing the accuracy of both navigation and obstacle avoidance. Our method aims to solve the problem of avoiding objects below the scanning plane of a 2D LiDAR and to increase detection accuracy during navigation by using an inclined LRF. The proposed system is based on the Robot Operating System (ROS). Experiments were conducted to demonstrate the improvement in navigation and detection accuracy, showing that the robot can effectively avoid objects below the scanning plane of the 2D LiDAR.
AB - In this study, we propose a collaborative autonomous navigation stack that combines two sensors: a laser rangefinder (LRF) and a 2D light detection and ranging (LiDAR) sensor. The difference between an LRF and a LiDAR is that a LiDAR uses the same laser technology but rotates around its axis, offering 360° all-round visibility. The system can navigate in more complex environments, such as a convenience store or a supermarket. Through the collaboration between inclined LRFs and 2D LiDARs, a mobile robot can avoid objects below the scanning plane of a 2D LiDAR, for example, a small box, a short display rack, shelf legs, and the lower body of a shopping cart. Thus, the proposed system leverages the navigation stack's capability to use multiple observation sources, increasing the accuracy of both navigation and obstacle avoidance. Our method aims to solve the problem of avoiding objects below the scanning plane of a 2D LiDAR and to increase detection accuracy during navigation by using an inclined LRF. The proposed system is based on the Robot Operating System (ROS). Experiments were conducted to demonstrate the improvement in navigation and detection accuracy, showing that the robot can effectively avoid objects below the scanning plane of the 2D LiDAR.
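N1 - The abstract describes feeding the inclined LRF and the horizontal 2D LiDAR into the ROS navigation stack as separate observation sources. The sketch below is a minimal illustration of one way this kind of fusion could be wired up, not the authors' implementation: the inclined LRF scan is projected onto the ground plane and republished so a costmap can mark low obstacles from it alongside the 2D LiDAR scan. Topic names, the frame ID, and the tilt angle are assumptions made for illustration.

```python
#!/usr/bin/env python
# Minimal sketch (assumed setup, not the paper's code): republish an inclined
# LRF scan with ranges projected onto the ground plane, so the ROS navigation
# stack can use it as a second observation source next to the 2D LiDAR.
import math
import rospy
from sensor_msgs.msg import LaserScan

TILT_RAD = math.radians(30.0)  # assumed downward tilt of the LRF


def on_scan(scan):
    # Project each beam's range onto the ground plane. Using cos(tilt) for
    # every beam is an approximation that is exact only for the central beam.
    projected = LaserScan()
    projected.header = scan.header
    projected.header.frame_id = "lrf_projected"  # hypothetical frame
    projected.angle_min = scan.angle_min
    projected.angle_max = scan.angle_max
    projected.angle_increment = scan.angle_increment
    projected.time_increment = scan.time_increment
    projected.scan_time = scan.scan_time
    projected.range_min = scan.range_min * math.cos(TILT_RAD)
    projected.range_max = scan.range_max * math.cos(TILT_RAD)
    projected.ranges = [r * math.cos(TILT_RAD) for r in scan.ranges]
    pub.publish(projected)


if __name__ == "__main__":
    rospy.init_node("inclined_lrf_projection")
    pub = rospy.Publisher("/scan_lrf_projected", LaserScan, queue_size=1)
    rospy.Subscriber("/scan_lrf", LaserScan, on_scan)  # assumed topic name
    rospy.spin()
```

N1 - Usage note (assumption): the republished topic could then be listed next to the 2D LiDAR topic under observation_sources in the costmap configuration, so that both scans mark and clear obstacles during navigation.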
KW - LiDAR
KW - ROS
KW - autonomous mobile robot
KW - laser range finder
KW - navigation
UR - http://www.scopus.com/inward/record.url?scp=85096361796&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85096361796&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85096361796
T3 - 2020 59th Annual Conference of the Society of Instrument and Control Engineers of Japan, SICE 2020
SP - 1404
EP - 1409
BT - 2020 59th Annual Conference of the Society of Instrument and Control Engineers of Japan, SICE 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 59th Annual Conference of the Society of Instrument and Control Engineers of Japan, SICE 2020
Y2 - 23 September 2020 through 26 September 2020
ER -