Development of an Autonomous Mobile Robot with Self-Localization and Searching Target in a Real Environment

Masatoshi Nomatsu, Youhei Suganuma, Yosuke Yui, Yutaka Uchimura

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)


In describing real-world self-localization and target-search methods, this paper discusses a mobile robot developed to verify a method proposed in Tsukuba Challenge 2014. The Tsukuba Challenge course includes promenades and parks containing ordinary pedestrians and bicyclists, requiring the robot to move toward a goal while avoiding the moving objects around it. Common self-localization methods often rely on 2D laser range finders (LRFs), but such LRFs do not always capture enough data for localization if, for example, the scanned plane has few landmarks. To solve this problem, we used a three-dimensional (3D) LRF for self-localization. The 3D LRF captures more data than the 2D type, resulting in more robust localization. Robots that provide practical services in real life must, among other functions, recognize a target and serve it autonomously. To enable robots to do so, this paper describes a method for searching for a target by using clustered point clouds from the 3D LRF together with image processing of color images captured by cameras. In Tsukuba Challenge 2014, the robot we developed using the proposed methods completed the course and found the targets, verifying the effectiveness of our proposals.
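The target-search step described above segments the 3D-LRF point cloud into clusters before examining candidates with the camera. The sketch below is an illustrative assumption, not the authors' implementation: a naive Euclidean clustering pass over a toy point cloud, with hypothetical parameter names (`radius`, `min_size`) and a brute-force neighbor search in place of whatever spatial indexing a real system would use.

```python
# Minimal sketch of Euclidean clustering on a 3D point cloud, as one might
# use to segment candidate targets from 3D-LRF scan data. Illustrative only;
# thresholds and the O(n^2) neighbor search are hypothetical choices.
from collections import deque

def euclidean_clusters(points, radius=0.3, min_size=2):
    """Group 3D points reachable through neighbors within `radius`.

    points: list of (x, y, z) tuples. Returns a list of clusters,
    each a list of point indices, keeping clusters of >= min_size points.
    """
    r2 = radius * radius
    n = len(points)
    visited = [False] * n
    clusters = []
    for seed in range(n):
        if visited[seed]:
            continue
        # Breadth-first expansion over the within-radius neighbor graph.
        queue, members = deque([seed]), []
        visited[seed] = True
        while queue:
            i = queue.popleft()
            members.append(i)
            xi, yi, zi = points[i]
            for j in range(n):
                if visited[j]:
                    continue
                xj, yj, zj = points[j]
                if (xi - xj)**2 + (yi - yj)**2 + (zi - zj)**2 <= r2:
                    visited[j] = True
                    queue.append(j)
        if len(members) >= min_size:
            clusters.append(members)
    return clusters

# Two well-separated blobs should yield two clusters.
cloud = [(0, 0, 0), (0.1, 0, 0), (0.2, 0.1, 0), (5, 5, 0), (5.1, 5, 0)]
print(len(euclidean_clusters(cloud)))  # → 2
```

In a pipeline like the one the abstract outlines, each resulting cluster would become a region of interest that the color-image processing stage then checks for the target's appearance.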

Original language: English
Pages (from-to): 356-364
Number of pages: 9
Journal: Journal of Robotics and Mechatronics
Issue number: 4
Publication status: Published - 2015


Keywords

  • 3D laser range finder
  • Autonomous mobile robot
  • Map matching method
  • Point cloud processing
  • Self localization

ASJC Scopus subject areas

  • Computer Science (all)
  • Electrical and Electronic Engineering

