Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots

Teruko Yata, Akihisa Ohya, Shin'ichi Yuta

Research output: peer-reviewed

12 Citations (Scopus)

Abstract

This paper proposes a new method for fusing an omni-directional ultrasonic sensor with an omni-directional vision sensor. The new omni-directional sonar we developed can measure the distance and direction of reflecting points accurately, while the omni-directional vision sensor gives the directions of segment edges. We propose a sensor fusion method, based on angles, that uses both the reflecting points measured by the sonar and the segment edges measured by the vision sensor. These data differ in character, so they complement each other in the proposed method, making it possible to obtain richer information useful for environment recognition by mobile robots. We describe the proposed method and an experimental result that shows its potential.
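The abstract describes pairing sonar reflecting points (which carry range and bearing) with vision edges (which carry bearing only) by their angles. The paper does not give an algorithm here, so the following is only a minimal illustrative sketch of one way such angle-based association could work; the function names, tolerance, and data layout are all assumptions, not the authors' implementation.

```python
import math

# Hypothetical sketch of angle-based fusion (not the paper's actual algorithm).
# Sonar gives (range, bearing) of reflecting points; vision gives bearings of
# segment edges only. Pair each edge with the sonar point whose bearing is
# closest, within an assumed angular tolerance.

def angle_diff(a, b):
    """Smallest absolute difference between two angles, in radians."""
    return abs((a - b + math.pi) % (2 * math.pi) - math.pi)

def fuse(sonar_points, vision_edges, tol=math.radians(5.0)):
    """sonar_points: list of (range_m, bearing_rad) tuples.
    vision_edges: list of bearing_rad values.
    Returns a list of (edge_bearing, matched_range_or_None)."""
    fused = []
    for edge in vision_edges:
        best = min(sonar_points, key=lambda p: angle_diff(p[1], edge), default=None)
        if best is not None and angle_diff(best[1], edge) <= tol:
            fused.append((edge, best[0]))   # edge gains a range estimate from sonar
        else:
            fused.append((edge, None))      # no sonar support at this bearing
    return fused

points = [(1.2, math.radians(10)), (2.5, math.radians(95))]
edges = [math.radians(12), math.radians(200)]
print(fuse(points, edges))
```

In this sketch the two sensors compensate each other exactly as the abstract suggests: the vision edge at 12 degrees acquires the sonar range 1.2 m, while the edge at 200 degrees, with no nearby reflecting point, keeps a bearing but no range.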

Original language: English
Pages (from-to): 3925-3930
Number of pages: 6
Journal: Proceedings - IEEE International Conference on Robotics and Automation
Volume: 4
Publication status: Published - Dec 3 2000
Event: ICRA 2000: IEEE International Conference on Robotics and Automation - San Francisco, CA, USA
Duration: Apr 24 2000 - Apr 28 2000

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering

