Evaluation of commonsense knowledge for intuitive robotic service

Trung Ngo Lam, Haeyeon Lee, Katsuhiro Mayama, Makoto Mizukawa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Human commonsense is required to improve the quality of robotic applications. However, to acquire the necessary knowledge, a robot needs to evaluate the appropriateness of the data it has collected. This paper presents an evaluation method that combines the weighting mechanism in commonsense databases with a set of weighting factors. The method was verified on our Basic-level Knowledge Network. We conducted a questionnaire to collect a commonsense data set and to estimate the weighting factors. Results showed that the proposed method was able to build a Robot Technology (RT) Ontology for a smart "Bring something" robotic service. More importantly, it allowed the robot to learn new knowledge when necessary. An intuitive human-robot interface application was developed as an example based on our approach.
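The abstract does not detail how the database weights and the questionnaire-derived factors are combined. A minimal sketch of such an evaluation step, assuming each collected fact carries a commonsense-database weight and a set of multiplicative weighting factors with an acceptance threshold (all names, the combination rule, and the threshold are illustrative assumptions, not the paper's actual method):

```python
# Hypothetical sketch: evaluate a collected commonsense fact by
# combining its database weight with a set of weighting factors
# (e.g. estimated from questionnaire responses). The multiplicative
# rule and the threshold are assumptions for illustration only.

def evaluate_fact(db_weight, factors, threshold=0.5):
    """Return (score, accepted): the combined score of a fact and
    whether it passes the acceptance threshold."""
    score = db_weight
    for f in factors:
        score *= f
    return score, score >= threshold

# Example: a fact with database weight 0.8 and two weighting factors.
score, accepted = evaluate_fact(0.8, [0.9, 0.7])
print(score, accepted)  # 0.504 True
```

Under such a scheme, a fact whose combined score falls below the threshold would be rejected, prompting the robot to gather new knowledge — matching the abstract's point that the robot learns new knowledge when necessary.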

Original language: English
Title of host publication: 2012 IEEE International Conference on Robotics and Automation, ICRA 2012
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3679-3684
Number of pages: 6
ISBN (Print): 9781467314039
DOIs
Publication status: Published - 2012
Event: 2012 IEEE International Conference on Robotics and Automation, ICRA 2012 - Saint Paul, MN, United States
Duration: 2012 May 14 - 2012 May 18

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729

Conference

Conference: 2012 IEEE International Conference on Robotics and Automation, ICRA 2012
Country/Territory: United States
City: Saint Paul, MN
Period: 12/5/14 - 12/5/18

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering

