TY - GEN
T1 - A Preliminary Experiment on the Estimation of Emotion Using Facial Expression and Biological Signals
AU - Kurono, Yuya
AU - Sripian, Peeraya
AU - Chen, Feng
AU - Sugaya, Midori
N1 - Publisher Copyright:
© 2019, Springer Nature Switzerland AG.
PY - 2019
Y1 - 2019
N2 - Imagine the day that a robot would comfort you when you feel sad. In the fields of artificial intelligence and robot engineering, there are many studies on the automatic classification of human emotion to enhance human-robot communication, especially for therapy. Generally, estimating a person's emotion is based on information such as facial expression, eye-gaze direction, and other behaviors that are expressed externally and that a robot can observe through a camera. However, some information is invisible: it cannot be expressed, or a person deliberately controls it so that it is not expressed. In such cases, it is difficult to estimate emotion even with sophisticated analysis technologies. The main idea of this research is to compare emotions classified from two different sources: controllable and uncontrollable expressions. Preliminary experiments with our proposed method suggest that the classification of emotion from biological signals outperforms classification from facial expression.
AB - Imagine the day that a robot would comfort you when you feel sad. In the fields of artificial intelligence and robot engineering, there are many studies on the automatic classification of human emotion to enhance human-robot communication, especially for therapy. Generally, estimating a person's emotion is based on information such as facial expression, eye-gaze direction, and other behaviors that are expressed externally and that a robot can observe through a camera. However, some information is invisible: it cannot be expressed, or a person deliberately controls it so that it is not expressed. In such cases, it is difficult to estimate emotion even with sophisticated analysis technologies. The main idea of this research is to compare emotions classified from two different sources: controllable and uncontrollable expressions. Preliminary experiments with our proposed method suggest that the classification of emotion from biological signals outperforms classification from facial expression.
KW - Biological signals
KW - Emotion classification
KW - Facial expression
KW - Feeling
KW - Robotics
KW - Sympathy
UR - http://www.scopus.com/inward/record.url?scp=85069749731&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85069749731&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-22643-5_10
DO - 10.1007/978-3-030-22643-5_10
M3 - Conference contribution
AN - SCOPUS:85069749731
SN - 9783030226428
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 133
EP - 142
BT - Human-Computer Interaction. Recognition and Interaction Technologies - Thematic Area, HCI 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Proceedings
A2 - Kurosu, Masaaki
PB - Springer Verlag
T2 - Thematic Area on Human Computer Interaction, HCI 2019, held as part of the 21st International Conference on Human-Computer Interaction, HCI International 2019
Y2 - 26 July 2019 through 31 July 2019
ER -