TY - GEN
T1 - An Impression Evaluation of Robot Facial Expressions Considering Individual Differences by Using Biological Information
AU - Yu, Kai
AU - Anuardi, Muhammad Nur Adilin Mohd
AU - Sripian, Peeraya
AU - Sugaya, Midori
N1 - Publisher Copyright:
© 2021, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2021
Y1 - 2021
N2 - The application of robot services has been in high demand due to the declining birthrate and aging society. However, a single robot expression can be perceived differently by different people. Therefore, the purpose of this study is to improve the robot's impression by estimating human emotions using biological information. Here, a machine learning method was proposed to account for individual differences, based on impression evaluations and combined measurements of electroencephalography (EEG) and pulse rate. This method was evaluated with three patterns of robot expressions: the same as the human emotion (synchronize), opposite to the human emotion (reverse synchronize), and funny facial expressions. A machine learning method was applied to create a classification model that decides the facial expression of a robot to suit each user's preference. As a result, individual differences were observed, and the machine learning approach reached 80% accuracy.
AB - The application of robot services has been in high demand due to the declining birthrate and aging society. However, a single robot expression can be perceived differently by different people. Therefore, the purpose of this study is to improve the robot's impression by estimating human emotions using biological information. Here, a machine learning method was proposed to account for individual differences, based on impression evaluations and combined measurements of electroencephalography (EEG) and pulse rate. This method was evaluated with three patterns of robot expressions: the same as the human emotion (synchronize), opposite to the human emotion (reverse synchronize), and funny facial expressions. A machine learning method was applied to create a classification model that decides the facial expression of a robot to suit each user's preference. As a result, individual differences were observed, and the machine learning approach reached 80% accuracy.
KW - Biological information
KW - Emotion
KW - Robot expression
UR - http://www.scopus.com/inward/record.url?scp=85112141479&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85112141479&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-80094-9_61
DO - 10.1007/978-3-030-80094-9_61
M3 - Conference contribution
AN - SCOPUS:85112141479
SN - 9783030800932
T3 - Lecture Notes in Networks and Systems
SP - 514
EP - 521
BT - Advances in Creativity, Innovation, Entrepreneurship and Communication of Design - Proceedings of the AHFE 2021 Virtual Conferences on Creativity, Innovation and Entrepreneurship, and Human Factors in Communication of Design, 2021
A2 - Markopoulos, Evangelos
A2 - Goonetilleke, Ravindra S.
A2 - Ho, Amic G.
A2 - Luximon, Yan
PB - Springer Science and Business Media Deutschland GmbH
T2 - AHFE Conferences on Creativity, Innovation and Entrepreneurship, and Human Factors in Communication of Design, 2021
Y2 - 25 July 2021 through 29 July 2021
ER -