TY - JOUR
AU - Cui, Dewen
AU - Matsufuji, Akihiro
AU - Liu, Yi
AU - Sato-Shimokawa, Eri
AU - Yamaguchi, Toru
PY - 2022/12/30
Y2 - 2024/03/28
TI - Estimation of Confidence in the Dialogue based on Eye Gaze and Head Movement Information
JF - EMITTER International Journal of Engineering Technology
JA - EMITTER Int'l J. of Engin. Technol.
VL - 10
IS - 2
SE - Articles
DO - 10.24003/emitter.v10i2.756
UR - https://emitter.pens.ac.id/index.php/emitter/article/view/756
AB - In human-robot interaction, human mental states in dialogue have attracted attention for human-friendly robots that support educational use. Although mental states have been estimated from speech and visual information, estimating them precisely in educational settings remains challenging. In this paper, we propose a method to estimate human mental states from participants' eye gaze and head movement information. As the target mental state, we estimate participants' confidence levels in their answers to miscellaneous knowledge questions. Participants' non-verbal information, such as eye gaze and head movements during dialogue with a robot, was collected in our experiment using an eye-tracking device. We then collected participants' confidence levels and analyzed the relationship between this mental state and the non-verbal information. Furthermore, we applied a machine learning technique to estimate participants' confidence levels from features extracted from the gaze and head movement information. As a result, the machine learning technique using gaze and head movement information achieved over 80% accuracy in estimating confidence levels. Our research provides insight into developing human-friendly robots that consider human mental states in dialogue.
ER -