Presentation Abstract / Keywords
Presentation Title
2022-12-23 14:00
Estimate students' concentration level by using facial expression ○Guan-yun Wang・Hikaru Nagata・Yasuhiro Hatori・Yoshiyuki Sato・Chia-huei Tseng・Satoshi Shioiri (Tohoku Univ.) HIP2022-70
Abstract
(Japanese)
(Not yet registered)
(English)
Tracking the concentration and learning performance of every student throughout a course is difficult. This study recruited 13 participants and asked them to solve a problem used in the International Olympiad of Linguistics in 2018 on a website designed for the task. Participants' faces were recorded on video while they solved the problem. Action Units (AUs), which are facial features related to expressions, were extracted with the open-source software OpenFace. As a first attempt, one of the authors reviewed the recorded faces and classified the participants into a "strongly engaged" group and a "weakly engaged" group. We then used LightGBM to train a model that classifies participants into the two groups from the AUs extracted by OpenFace. The classification accuracy on the test data, evaluated with five-fold cross-validation, was 95.1%. The intensities of AU04, AU17, and AU25 were the three features that contributed most to the classification. They correspond to "brow lowerer", "chin raiser", and "lips part", which suggests that facial features around the eyes, cheeks, and mouth are important for estimating engagement level. The present study further analyzed the mental states of the participants; using the same classification method, the accuracy was 73.9%. Several feature sets for classifier training are also discussed.
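The pipeline described in the abstract (OpenFace AU extraction, LightGBM training, five-fold cross-validation, feature-importance inspection) can be sketched roughly as follows. This is a minimal illustration, not the authors' code: the directory openface_output/, the label file engagement_labels.txt, and the use of mean per-frame AU intensities as the feature set are all assumptions.

```python
# Minimal sketch: classify "strongly" vs. "weakly" engaged participants from
# OpenFace Action Unit intensities with LightGBM and 5-fold cross-validation.
# File names, the feature set, and labels are assumptions for illustration.
import glob

import numpy as np
import pandas as pd
import lightgbm as lgb
from sklearn.model_selection import StratifiedKFold, cross_val_score

# OpenFace writes one CSV per recorded video; AU intensity columns end in "_r"
# (e.g. "AU04_r"). The directory below is hypothetical.
participant_features = []
for path in sorted(glob.glob("openface_output/*.csv")):
    df = pd.read_csv(path)
    df.columns = [c.strip() for c in df.columns]  # OpenFace pads column names
    au_cols = [c for c in df.columns if c.startswith("AU") and c.endswith("_r")]
    # Summarize each video by its mean per-frame AU intensities (one of many
    # possible feature sets; the report compares several).
    participant_features.append(df[au_cols].mean())

X = pd.DataFrame(participant_features)
# Hypothetical manual labels: 1 = "strongly engaged", 0 = "weakly engaged".
y = np.loadtxt("engagement_labels.txt", dtype=int)

clf = lgb.LGBMClassifier(n_estimators=200, learning_rate=0.05, random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"5-fold CV accuracy: {scores.mean():.3f}")

# Fit on all data to see which AUs the trees rely on most (cf. AU04, AU17, AU25).
clf.fit(X, y)
ranking = sorted(zip(X.columns, clf.feature_importances_), key=lambda t: -t[1])
print("Top features:", ranking[:3])
```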
Keywords
(Japanese)
(Not yet registered)
(English)
machine learning / facial expression / concentration / classification / action unit codes
Bibliographic Information
IEICE Technical Report, vol. 122, no. 326, HIP2022-70, pp. 65-69, December 2022.
Report Number
HIP2022-70
Issue Date
2022-12-15 (HIP) |
ISSN |
Online edition: ISSN 2432-6380 |
About Copyright
The copyright of papers published in the Technical Report belongs to the Institute of Electronics, Information and Communication Engineers (IEICE). (Permission numbers: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)
PDF Download
HIP2022-70 |
|