| Item type |
Journal Article (1) |
| Date released |
2024-01-26 |
| Title |
Eye-movement analysis on facial expression for identifying children and adults with neurodevelopmental disorders |
| Language |
eng |
| Keywords (subject scheme: Other) |
autism spectrum disorder; schizophrenia; convolutional neural networks; eye movement; facial emotion recognition |
| Resource type |
journal article |
| Access rights |
open access |
| Authors |
Iwauchi, Kota
Tanaka, Hiroki
Okazaki, Kosuke
Matsuda, Yasuhiro
Uratani, Mitsuhiro
Morimoto, Tsubasa
Nakamura, Satoshi
|
| Abstract |
Experienced psychiatrists identify people with autism spectrum disorder (ASD) and schizophrenia (Sz) through interviews based on diagnostic criteria, the patients' responses, and various neuropsychological tests. To improve the clinical diagnosis of neurodevelopmental disorders such as ASD and Sz, it is important to discover disorder-specific biomarkers and behavioral indicators with sufficient sensitivity. In recent years, studies have used machine learning to make more accurate predictions. Among various indicators, eye movement, which can be obtained easily, has attracted much attention, and it has been studied for both ASD and Sz. The specificity of eye movement during facial expression recognition has been studied extensively, but models that account for differences in that specificity among facial expressions have not been developed. In this paper, we propose a method to detect ASD or Sz from eye movement during the Facial Emotion Identification Test (FEIT) while considering differences in eye movement due to the facial expressions presented, and we confirm that weighting by these differences improves classification accuracy. Our data set consisted of 15 adults with ASD and Sz and 16 adult controls, as well as 15 children with ASD and 17 child controls. A random forest was used to weight each test and classify the participants as control, ASD, or Sz. The most successful approach applied convolutional neural networks (CNNs) to heat maps of eye retention. This method classified Sz in adults with 64.5% accuracy, ASD in adults with up to 71.0% accuracy, and ASD in children with 66.7% accuracy. The ASD classification results differed significantly from chance (p < .05) by a binomial test. The results show improvements of 10% and 16.7% in accuracy, respectively, over a model that does not take facial expressions into account. For ASD, this indicates that modeling that weights the output for each image is effective. |
| Bibliographic information |
Frontiers in Digital Health
Volume 5
Publication date: 2023-01-16
|
| Publisher |
Frontiers Media |
| ISSN |
EISSN: 2673-253X |
| Publisher-version DOI |
Relation type: isReplacedBy
Identifier type: DOI
https://doi.org/10.3389/fdgth.2023.952433 |
| Publisher-version URI |
Relation type: isReplacedBy
Identifier type: URI
https://www.frontiersin.org/articles/10.3389/fdgth.2023.952433 |
| Rights |
License: http://creativecommons.org/licenses/by/4.0/
© 2023 Iwauchi, Tanaka, Okazaki, Matsuda, Uratani, Morimoto and Nakamura. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
| Author-version flag |
Publication type: NA |
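The abstract describes a pipeline of building heat maps from eye retention, classifying them with a CNN, and then weighting the classifier output for each presented facial expression before the final decision. The following is a minimal sketch of two of those steps, assuming hypothetical fixation data, expression names, and weights; the paper's actual CNN and the random-forest-derived weights are not reproduced here.

```python
import numpy as np

def gaze_heatmap(fixations, shape=(64, 64), sigma=3.0):
    """Accumulate fixation durations into a 2-D grid and Gaussian-blur it.

    fixations: iterable of (x, y, duration) with x, y in [0, 1).
    Returns a heat map normalised to [0, 1], usable as a CNN input channel.
    """
    h, w = shape
    heat = np.zeros(shape)
    for x, y, dur in fixations:
        heat[int(y * h), int(x * w)] += dur
    # Separable Gaussian blur with a truncated kernel (no SciPy dependency).
    k = np.arange(-3 * int(sigma), 3 * int(sigma) + 1)
    g = np.exp(-k**2 / (2 * sigma**2))
    g /= g.sum()
    for axis in (0, 1):
        heat = np.apply_along_axis(np.convolve, axis, heat, g, mode="same")
    if heat.max() > 0:
        heat /= heat.max()
    return heat

def weighted_decision(scores, weights):
    """Combine per-expression classifier scores with per-expression weights,
    instead of averaging all expressions uniformly."""
    w = np.asarray([weights[e] for e in scores])
    s = np.asarray([scores[e] for e in scores])
    return float(np.dot(w, s) / w.sum())

# Toy usage: hypothetical per-expression probabilities and weights.
scores = {"happy": 0.4, "sad": 0.7, "anger": 0.6}
weights = {"happy": 0.2, "sad": 0.5, "anger": 0.3}
print(round(weighted_decision(scores, weights), 3))  # prints 0.61
```

The duration-weighted, blurred grid stands in for the heat-map input the abstract mentions, and `weighted_decision` illustrates only the idea that expressions with more disorder-specific eye-movement patterns contribute more to the final classification.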