Abstract:
Autism Spectrum Disorder (ASD) is a developmental disorder in which individuals have difficulty with face-to-face communication, such as making eye contact and understanding non-literal expressions like jokes. In recent years, organizations such as companies have begun to promote personnel diversity in order to improve productivity and revenue. However, organizations that include individuals with ASD face challenges in realizing these benefits, because miscommunication can occur and psychological safety can be threatened. In such organizations, it is therefore necessary to observe situations in which both individuals with ASD and those with Typical Development (TD) feel nervous or worried, in order to support communication. Previous research has reported that the communication difficulties of individuals with ASD are related to how they take in external information, and that individuals with ASD exhibit gaze behaviors distinct from those of individuals with TD. However, most of these studies do not clarify differences in communication itself, because they were conducted mainly for the early diagnosis of ASD using static images and short videos. In this study, we therefore develop a VR system that allows participants to experience a seminar, and we investigate differences in gaze behavior between individuals with ASD and those with TD. The VR system presents pre-recorded 360-degree video through a head-mounted display equipped with eye-tracking functionality. The seminar scenario begins with other attendees and the moderator entering the room, and includes a scene in which the participant introduces themselves to those around them. This setup allows every participant to experience the same immersive communication as a member of the seminar while enabling chronological recording of gaze data.
Furthermore, object detection is applied to the 360-degree video presented in the VR system, and by associating the detected objects with the recorded gaze, we compare what each participant looked at and for how long. We prepared two types of seminars, face-to-face and online, with identical content, lines, and characters, and participants experienced both formats while their gaze behavior was recorded. A total of 36 adults participated: 20 individuals with TD and 16 individuals with ASD who had received formal diagnoses at medical institutions and continue to receive regular care. The results showed that, in the online format, individual differences were large and no significant differences could be confirmed, whereas in the face-to-face format, TD participants spent significantly more time looking at the speaker's face than ASD participants did, and ASD participants spent significantly more time looking at the speaker's body than TD participants did. In addition, in the scene where participants introduced themselves to the surrounding people, TD participants tended to distribute their gaze more widely and look around their surroundings, while ASD participants tended to focus locally on the moderator's body.
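The gaze-object association described above can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the frame representation, the `Box` fields, and the point-in-bounding-box dwell-time accumulation are all assumptions for the sake of the example.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Box:
    """A detected object in one video frame (assumed format)."""
    label: str                    # e.g. "face", "body"
    x0: float                     # normalized corners in [0, 1]
    y0: float
    x1: float
    y1: float

    def contains(self, gx: float, gy: float) -> bool:
        # Does the gaze point fall inside this bounding box?
        return self.x0 <= gx <= self.x1 and self.y0 <= gy <= self.y1


def dwell_times(frames, frame_dt):
    """Accumulate per-label gaze dwell time in seconds.

    frames:   iterable of ((gx, gy), boxes) pairs, one per video frame,
              pairing the recorded gaze point with the detections.
    frame_dt: duration of one frame, e.g. 1/30 for 30 fps video.
    """
    totals = defaultdict(float)
    for (gx, gy), boxes in frames:
        for box in boxes:
            if box.contains(gx, gy):
                totals[box.label] += frame_dt
                break  # credit each frame to at most one object
    return dict(totals)
```

For example, with a face box and a body box detected in consecutive frames, a gaze point inside the face box in one frame and inside the body box in the next yields one frame's worth of dwell time for each label; chronological comparisons between groups can then be made on these per-label totals.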
Type: 17th Asia-Pacific Workshop on Mixed and Augmented Reality (APMAR 2025), Pitch Your Work Presentation Track
Publication date: To be published in Sep 2025