Combining oculo-motor indices to measure cognitive load of synthetic speech in noisy listening conditions

Mateusz Dubiel, Minoru Nakayama, Xin Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



Gaze-based assistive technologies (ATs) that feature speech have the potential to improve the lives of people with communication disorders. However, due to a limited understanding of how different speech types affect users' cognitive load, the evaluation of ATs remains a challenge. Expanding on previous work, we combined temporal changes in pupil size with ocular movements (saccades and fixation differentials) to evaluate the cognitive workload imposed by two types of speech (natural and synthetic) mixed with noise, through a listening test. While observed pupil sizes were significantly larger at lower signal-to-noise ratios as participants listened to and memorised speech stimuli, saccadic eye movements were significantly more frequent for synthetic speech. In the synthetic condition, there was a strong negative correlation between pupil dilation and fixation differentials, indicating a higher strain on participants' cognitive resources. These results suggest that combining oculo-motor indices can aid our understanding of the cognitive implications of different speech types.
Original language: English
Title of host publication: ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications
Editors: Stephen N. Spencer
Place of publication: New York, NY
Number of pages: 6
ISBN (Electronic): 9781450383455
Publication status: Published - 25 May 2021
Event: 2021 ACM Symposium on Eye Tracking Research and Applications - Virtual, Stuttgart, Germany
Duration: 24 Apr 2021 - 27 Apr 2021

Publication series
Name: Eye Tracking Research and Applications Symposium (ETRA)


Conference: 2021 ACM Symposium on Eye Tracking Research and Applications
Abbreviated title: ETRA '21


Keywords
  • speech perception
  • synthetic speech
  • eye-tracking
  • evaluation study
