Robust Multimodal Cognitive Load Measurement

Chen, Fang
Zhou, Jianlong
Wang, Chao-Yang
Yu, Kun
Arshad, Syed Z.
Khawaji, Ahmad
Conway, Dan

€80.07 (VAT incl.)

This book explores robust multimodal cognitive load measurement using physiological and behavioural modalities, including eye activity, galvanic skin response (GSR), speech, language, pen input, and mouse movement, as well as multimodal fusion. Factors such as stress, trust, and environmental conditions like illumination are discussed with regard to their implications for cognitive load measurement. Furthermore, dynamic workload adjustment and real-time cognitive load measurement with data streaming are presented, making cognitive load measurement accessible to a wider range of applications and users. Finally, application examples are reviewed that demonstrate the feasibility of multimodal cognitive load measurement in practice.

This is the first book of its kind to systematically introduce computational methods for automatic, real-time cognitive load measurement and, in doing so, to move the practical application of cognitive load measurement from the domain of computer scientists and psychologists to general end users, ready for widespread implementation.

Robust Multimodal Cognitive Load Measurement is intended for researchers and practitioners involved in cognitive load studies and for communities within the computer, cognitive, and social sciences. The book will especially benefit researchers in areas such as behaviour analysis, social analytics, human-computer interaction (HCI), intelligent information processing, and decision support systems.

  • ISBN: 978-3-319-31698-7
  • Publisher: Springer
  • Binding: Hardcover
  • Publication date: 08/08/2016
  • Number of volumes: 1
  • Language: English