This book explores robust multimodal cognitive load measurement using physiological and behavioural modalities, including eye activity, galvanic skin response (GSR), speech, language, pen input and mouse movement, as well as their multimodal fusion. Factors such as stress, trust and environmental conditions such as illumination are discussed with regard to their implications for cognitive load measurement. Furthermore, dynamic workload adjustment and real-time cognitive load measurement with data streaming are presented, making cognitive load measurement accessible to a wider range of applications and users. Finally, application examples are reviewed that demonstrate the feasibility of multimodal cognitive load measurement in practical settings.
This is the first book of its kind to systematically introduce computational methods for automatic, real-time cognitive load measurement. In doing so, it moves the practical application of cognitive load measurement from the domain of the computer scientist and psychologist to general end-users, ready for widespread implementation.
Robust Multimodal Cognitive Load Measurement is intended for researchers and practitioners involved in cognitive load studies, and for communities within the computer, cognitive and social sciences. The book will especially benefit researchers in areas such as behaviour analysis, social analytics, human-computer interaction (HCI), intelligent information processing and decision support systems.
ISBN: 9783319810997
Publication date: 31st May 2018
Authors: Fang Chen, Jianlong Zhou, Yang Wang, Kun Yu, Syed Z. Arshad, Ahmad Khawaji, Dan Conway
Publisher: Springer, an imprint of Springer International Publishing
Format: Paperback
Pagination: 254 pages
Series: Human-Computer Interaction Series
Genres: Human–computer interaction; Pattern recognition; Physiological and neuro-psychology, biopsychology