Multimodal Affect Detection from Physiological and Facial Features during ITS Interaction


Md. Sazzad Hussain and Rafael A. Calvo

Multimodal approaches are increasingly used for affect detection. This paper proposes a model for fusing physiological signals that measure learners' heart activity with their facial expressions in order to detect learners' affective states while they interact with an Intelligent Tutoring System (ITS). It studies machine learning and fusion techniques for classifying the system's automated feedback from the individual channels and from their feature-level fusion. It also evaluates the classification performance of the fusion models in multimodal systems, identifying the effects of fusion over the individual modalities.


Affective computing, multimodality, AutoTutor, feedback, learning interaction, fusion