Towards Affect Recognition through Interactions with Learning Materials

Within the frame of the H2020 MaTHiSiS EU project, the paper titled "Towards Affect Recognition through Interactions with Learning Materials" by E. Ghaleb, E. Hortal, M. Popa, S. Asteriadis and G. Weiss has been accepted as a regular paper for oral presentation at the 17th IEEE International Conference on Machine Learning and Applications (ICMLA’18). The conference will take place in Orlando, Florida, on 17-20 December 2018.

High-performance and Lightweight Real-time Deep Face Emotion Recognition

The paper titled “High-performance and Lightweight Real-time Deep Face Emotion Recognition” (co-authored by Justus Schwan, Esam Ghaleb, Enrique Hortal and Stylianos Asteriadis) has been accepted for publication at SMAP2017, the 12th International Workshop on Semantic and Social Media Adaptation and Personalization. The paper will be included in the Special Session on Multimodal affective analysis for human-machine interfaces and learning environments.

Recognizing Emotional States using Speech Information

Emotion recognition plays an important role in several applications, such as human-computer interaction and understanding the affective state of users in certain tasks, e.g., within a learning process, monitoring of the elderly, or interactive entertainment. It may be based on several modalities, e.g., analyzing facial expressions and/or speech, or using electroencephalograms, electrocardiograms, etc. In certain applications, the only available modality is the user's (speaker's) voice.
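
As a simple illustration of speech-only emotion recognition, the sketch below summarizes each utterance with MFCC features and trains a standard classifier on them. It is not the method used in the work announced above; the file names, emotion labels and choice of classifier are assumptions made purely for the example.

    # Minimal sketch of speech-based emotion recognition.
    # Paths, labels and the SVM classifier below are illustrative assumptions.
    import numpy as np
    import librosa
    from sklearn.svm import SVC

    def extract_features(path, sr=16000, n_mfcc=13):
        """Load an audio clip and summarize it as mean MFCC coefficients."""
        signal, sr = librosa.load(path, sr=sr)
        mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
        return mfcc.mean(axis=1)  # one fixed-length vector per clip

    # Hypothetical training clips and their emotion labels.
    clips = ["clip_happy.wav", "clip_sad.wav", "clip_angry.wav"]
    labels = ["happy", "sad", "angry"]

    X = np.vstack([extract_features(p) for p in clips])
    clf = SVC(kernel="rbf").fit(X, labels)

    # Predict the emotional state of an unseen utterance.
    print(clf.predict(extract_features("new_clip.wav").reshape(1, -1)))

In practice, such systems are trained on large labelled speech-emotion corpora and often use richer acoustic features or learned representations, but the pipeline above (feature extraction followed by classification) captures the basic structure of a speech-only approach.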