Current and New Research Perspectives on Dynamic Facial Emotion Detection in Emotional Interface

Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review


In recent years there has been increasing interdisciplinary exchange between psychology and computer science in the field of emotion recognition for future-oriented Human-Computer and Human-Machine Interfaces. Although affective computing research has made enormous progress in automatically recognizing facial expressions, it has not yet been fully clarified how algorithms can learn to encode or decode a human face in a real environment. Consequently, our research focuses on the detection of emotions or affective states in a Human-Machine setting. In contrast to other approaches, we use a psychology-driven approach that tries to minimize complex computations by using a simple dot-based feature extraction method. We suggest a new approach within, but not limited to, a Human-Machine Interface context that detects emotions by analyzing the dynamic change in facial expressions. To situate our approach, we compare our software with other facial-expression studies in the context of its application in a chat environment. Our results indicate that the program can detect emotions with promising accuracy. Implications for further research, as well as for applied issues in many areas of Human-Computer Interaction, particularly affective and social computing, are discussed and outlined.
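The dot-based, dynamics-oriented idea sketched in the abstract can be illustrated with a minimal example: track a small set of facial landmark "dots" across consecutive frames and classify the change from their displacement. The landmark names, the threshold, and the displacement heuristic below are illustrative assumptions for the sketch, not the authors' actual method.

```python
# Hypothetical sketch of dot-based dynamic expression analysis.
# Landmark names, threshold, and the heuristic are assumptions,
# not the method described in the paper.

def displacement(prev, curr):
    """Per-dot displacement vectors (dx, dy) between two frames."""
    return {name: (curr[name][0] - prev[name][0],
                   curr[name][1] - prev[name][1])
            for name in prev}

def classify_change(prev, curr, threshold=2.0):
    """Crude heuristic: mouth corners moving up (negative dy in
    image coordinates) suggest a smile onset; moving down, a frown."""
    d = displacement(prev, curr)
    mean_dy = (d["mouth_left"][1] + d["mouth_right"][1]) / 2.0
    if mean_dy <= -threshold:
        return "smile_onset"
    if mean_dy >= threshold:
        return "frown_onset"
    return "neutral"

# Two consecutive frames of dot coordinates (x, y) in pixels.
frame_a = {"mouth_left": (100.0, 200.0), "mouth_right": (140.0, 200.0)}
frame_b = {"mouth_left": (98.0, 194.0), "mouth_right": (142.0, 195.0)}
print(classify_change(frame_a, frame_b))  # → smile_onset
```

The design point is the one the abstract emphasizes: a handful of tracked dots and simple displacement arithmetic can stand in for heavy per-frame classification, because the signal of interest is the dynamic change between frames rather than any single static expression.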

Original language: English
Title of host publication: Human-Computer Interaction: Advanced Interaction Modalities and Techniques - 16th International Conference, HCI International 2014, Proceedings
Editors: Masaaki Kurosu
Number of pages: 9
Publication date: 2014
Edition: PART 2
ISBN (print): 978-3-319-07229-6
ISBN (electronic): 978-3-319-07230-2
Publication status: Published - 2014
Event: 16th International Conference on Human-Computer Interaction - HCI 2014 - Heraklion, Greece
Duration: 22.06.2014 - 27.06.2014
Conference number: 16