Cues from Facial Expressions for Emotional Interfaces

Research output: Contributions to collected editions/works - Article in conference proceedings - Research - peer-review


Emotion detection provides a promising basis for designing future-oriented, human-centred human-machine interfaces. Affective computing can facilitate human-machine communication; for example, adaptive advanced driver assistance systems (ADAS) that depend on the emotional state of the driver can be applied in cars. The following pilot study evaluated automatic recognition of emotions from facial expressions with N = 1 subject. In contrast to the majority of earlier studies, which used only complex and static recognition methods, a new, non-complex dynamic approach for detecting emotions in facial expressions directly in a driving context is proposed. By analysing the changes within an area defined by a number of dots arranged on the participant's face, variables were extracted to classify the participant's emotions. A special pattern-recognition algorithm detects the dots according to the Facial Action Coding System. The results of our novel way to categorize emotions lead to a discussion of additional applications and limitations, which frames an attempted approach to emotion detection in cars. Implications for further research and applications are outlined.
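The dynamic feature described above, the change of an area defined by tracked dots on the face, can be sketched as follows. This is a minimal illustration under assumptions of my own: the dot layout, the shoelace-formula area computation, and the relative-change feature are illustrative, not the authors' actual parameters or algorithm.

```python
# Hypothetical sketch: compute the area enclosed by tracked facial dots
# (shoelace formula) and its relative change between two frames, as a
# simple dynamic feature for emotion classification. Dot coordinates
# here are invented for illustration.

def polygon_area(points):
    """Magnitude of the signed area of a polygon given (x, y) vertices in order."""
    n = len(points)
    area = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def area_change(prev_points, curr_points):
    """Relative change of the enclosed area between consecutive frames."""
    prev = polygon_area(prev_points)
    curr = polygon_area(curr_points)
    return (curr - prev) / prev if prev else 0.0

# Example: dots around the mouth region widen between two frames.
frame_a = [(0, 0), (4, 0), (4, 2), (0, 2)]   # enclosed area 8
frame_b = [(0, 0), (5, 0), (5, 2), (0, 2)]   # enclosed area 10
print(round(area_change(frame_a, frame_b), 2))  # → 0.25
```

A per-frame sequence of such relative changes (e.g. for dot groups around the mouth, brows, and eyes) could then serve as the extracted variables for classifying emotions.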
Original language: English
Title of host publication: Human Centred Automation: HFES Europe Chapter
Editors: Dick De Waard, Nina Gérard, Linda Onnasch, Rebecca Wiczorek, Dietrich Manzey
Number of pages: 12
Publisher: Shaker Publishing
Publication date: 20.06.2011
Pages: 111-122
ISBN (print): 978-90-423-0406-2
Publication status: Published - 20.06.2011
Event: Human Factors and Ergonomics Society Europe Chapter Annual Conference 2010: Human Centered Automation - Berlin, Germany
Duration: 13.10.2010 - 15.10.2010
https://www.hfes.org/events/national-ergonomics-month/past-events