Using Big Data for Intelligent Mobile Services with Multi-Modal Emotion Recognition

11 Jun 2015

This week I am attending ICOST 2015, the 13th International Conference on Smart Homes and Health Telematics; this year's theme is Inclusive Smart Cities and e-Health. The conference takes place from 10–12 June 2015 in Geneva, Switzerland. I have the honour of presenting our paper on the use of Big Data technology in emotion recognition.

Abstract: Humans express and perceive emotional states in a multi-modal fashion, through facial and acoustic expressions, gesture, and posture. Our task as AI researchers is to give computers the ability to communicate with users in consideration of their emotions. In recognizing a subject's emotions, awareness of the emotional context is critically important. Thanks to the advancement of mobile technology, it is now feasible to collect such contextual data. In this paper, the authors describe a first step towards extracting insightful emotional information using a cloud-based Big Data infrastructure. Relevant aspects of emotion recognition and the challenges that come with multi-modal emotion recognition are also discussed.
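The abstract mentions combining facial, acoustic, gestural, and postural channels. One common way to combine such channels (not necessarily the approach taken in the paper) is weighted late fusion, where each modality produces its own probability distribution over emotion labels and the distributions are averaged with per-modality confidence weights. The modality names, labels, and weights below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of weighted late fusion for multi-modal emotion
# recognition; labels, modalities, and weights are illustrative only.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse_modalities(predictions, weights):
    """Fuse per-modality emotion distributions by weighted averaging.

    predictions: dict mapping modality name -> probability list aligned
                 with EMOTIONS.
    weights:     dict mapping modality name -> confidence weight.
    Returns the fused distribution (weights normalised to sum to 1).
    """
    total = sum(weights[m] for m in predictions)
    fused = [0.0] * len(EMOTIONS)
    for modality, probs in predictions.items():
        w = weights[modality] / total
        for i, p in enumerate(probs):
            fused[i] += w * p
    return fused

# Example: the facial channel strongly favours "happy", the acoustic
# channel is ambiguous; fusion yields a combined estimate.
preds = {
    "face":  [0.7, 0.1, 0.1, 0.1],
    "voice": [0.3, 0.3, 0.2, 0.2],
}
w = {"face": 2.0, "voice": 1.0}
fused = fuse_modalities(preds, w)
label = EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]
```

Late fusion has the practical advantage that each modality can be processed independently (for example, in separate stages of a distributed Big Data pipeline) before the lightweight combination step.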

Yerzhan Baimbetov, Ismail Khalil, Matthias Steinbauer, Gabriele Anderst-Kotsis

A big thank you to my friend and colleague Yerzhan Baimbetov, the main author of this paper, who implemented the proposed Big Data framework and allowed us to put our ideas to the test.

Slides of my talk at ICOST 2015 can be downloaded here.

The paper is published with Springer.
