K. Karpouzis, A. Drosopoulos, S. Ioannou, A. Raouzaiou, N. Tsapatsoulis, S. Kollias
Facial and Body Feature Extraction for Emotionally-Rich HCI
Mehdi Khosrow-Pour (ed.), Encyclopedia of Information Science and Technology, Idea Group Publishers, pp. 1180-1185, 2005
ABSTRACT
Emotionally-aware Man-Machine Interaction (MMI) systems are presently at the forefront of interest in the computer vision and artificial intelligence communities, since they give less technology-aware people the opportunity to use computers more efficiently, overcoming fears and preconceptions. Most emotion-related facial and body gestures are considered universal, in the sense that they are recognized across different cultures; therefore, the introduction of an “emotional dictionary” that includes descriptions and perceived meanings of facial expressions and body gestures, so as to help infer the likely emotional state of a specific user, can enhance the affective nature of MMI applications. As a general rule, our intuition of what a human expression represents is based on trying to mimic the way the human mind works when making an effort to recognize such an emotion. This means that even though image or video input is necessary for this task, the process cannot produce robust results without taking into account features like hand gestures or body pose. These features are able to convey messages in a much more expressive and definite manner than mere wording, which can be misleading or ambiguous. Sometimes a simple hand action, such as placing one's hands over one's ears, can pass on the message that one has had enough of what one is hearing more expressively than any spoken phrase.
|
07 January 2005