Gait Recorded by Smart Phone can Reveal Your Emotion
 
Author: Prof. ZHU Tingshao's Research Group      Update time: 2015/11/12

Most research on emotion has focused on nonverbal signals, which are regarded as extrinsic expressions of a person's intrapsychic state. Emotion detection aims to determine a person's affective state automatically and has immense potential in many areas, including health care, psychological assessment, and human-computer interaction. Traditional emotion detection relies on facial expressions or on linguistic and acoustic features of speech. However, processing image and audio data inevitably involves high computational complexity.

 

Dr. ZHU Tingshao's Computational CyberPsychology Lab (CCPL) at the Institute of Psychology, Chinese Academy of Sciences has proposed a novel method for identifying human emotion from natural walking. The results indicate that emotion can be identified from gait data alone. Moreover, the ankle reveals human emotion (angry/neutral/happy) more reliably than the wrist.

 

Linear acceleration and gravity data were recorded by the acceleration sensors embedded in smart phones. Two rounds of experiments were conducted as follows. In a fixed rectangular area marked on the floor with red lines, each participant, after signing the consent form, wore one smart phone on a wrist and another on an ankle and stood in front of the starting line. Once ready, the participant was asked to walk naturally back and forth in the area for about two minutes, and was then asked to report his/her current emotional state (anger) with a score from 1 to 10. The participant next watched film clips for emotion priming. After watching, the participant walked naturally back and forth in the same area for another minute, just as before, and then reported his/her current anger score as well as the recalled anger score from right after the film clips. The second round followed the same procedure using happy film clips and happiness scores.
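A common way to turn raw walking recordings like these into model inputs is to segment the acceleration stream into fixed-length windows and compute simple per-axis statistics. The sketch below is illustrative only; the sampling rate, window length, and feature set are assumptions, not details reported by the study.

```python
import numpy as np

def extract_gait_features(accel, fs=50, window_s=2.0):
    """Split a (n_samples, 3) acceleration recording into fixed windows
    and compute simple per-axis statistics (mean, std, min, max)."""
    win = int(fs * window_s)                  # samples per window
    n_windows = len(accel) // win
    feats = []
    for i in range(n_windows):
        seg = accel[i * win:(i + 1) * win]    # one (win, 3) window
        stats = [seg.mean(axis=0), seg.std(axis=0),
                 seg.min(axis=0), seg.max(axis=0)]
        feats.append(np.concatenate(stats))   # 4 stats x 3 axes = 12 features
    return np.array(feats)

# Example: two minutes of simulated 3-axis walking data at 50 Hz
rng = np.random.default_rng(0)
accel = rng.normal(size=(50 * 120, 3))
X = extract_gait_features(accel)
print(X.shape)  # (60, 12): 60 two-second windows, 12 features each
```

Each two-minute walk then yields a matrix of window-level feature vectors that can be fed to any of the classifiers mentioned below.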

 

In the experiment, all trained models (Random Tree, Random Forest, Support Vector Machine, Multilayer Perceptron, and Decision Tree) showed a marked difference in gait before and after watching the film clips. Emotion identification accuracy was higher on data from the ankle than from the wrist. Among these models, the Support Vector Machine performed best: its accuracy was 90.31% for identifying angry vs. neutral and 89.76% for happy vs. neutral, while in the three-class setting the accuracies for angry, neutral, and happy were 85%, 78%, and 78%, respectively.
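The binary classification setup described above (e.g. angry vs. neutral) can be sketched with a standard SVM pipeline. The synthetic features, class separation, and hyperparameters here are placeholders for illustration, not the study's actual data or configuration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for windowed gait features labelled by emotion
rng = np.random.default_rng(0)
X_neutral = rng.normal(0.0, 1.0, size=(100, 12))   # 12 features per window
X_angry = rng.normal(0.8, 1.0, size=(100, 12))     # shifted distribution
X = np.vstack([X_neutral, X_angry])
y = np.array([0] * 100 + [1] * 100)                # 0 = neutral, 1 = angry

# Scale features, then fit an RBF-kernel SVM; score with 5-fold CV
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Cross-validated accuracy on held-out windows is the kind of figure the reported 90.31% (angry vs. neutral) and 89.76% (happy vs. neutral) results correspond to.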

 

This study was supported by the National High-tech R&D Program of China (2013AA01A606), the National Basic Research Program of China (2014CB744600), the Key Research Program of the Chinese Academy of Sciences (CAS) (KJZD-EWL04), and the CAS Strategic Priority Research Program (XDA06030800).

 

Liqing Cui, Shun Li, Tingshao Zhu. Emotion Detection from Natural Walking. Human-Centered Computing (HCC) 2016, Jan. 2016, Moratuwa, Sri Lanka. (Accepted)
