Leveraging this somatic lexicon, researchers at the University of North Carolina at Chapel Hill and the University of Maryland recently investigated a machine learning method that can identify a person’s perceived emotion, valence (negative or positive), and arousal (calm or energetic) from their gait alone.

The researchers claim this approach — which they believe is the first of its kind — achieved 80.07% accuracy in preliminary experiments.

“Emotions play a large role in our lives, defining our experiences and shaping how we view the world and interact with other humans,” wrote the coauthors.

“Because of the importance of perceived emotion in everyday life, automatic emotion recognition is a critical problem in many fields, such as games and entertainment, security and law enforcement, shopping, human-computer interaction, and human-robot interaction.”

The researchers selected four emotions — happy, sad, angry, and neutral — for their tendency to “last an extended period” and their “abundance” in walking activity.

They then extracted gaits from multiple walking video corpora, used a 3D pose estimation technique to recover each walker’s poses, and identified affective features from those poses.
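To make this pipeline concrete, here is a minimal sketch of what computing affective features from a 3D pose sequence might look like. This is not the authors’ code: the joint layout, the `affective_features` function, and the specific features (slump, shoulder width, speed, stride) are illustrative assumptions, chosen because posture and movement cues of this kind are commonly used proxies for emotional expression in gait.

```python
import numpy as np

# Assumed joint layout for a simplified 7-joint skeleton (hypothetical).
HEAD, NECK, ROOT = 0, 1, 2
LEFT_FOOT, RIGHT_FOOT = 3, 4
LEFT_SHOULDER, RIGHT_SHOULDER = 5, 6


def affective_features(poses: np.ndarray, fps: float = 30.0) -> np.ndarray:
    """Compute a small feature vector from a pose sequence.

    `poses` has shape (num_frames, num_joints, 3), i.e. one 3D position
    per joint per frame, as a pose estimator might produce.
    """
    # Posture: mean vertical head-to-root extent (lower = more slumped).
    spine = poses[:, HEAD] - poses[:, ROOT]
    slump = np.mean(spine[:, 1])

    # Posture: mean shoulder width, a proxy for expanded vs. contracted pose.
    shoulder_width = np.mean(
        np.linalg.norm(poses[:, LEFT_SHOULDER] - poses[:, RIGHT_SHOULDER], axis=1)
    )

    # Movement: average walking speed of the body root.
    root_velocity = np.diff(poses[:, ROOT], axis=0) * fps
    speed = np.mean(np.linalg.norm(root_velocity, axis=1))

    # Movement: average distance between the feet, a rough stride measure.
    stride = np.mean(
        np.linalg.norm(poses[:, LEFT_FOOT] - poses[:, RIGHT_FOOT], axis=1)
    )

    return np.array([slump, shoulder_width, speed, stride])


# Example: 120 frames of 7 joints (random stand-in for real pose data).
features = affective_features(np.random.rand(120, 7, 3))
print(features)
```

A feature vector like this could then be fed to any standard classifier over the four emotion labels; the paper’s actual method combines such hand-crafted affective features with deep features learned from the pose sequences.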
