Kinect System Reads Emotions To Help Autistic Children Socialize

Italian researchers have built a Kinect-based system to identify the emotions being felt by a person sitting in front of it.

Reading someone’s emotions without them saying a word may seem like a uniquely human skill, but researchers in Italy are using the motion-tracking capabilities of Microsoft Kinect to do the very same thing.

University of Genoa professor Antonio Camurri and his team have created a system that uses the Kinect’s depth and motion tracking to assess a person’s emotions from their body movements. Computers have been used to detect emotional states before, reports New Scientist, but those programs relied on facial recognition or voice recordings, whereas the Kinect requires a person to do nothing other than be in the room. The system tracks their body movements, builds a stick figure that reflects their posture, and the software compares that posture to body positions commonly associated with particular emotions. When tested against humans reading the same emotions, the system was nearly on par: volunteers assessed emotions correctly 61.9% of the time, while the system was correct 61.3% of the time.
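
As a rough illustration of the posture-matching idea described above, the sketch below reduces a tracked stick figure to a few coarse posture features and picks the nearest emotion-typical reference pose. The joint names, feature choices, reference values, and nearest-neighbour matching are assumptions made purely for illustration; they are not the researchers’ published method.

```python
# Illustrative sketch only: joint names, reference postures, and the
# nearest-neighbour matching are assumptions, not the actual system.
# It mimics the idea of reducing a tracked "stick figure" to posture
# features and comparing them to emotion-typical poses.

import math

# A stick figure as a few (x, y, z) joint positions (metres, Kinect-style axes).
Skeleton = dict  # joint name -> (x, y, z)

def posture_features(sk: Skeleton) -> tuple:
    """Reduce a skeleton to coarse posture features:
    head drop, arm spread, and body contraction."""
    head_drop = sk["shoulder_center"][1] - sk["head"][1]
    arm_spread = abs(sk["hand_left"][0] - sk["hand_right"][0])
    contraction = math.dist(sk["hand_left"], sk["spine"]) + math.dist(
        sk["hand_right"], sk["spine"]
    )
    return (head_drop, arm_spread, contraction)

# Hypothetical reference feature vectors for emotion-typical postures.
REFERENCES = {
    "happy":   (-0.15, 1.20, 1.60),  # head up, arms wide and away from body
    "sad":     (0.10, 0.30, 0.50),   # head dropped, arms close, contracted
    "angry":   (-0.05, 0.80, 1.00),
    "fearful": (0.20, 0.15, 0.30),
}

def classify(sk: Skeleton) -> str:
    """Return the emotion whose reference posture is nearest to this skeleton."""
    feats = posture_features(sk)
    return min(
        REFERENCES,
        key=lambda emotion: math.dist(feats, REFERENCES[emotion]),
    )

if __name__ == "__main__":
    slumped = {
        "head": (0.0, 1.45, 2.0),
        "shoulder_center": (0.0, 1.50, 2.0),
        "spine": (0.0, 1.20, 2.0),
        "hand_left": (-0.10, 1.00, 2.0),
        "hand_right": (0.10, 1.00, 2.0),
    }
    print(classify(slumped))  # -> "sad" with these made-up references
```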

Camurri is using the system to create games that help autistic children understand and express emotions.


Source: New Scientist

Image: FolioVision
