Researchers have developed an emotion recognition system that adapts to users' emotional states.

What difference would it make if automated phone calls or services could tell what emotional state we were in? How would a system respond if it were aware that we were clearly getting angry at its pre-programmed questions and endless updates? According to a group of researchers at Spain's Universidad Carlos III de Madrid (UC3M) and the Universidad de Granada, it could make phone calls shorter, more successful and, most importantly, less stressful.

The system infers a caller's emotional state from vocal data. It analyzes tone of voice and speech patterns to deduce the individual's emotional state and decides how to proceed based on that information. The system changes its behavior accordingly, for example, which questions it asks, how it asks them and where it directs people, based on the calculated emotions of the person on the other end of the line. Beyond reading emotion from this data, it also uses a statistical method to interpret the speaker's intentions, predicting whether or not they will continue the conversation.
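The article does not describe the researchers' actual model, but the idea of classifying emotion from vocal features and adapting the dialog can be sketched as follows. The feature names, thresholds, and prompts below are all invented for illustration:

```python
# Hypothetical sketch: infer a caller's emotional state from simple
# acoustic features and adapt the dialog accordingly. Thresholds and
# features are illustrative, not the researchers' actual model.

def infer_emotion(pitch_hz: float, energy: float, speech_rate: float) -> str:
    """Classify the caller as 'angry' or 'neutral' from voice features.

    Raised pitch, louder speech, and a faster speaking rate are common
    correlates of anger; the cutoffs below are invented for this sketch.
    """
    score = 0
    if pitch_hz > 220:       # elevated average pitch
        score += 1
    if energy > 0.7:         # louder than typical conversational level
        score += 1
    if speech_rate > 5.0:    # syllables per second, faster than usual
        score += 1
    return "angry" if score >= 2 else "neutral"


def next_prompt(emotion: str) -> str:
    """Adapt the dialog: an angry caller gets a shorter path."""
    if emotion == "angry":
        return "I can connect you to an agent right away."
    return "Please describe your request in a few words."


if __name__ == "__main__":
    calm = infer_emotion(pitch_hz=180, energy=0.4, speech_rate=4.0)
    upset = infer_emotion(pitch_hz=260, energy=0.9, speech_rate=6.5)
    print(calm, "->", next_prompt(calm))
    print(upset, "->", next_prompt(upset))
```

A production system would replace the hand-set thresholds with a statistical classifier trained on labeled call recordings, but the control flow, extract features, classify, then branch the dialog, is the same.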
