Google Glass App Reads Human Emotions and Feelings
Face-tracking identifies moods and feelings via facial expressions, and converts them into a report.
San Diego startup Emotient’s app will give Google Glass cameras the ability to read people’s emotions in real time.
The technology captures human sentiment by processing facial expressions and producing an aggregate emotional analysis, measured at the most basic level as positive, negative, or neutral. More advanced feelings captured include joy, surprise, sadness, fear, disgust, contempt, and anger; deeper into the emotional layers, the software can detect frustration and confusion.
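The article does not describe Emotient's internals, but the aggregation step it mentions can be sketched in a few lines: assume a hypothetical per-frame emotion classifier has already labeled each video frame, and roll those labels up into the basic positive/negative/neutral report. The label names and the mapping (e.g. treating surprise as positive) are illustrative assumptions, not Emotient's actual taxonomy.

```python
from collections import Counter

# Assumed groupings, modeled loosely on the emotions the article lists;
# Emotient's real mapping is not public here.
POSITIVE = {"joy", "surprise"}
NEGATIVE = {"sadness", "fear", "disgust", "contempt", "anger"}

def aggregate_sentiment(frame_emotions):
    """Roll per-frame emotion labels into a basic sentiment report:
    the fraction of frames read as positive, negative, or neutral."""
    counts = Counter()
    for emotion in frame_emotions:
        if emotion in POSITIVE:
            counts["positive"] += 1
        elif emotion in NEGATIVE:
            counts["negative"] += 1
        else:
            counts["neutral"] += 1
    total = len(frame_emotions)
    return {key: counts[key] / total
            for key in ("positive", "negative", "neutral")}

# Example: four frames of hypothetical classifier output.
report = aggregate_sentiment(["joy", "neutral", "anger", "joy"])
```

In this sketch, `report` comes out as 50% positive, 25% negative, 25% neutral for the sample frames; a real system would weight by classifier confidence rather than count hard labels.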
The software’s core industry use is slated for retail. Emotient can be used to gauge reactions to customer service and marketing tactics, or as a suggestion tool for consumers with conflicting buying intentions: monitoring the emotions they display while debating between products, yearning to indulge, or considering an advertisement.