AI Mask Amplifies Human Emotions In Real Time


Hyperface is a wearable device that uses artificial intelligence to display the wearer's emotions in augmented reality

Zack Palm
  • 15 August 2017

With augmented reality technology, people started to look at the world a little differently. Though virtual and physical objects could appear to occupy the same space, direct connections between the digital and real worlds remain limited. London-based designer Eun Kyung Shin wanted to try to blend the two together with AR for social interaction, so she came up with a wearable called Hyperface. The device grants the wearer the ability to express emotions more clearly through their face during unexpected moments that happen in a social setting.

The Hyperface is worn like a visor but includes a transparent screen that flips down in front of the eyes. A screen at the top of the visor reflects an image onto the one over the wearer’s eyes, making it appear as a digital face. Someone looking at the wearer sees a pair of digital eyes staring back at them, which change based on the wearer’s facial expressions.

The device uses an experimental artificial-intelligence algorithm to determine the wearer’s emotions. A different set of eyes appears on the display depending on what the algorithm determines the wearer feels. For example, if the wearer were at a bar with a stranger attempting to talk to them, the wearer’s face might express discomfort, and the Hyperface would amplify it. Of course, not all of the emotions displayed through the Hyperface focus on the negative. If the wearer sees a person they really like, a pair of hearts appears on the visor.
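The article doesn't document Shin's actual software, but the behavior it describes, a detected emotion selecting a graphic for the visor, amounts to a simple lookup. The sketch below is purely illustrative; the emotion labels, glyph names, and function are assumptions, not details from the Hyperface implementation:

```python
# Hypothetical sketch of an emotion-to-display mapping like the one the
# article describes. None of these names come from Shin's actual device.

EYE_DISPLAYS = {
    "discomfort": "narrowed eyes",   # e.g. the bar scenario in the article
    "affection": "pair of hearts",   # seeing a person the wearer likes
    "neutral": "neutral eyes",
}

def display_for(emotion: str) -> str:
    """Return the visor graphic for a detected emotion, defaulting to neutral."""
    return EYE_DISPLAYS.get(emotion, EYE_DISPLAYS["neutral"])
```

In a real device the input would come from a facial-expression classifier rather than a string label, but the amplification step itself is just this kind of mapping from an inferred state to an exaggerated visual.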

Shin wanted the device to display a person’s emotions as closely as possible. She refers to the faces we normally put on in public as ‘social masks,’ especially when people deal with intense situations. During these times, not everyone can tell how another person feels based on how they look. The intention of the device was to make visual communication easier, since humans naturally notice key features in a person’s facial expressions and body language.

Shin created Hyperface in the Innovation Design Engineering program at the Royal College of Art in London.


