This Wearable Hears You Even Without Speech

This concept device from the MIT Media Lab detects muscle movements so wearers don't have to make a sound to communicate with artificial intelligence

Leo Lutero
  • April 11, 2018

Improvements in Google Assistant, Siri and other voice assistants have convinced us it's great to be able to talk to our phones. An MIT project is trying to push this experience further with AlterEgo, a device worn on the head. With it, there is no need to make a sound in order to be heard by our smartphones.

The technology stops just short of reading the wearer's mind. Electrodes on the device detect fine muscle movements as the wearer "subvocalizes." Subvocalizing is the practice of "talking inside your head," something many people do when reading slowly. It subtly activates muscles in the vocal cords and other parts of the vocal tract without producing audible sound or visible motion.

The device detects these movements and interprets them using artificial intelligence. Because musculoskeletal structures and movements vary from person to person, the device requires calibration for each new wearer. Neural networks then correlate the detected movements with words.
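The calibrate-then-classify flow described above can be sketched in miniature. This is a hypothetical illustration, not the MIT team's actual pipeline: it assumes fixed-length feature vectors from four electrode channels and substitutes a simple nearest-centroid classifier for the neural network AlterEgo actually uses.

```python
import numpy as np

# Hypothetical sketch of per-wearer calibration and word classification.
# Assumptions (not from the article): each subvocalized utterance is
# reduced to one feature vector over 4 electrode channels, and a
# nearest-centroid rule stands in for the real neural network.

N_CHANNELS = 4  # the article notes the final design uses four electrodes

def calibrate(samples_by_word):
    """Per-wearer calibration: average each word's training samples
    into a single centroid (word -> mean feature vector)."""
    return {word: np.mean(samples, axis=0)
            for word, samples in samples_by_word.items()}

def classify(centroids, signal):
    """Map a new 4-channel signal to the closest calibrated word."""
    return min(centroids, key=lambda w: np.linalg.norm(signal - centroids[w]))

# Toy calibration data: two "words" with distinct muscle-activation patterns.
rng = np.random.default_rng(0)
training = {
    "yes": rng.normal(loc=[1.0, 0.2, 0.8, 0.1], scale=0.05, size=(10, N_CHANNELS)),
    "no":  rng.normal(loc=[0.1, 0.9, 0.2, 0.7], scale=0.05, size=(10, N_CHANNELS)),
}
centroids = calibrate(training)
print(classify(centroids, np.array([0.95, 0.25, 0.75, 0.15])))  # prints "yes"
```

The per-wearer calibration step mirrors the article's point that signals differ between people: the same word map would misfire if reused across wearers without recollecting training samples.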

Arnav Kapur, the graduate student leading the development of the device, says the interface can improve its accuracy as it is used. The first prototype used 16 electrodes in an unsightly mesh worn around the neck; further experimentation trimmed that down to just four, which can be embedded in a wearable that looks a lot like an oversized headset.

The device also features bone conduction speakers, which provide sound that only the wearer can hear. The technology transmits vibrations through the bones of the skull to the inner ear, bypassing the eardrum.

A glaring advantage of such a system is its discreetness. In one experiment, AlterEgo was able to coach a wearer through a chess game without onlookers noticing. Kapur told MIT News he wants to build a new class of devices, commenting, “The motivation for this was to build an IA device—an intelligence-augmentation device. Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?”


Lead Image: Jimmy Day | CC | Image cropped

+artificial intelligence
+bone conduction
+MIT Media Lab
