This Wearable Hears You Even Without Speech
This concept device from the MIT Media Lab detects muscle movements so wearers don't have to make a sound to communicate with artificial intelligence
Improvements in Google Assistant, Siri and other voice assistants have convinced us it's useful to be able to talk to our phones. An MIT project aims to take that experience further with AlterEgo, a device worn on the head. With it, there is no need to make a sound in order to be heard by our smartphones.
The technology stops just short of reading the wearer’s mind. Electrodes on the device detect fine muscle movements as the wearer “subvocalizes.” Subvocalizing is the practice of “talking inside your head,” something many people do when reading slowly. It subtly activates muscles in the vocal cords and elsewhere without producing audible sound or visible motion.
The device detects these movements and interprets them using artificial intelligence. Because musculoskeletal structures and movements vary from person to person, the device requires calibration for each new wearer. Neural networks then map the detected movements to words.
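The calibrate-then-classify idea can be sketched in a few lines. The following is an illustrative toy only, not AlterEgo's actual pipeline: it invents a four-word vocabulary and fake four-electrode signal windows, "calibrates" by averaging each word's recordings into a template, and classifies new windows by nearest template (a stand-in for the real neural network).

```python
# Toy sketch of per-wearer calibration and word classification.
# All signal shapes, the word list, and the nearest-centroid model
# are invented for illustration; the real device uses neural networks.
import numpy as np

rng = np.random.default_rng(0)
WORDS = ["yes", "no", "up", "down"]   # hypothetical vocabulary
N_ELECTRODES, WINDOW = 4, 50          # four electrodes, 50 samples each

# Fake ground truth: each word has a characteristic signal pattern.
templates = rng.normal(size=(len(WORDS), N_ELECTRODES * WINDOW))

def record(word_idx, n=30):
    """Simulate n noisy 'subvocalization' recordings of one word."""
    noise = rng.normal(scale=0.5, size=(n, N_ELECTRODES * WINDOW))
    return templates[word_idx] + noise

# Calibration: collect labeled recordings from the new wearer...
X = np.vstack([record(i) for i in range(len(WORDS))])
y = np.repeat(np.arange(len(WORDS)), 30)

# ...and average each word's recordings into a per-wearer centroid.
centroids = np.vstack([X[y == i].mean(axis=0) for i in range(len(WORDS))])

def classify(window):
    """Return the word whose centroid is closest to the signal window."""
    dists = np.linalg.norm(centroids - window, axis=1)
    return WORDS[int(np.argmin(dists))]

# A fresh noisy recording of word 2 should come back as "up".
print(classify(record(2, n=1)[0]))
```

The point of the sketch is the structure, not the model: calibration data is gathered per wearer because everyone's muscle signals differ, and the classifier is fit to that wearer before it can turn silent movements into words.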
Arnav Kapur, the graduate student leading development of the device, says the interface improves its accuracy as it is used. The original prototype used 16 electrodes in an unsightly mesh worn around the neck; further experimentation trimmed that down to just four. These can be embedded in a wearable that looks a lot like an oversized headset.
The device also features bone-conduction speakers, which provide audio only the wearer can hear by transmitting vibrations through the bones of the skull to the inner ear.
An obvious advantage of such a system is its discreetness. In one experiment, AlterEgo coached a wearer through a chess game without onlookers noticing. Kapur told MIT News he wants to build a new class of devices, commenting, “The motivation for this was to build an IA device—an intelligence-augmentation device. Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?”