Ever wish your computer could read your mind? A group of MIT graduate students has created an interface that comes remarkably close to that.

They've built prototypes of a headset-like device that takes its input from the neuromuscular activity involved in speech. Much like Alexa and other voice-activated AI assistants, it turns that speech into action through a small library of applications: the headset can solve complicated arithmetic or text a friend. And--here's the cool thing--you don't actually have to vocalize the commands to work it.
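In software terms, that loop--silent command in, answer fed back to the wearer--looks roughly like a recognize-and-dispatch pipeline. The Python sketch below is purely illustrative: the function names, the hard-coded recognizer, and the tiny two-app library are assumptions made for the sake of the example, not AlterEgo's actual code.

```python
# Illustrative sketch only: a recognize-and-dispatch loop in the spirit of the
# device described above. Every name here is hypothetical, not AlterEgo code.

def recognize_silent_command(emg_window):
    """Stand-in for a trained model mapping neuromuscular signals to words."""
    return "calculate 1452*18"   # hard-coded so the sketch runs end to end

def do_arithmetic(expression):
    # Evaluate a bare arithmetic expression; builtins are stripped as a precaution.
    return str(eval(expression, {"__builtins__": {}}))

def send_text(args):
    contact, _, message = args.partition(": ")
    return f"Sent to {contact}: {message}"

APPS = {"calculate": do_arithmetic, "text": send_text}   # the "small library"

def handle(emg_window):
    intent, _, args = recognize_silent_command(emg_window).partition(" ")
    return APPS[intent](args)   # the reply would go back over bone-conduction audio

print(handle(emg_window=None))   # -> 26136
```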

As you silently mouth commands, a transmitter at the back of the headset uses bone-conduction audio to feed answers back to you, without plugging your ears or dampening your regular hearing.

"The experience is like having the entire internet in your head, and a little AI agent, who can do things for you, perched on your shoulder," said Arnav Kapur, the 24-year-old master's student at the MIT Media Lab who came up with the idea for the system, which he calls AlterEgo.

Kapur started thinking about AlterEgo three years ago and began working on it in earnest in 2017. The device was partly inspired by what he saw as the culture's deepening smartphone addiction.

"I think as much as our devices enable us, they unplug us from our actual environment," he said. "Our interaction with computing feels like it's not designed for us." He wondered: What if you could have a computer that was an invisible extension of your own thoughts?

Reading brain activity was not impossible, but Kapur thought it far too intrusive--and perhaps unhelpful (there's a lot of random noise in there). Reading speech, on the other hand, would be faster than texting, and he realized that the electrical signals driving the muscles in and around the mouth are present whether or not the words are actually spoken aloud.
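For readers who want to see what that insight looks like in practice, it maps onto a familiar machine-learning recipe: slice the muscle signals into short windows, turn each window into a feature vector, and classify it against a small vocabulary. The sketch below is a generic illustration with made-up data and an assumed seven-electrode setup; it is not the model from Kapur's paper.

```python
# Generic sketch of silent-speech word recognition from muscle signals.
# The data, electrode count, and vocabulary are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
VOCAB = ["call", "calculate", "text", "translate"]
N_ELECTRODES, WINDOW = 7, 250   # assumed: 7 skin electrodes, 250-sample windows

def featurize(window):
    # Simple per-electrode summary statistics instead of a learned representation.
    return np.concatenate([window.mean(axis=1), window.std(axis=1)])

# Fake "recordings": one array of shape (electrodes, samples) per labeled word.
X = np.array([featurize(rng.normal(size=(N_ELECTRODES, WINDOW))) for _ in range(200)])
y = rng.integers(len(VOCAB), size=200)

clf = LogisticRegression(max_iter=1000).fit(X, y)

new_window = rng.normal(size=(N_ELECTRODES, WINDOW))   # signals from a silently mouthed word
print(VOCAB[clf.predict([featurize(new_window)])[0]])
```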

Silent speech was a breakthrough. "You could be in a meeting or on a voice call and be secretly conducting another call--or you could quickly search for a term you don't know," Kapur said. "Or if you're meeting with someone who does not speak English, the headset could translate for you."

AlterEgo could also work as a secret remote control for any internet-connected device--or, rather, turn the wearer into a human remote control.

Kapur and his team at MIT are now tackling the challenge of building additional utility into the technology, in the form of more apps and a wider library of recognized words. He's also not yet satisfied with the design of the clunky first-generation prototype, and is reworking it to conform better to the wearer's skin. His goal is a version that is nearly invisible: both noninvasive and almost imperceptible. Whether people will be able to tolerate silently muttering to themselves--or watching others do so--remains to be seen.

While the innovation is still in the research phase, Kapur says that since he published a paper on it in March, he's gotten plenty of interest from investors. Will AlterEgo become a real company? He's not talking--not audibly at least--about that yet.

"We're thinking about our options," was all he'd say.

Published on: Nov 5, 2018