The Computer That Knows How You Really Feel
Artificial intelligence is learning how to improve its emotional IQ.
Open Moodies, an app created by the Israeli company Beyond Verbal, start speaking, and it will analyze how you're feeling. Moodies gives feedback on the emotions behind whatever is being said, in real time. It not only reads your general mood, but also provides a short description of your attitude and a longer analysis of your "emotional decision-making trajectory."
That last component of the analysis might sound the most vague, but it has the potential to be the most descriptive and useful. Imagine knowing that the person you're speaking to feels "lonely" and as a facet of that emotion has "difficulty accepting authority."
It's enough to make you think Siri is, well, a little cold and distant.
And the app is only the beginning. Beyond Verbal's work in this new field of emotional analytics signals all kinds of interesting business opportunities in the future.
The Business of Emotional Analytics
Beyond Verbal's Moodies app is really just an amuse-bouche, offered free to anyone to get a taste of what the technology can do and have a little fun with it. Because it's not listening to specific words but rather a speaker's vocal modulation, intonation, and pacing, the technology transcends language. (It also, apparently, transcends humans: the company has posted a tongue-in-cheek video of the app analyzing the vocalizations of well-known robot characters, such as R2-D2.)
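Beyond Verbal hasn't published how its analysis works, but the prosodic features it points to, such as pitch (intonation), loudness, and pacing, are standard signal-processing quantities. As a rough illustration only, and not the company's actual method, here is a minimal Python sketch that estimates the pitch of an audio signal via autocorrelation, using a synthetic tone in place of a real voice recording:

```python
import numpy as np

def estimate_pitch(signal, sample_rate, fmin=50.0, fmax=500.0):
    """Estimate fundamental frequency (Hz) via autocorrelation.

    A toy illustration of prosodic-feature extraction; real systems
    use far more robust pitch trackers.
    """
    signal = signal - np.mean(signal)           # remove DC offset
    corr = np.correlate(signal, signal, mode="full")
    corr = corr[len(corr) // 2:]                # keep non-negative lags
    lag_min = int(sample_rate / fmax)           # shortest plausible period
    lag_max = int(sample_rate / fmin)           # longest plausible period
    lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / lag

# Stand-in for a speaker: a quarter-second 220 Hz tone.
sr = 8000
t = np.linspace(0, 0.25, int(sr * 0.25), endpoint=False)
tone = np.sin(2 * np.pi * 220 * t)

print(estimate_pitch(tone, sr))  # close to 220 Hz
```

An emotion-analytics pipeline would track how features like this vary over the course of an utterance, rather than computing a single number.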
"It allows all voice-enabled devices to understand emotion and allows them the potential to react to us on an emotional level," says Dan Emodi, the company's vice president of marketing.
If that's true, the business potential goes far beyond a fun app.
Yoram Levanon, the company's chief science officer, began research in the field 19 years ago to discover what, beyond body language, allows prespeech babies and animals to understand human emotion. He teamed up with Yuval Mor, a serial executive, to found Beyond Verbal in 2012. The team raised $3.8 million in venture capital funding, and it thinks it has a big business on its hands.
The business model involves creating and licensing the voice-analysis software to other companies, so they can build useful applications with it. The company predicts its technology will be useful in call centers, to monitor the tone of employees, and in smart hardware such as watches and appliances, or even airplanes and automobiles. It could be used to help screen job applicants or to calm upset customers.
"Understanding emotions is introducing the most important nonexistent interface out there," Emodi says. "It transcends verticals."
The data that went into creating Beyond Verbal's software includes analysis by physicists and neuropsychologists, as well as self-reported data, on 70,000 test subjects across more than 30 languages.
Now, with the growth in the use of Moodies, more than half a million people have had their speech analyzed, according to Emodi.
Beyond Verbal operates in a growing field known as affective computing: helping computers better understand human emotion. Other companies in the space include nViso, which tracks eye movements, as well as tiny muscle movements in people's faces, in reaction to what they're seeing on a screen. A company called Affectiva similarly uses physiological responses to gauge how viewers respond to on-screen advertising.
Fighting the Creepy Factor
It's no surprise this entire field is being met with a raised eyebrow from consumers. When I asked Emodi whether he's gotten responses from users calling Moodies, or other applications of the Beyond Verbal technology, "creepy," he said the company has always known that would be a possible reaction, but only because it is operating at the frontier of computing's possibilities.
"I think every new technology has a little creepy factor," he says. "From the Internet to the invention of fire."
In terms of the actual applications of the technology, Emodi says there's far more benefit to using it to interpret one's own speech than that of others.
"Frankly, we as people are pretty good at understanding other people's emotions," he says. "But what we are not good at is figuring out how we come across. It could help people have better control and connection to their emotional self."
There are potential applications for the broad swaths of data the Beyond Verbal software can collect, but for now the company is limiting access mostly to researchers and nonprofits conducting psychological studies, including studies of the relationships between babies and mothers. And it is carefully screening potential uses of the data.
"For instance, we wouldn't license the technology to the North Korean government," Emodi says. "We are trying the best we can to steer clear of anything problematic."
CHRISTINE LAGORIO-CHAFKIN | Senior Writer
Christine Lagorio-Chafkin is a writer, editor, and reporter whose work has appeared in The New York Times, The Washington Post, The San Francisco Chronicle, The Village Voice, and The Believer, among other publications. She is a senior writer at Inc.