Would you be more receptive to working with a robot if the robot--like a human--knew when to make direct eye contact, and when to look away?

Sean Andrist, a PhD student at the University of Wisconsin-Madison, is building robots that can do just that, according to an article by Evan Ackerman on IEEE Spectrum's award-winning robotics blog.

His thesis is straightforward. When chatting, humans use their eyes to send tacit signals about conversational turn-taking. We tend to look away when we start speaking. In groups, we tend to look toward the person whom we believe should speak next. We often need to be reminded to make eye contact in situations like job interviews. We tend to look away when we're distracted, shy, or uncomfortable. 

All of these tendencies are called "social gaze behaviors." Andrist's research focuses on programming robots with these behaviors, on the theory that a robot with properly attuned social gaze will communicate more smoothly with humans.
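To make the idea concrete, the turn-taking tendencies described above can be sketched as a tiny rule-based gaze policy. This is purely illustrative; the rules, function name, and labels below are assumptions for the sketch, not Andrist's actual algorithm.

```python
# Toy social-gaze policy illustrating conversational turn-taking cues.
# All rules here are illustrative, not Andrist's published method.

def choose_gaze(is_speaking, finishing_turn=False, next_speaker=None):
    """Return a gaze target for a conversational robot.

    is_speaking:    the robot currently holds the floor
    finishing_turn: the robot is about to yield the floor
    next_speaker:   who the robot expects to speak next (group setting)
    """
    if is_speaking and not finishing_turn:
        return "avert"        # speakers tend to look away mid-turn
    if finishing_turn and next_speaker:
        return next_speaker   # cue the next speaker with directed gaze
    return "listener"         # default: make eye contact while listening

print(choose_gaze(is_speaking=True))   # avert
print(choose_gaze(is_speaking=True, finishing_turn=True,
                  next_speaker="Alice"))  # Alice
print(choose_gaze(is_speaking=False))  # listener
```

A real system would of course drive these decisions from perception (head pose, eye tracking, speech activity) rather than hand-set flags, but the state logic is the same in spirit.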

To conduct his research, Andrist uses a three-step methodology. First, he studies human social gaze behavior by observing human conversations. Second, based on the data from these conversations, he creates an algorithm he can implement on a robot platform. Third, he evaluates the robot's use of its programmed social gaze behavior; that is, he tests whether the robot's social gaze behavior really does improve interactions with people. 

In a three-minute summary of his thesis on YouTube, Andrist explains that--so far in his research--conversations between humans and robots have become much more fluid after the robots were programmed with proper social gaze behavior. Humans and robots interrupted each other less frequently, and humans reported enjoying the conversations much more.

Practical applications, he says, could be robots that are better at helping people assemble furniture, rehab injuries, take medication, or simply get around when they are elderly or unable to walk. The idea is that if a robot can read and react to your social gaze--and respond with a gaze or look-away of its own--it will have a better understanding of when you're vulnerable and need help (as opposed to when you're just fine without its services). 

Joelle Renstrom, who teaches a seminar on robots and artificial intelligence at Boston University, says Andrist's research represents just one way that socially programmed robots have become more advanced in their understanding of people. "Social robots have already redefined the landscape when it comes to, for example, teaching autistic children how to identify facial expressions and non-verbal cues, leading to more appropriate socializing," she says. 

The larger sociocultural conversation about introverts and extroverts is another arena where a robot's ability to read and react to human cues can help, she notes. The type of robot Andrist is working on has the potential to "orient itself to what it (accurately) perceives as the user's social disposition," she says. Down the line, this could raise profound and far-reaching questions, such as: Might a robot be better at drawing out a shy introvert than another human? "Whatever the implications, this robot makes it clear that these shifts are coming--and probably more quickly than people realize," she adds. 

In his profile of Andrist's research, Ackerman points out that, indeed, gaze is one of the best ways to tell what kind of "vert" someone is. "Extroverts tend to look at the people they're talking to significantly more than introverts do," he writes. The overall idea is that a properly programmed robot can use your gaze data to interact more comfortably and effectively with you, based on how introverted or extroverted you are on a given day or in a given situation. 

To learn more about Andrist's research, you can read a paper he co-authored last year called "Look Like Me: Matching Robot Personality via Gaze to Increase Motivation." And if a three-minute video is more your speed, you'll really enjoy Andrist's presentation, which is embedded below. 




Published on: Feb 2, 2016