
Profiles in Giving

You want a robot that knows how you feel. Don’t you?

  • Nao (pronounced "Now") will recognize you, hear you, and talk to you.
  • Nao isn't all serious, either: it can play soccer and dance. It's all in the programming.
  • Tufts researchers are exploring how robots can read people's subtle, nonverbal cues.
  • Given better communication skills, robots could help the elderly or aid in disaster response.
  • Software developed at Tufts lets robots learn new words and actions in real time.

Just Like Us

Matthias Scheutz spends his day talking to robots. In the future, he hopes we will, too.

Scheutz, a computer scientist at the School of Engineering, notes that robotic devices like the Roomba vacuum cleaner are increasingly common in homes and offices. Yet our interaction with them is entirely one-sided; they simply can't comprehend our words or gestures.

“For a robot to be a truly useful helper to humans, it needs to interact with us on our terms,” says Scheutz, who is also a Bernard M. Gordon Senior Faculty Fellow. That means it needs to master the rules of language, respond to verbal commands, and grasp the nuances of human communication.

Understanding spoken words is a tough job for a machine; parsing hidden meanings and context is harder still. Scheutz is trying to change that, thanks in part to funding from the Gordon Fund, a 2003 gift that supports faculty development in both the School of Engineering and the Gordon Institute. He and his lab are working on a complex software framework called DIARC (for Distributed Integrated Affect Reflection Cognition), a sort of robot "brain" that will let machines respond to both our language and our subtle social cues.
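To make the idea concrete: the article doesn't describe DIARC's internals, so every name in the sketch below is hypothetical. It only illustrates the general shape of such an architecture, in which separate components for hearing, understanding, and acting are wired to a shared hub rather than bolted into one program.

```python
# Illustrative sketch only. DIARC is a large research framework whose real
# APIs are not shown in this article; all names here are hypothetical. The
# point is the shape: independent components exchanging messages via a hub.

from typing import Callable, Dict, List


class RobotBrain:
    """Toy message bus: components subscribe to topics and publish results."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[str], None]]] = {}

    def subscribe(self, topic: str, handler: Callable[[str], None]) -> None:
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, message: str) -> None:
        for handler in self._subscribers.get(topic, []):
            handler(message)


brain = RobotBrain()

# "Speech recognition": raw audio would arrive here; we fake it with text.
brain.subscribe("speech", lambda text: brain.publish("language", text.lower()))

# "Language understanding": spot a command and forward it as an action.
def understand(text: str) -> None:
    if "dance" in text:
        brain.publish("action", "start_dance_routine")

brain.subscribe("language", understand)

# "Action execution": a real robot would drive motors; here we just print.
brain.subscribe("action", lambda act: print(f"executing: {act}"))

brain.publish("speech", "Nao, please dance for us")  # -> executing: start_dance_routine
```

The appeal of this design is that a new capability, say an affect detector, can be added as one more subscriber without rewriting the components already in place.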

If engineers can make robotic devices more humanlike in the way they interact with us, Scheutz thinks those robots can start to play more useful roles, like aiding search-and-rescue teams after natural disasters, or caring for elderly patients. He and his team are already starting to develop a “helper” robot for people with Parkinson’s disease, a neurological disorder that can rob patients of fine muscle control.

“Parkinson’s makes it hard for patients to express their emotions,” Scheutz says. “You can’t just look at their face to see if they’re smiling, or listen to their tone of voice to see if they’re in a good mood. You need to figure that out based on the context of what they’re saying.” By using a robot to do the job, it might be possible to create a sort of virtual mediator between doctor and patient, providing caregivers with a better sense of their patients’ emotional state. “Essentially, we want robots to help make up for what patients can’t do,” he says.
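The article doesn't spell out how such a mediator would gauge mood, but as a purely hypothetical sketch, inferring emotional state from the content of speech, rather than from face or tone, could start as simply as scoring a patient's words against a valence lexicon:

```python
# Hypothetical illustration only: the article does not describe the lab's
# actual method. This toy scorer infers mood from *what* is said, since a
# Parkinson's patient's face and tone of voice may not be informative.

VALENCE = {"good": 0.5, "great": 0.8, "happy": 0.7,
           "tired": -0.4, "pain": -0.8, "lonely": -0.6}


def mood_from_text(utterance: str) -> str:
    words = utterance.lower().split()
    scores = [VALENCE[w] for w in words if w in VALENCE]
    if not scores:
        return "unknown"
    avg = sum(scores) / len(scores)
    return "positive" if avg > 0.1 else "negative" if avg < -0.1 else "neutral"


print(mood_from_text("I slept well and feel great today"))   # positive
print(mood_from_text("My legs are tired and I am in pain"))  # negative
```

A real system would need far richer context than single words, but the sketch shows the core move Scheutz describes: reading mood from the meaning of what is said rather than how it is expressed.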