A robot that can understand human social situations is a technological curiosity that has been explored in entertainment classics like Star Wars, Star Trek, and Doctor Who. The exploration didn't stop there: WALL-E, I, Robot, and other pop-culture motion pictures have also imagined artificial intelligence interacting with humans.
Carnegie Mellon University's (CMU) Robotics Institute is taking that curiosity one step further. Researchers developed a method to analyze where people's gazes converge on one another, with the aim of applying that knowledge of social situations to robots.
Groups of people wearing head-mounted cameras were put in a room while their gazes were observed. The researchers could tell whether an individual was listening to a single speaker or conversing with a group based on where that individual's eyes were focused and the accompanying body signals.
The study expanded to a work setting, a musical performance, and a party where participants played ping pong and billiards. Video from the head-mounted cameras captured both the general direction of each person's gaze and the specific focus within the line of sight, such as the ping pong ball as it traveled from one player's paddle to the other.
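One way to picture how gaze convergence can be computed (a minimal sketch, not the CMU team's actual code): model each person as a gaze ray, a head position plus a gaze direction estimated from the head-mounted camera, and solve a least-squares problem for the 3D point closest to all the rays. If the rays converge tightly on one point, the group is likely attending to a shared target, such as a single speaker or a ping pong ball. The function name and example coordinates below are hypothetical.

```python
import numpy as np

def joint_attention_point(origins, directions):
    """Least-squares estimate of a shared point of attention.

    origins:    (n, 3) head positions
    directions: (n, 3) gaze directions (need not be unit length)
    Returns the 3D point minimizing summed squared distance to the gaze rays.
    """
    origins = np.asarray(origins, dtype=float)
    d = np.asarray(directions, dtype=float)
    d = d / np.linalg.norm(d, axis=1, keepdims=True)  # normalize gaze vectors
    # For each ray, the projector (I - d d^T) measures displacement
    # perpendicular to the gaze direction; summing these gives a small
    # linear system whose solution is the best-fit convergence point.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, u in zip(origins, d):
        P = np.eye(3) - np.outer(u, u)
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Hypothetical example: two people at (0,0,0) and (2,0,0) both looking
# toward the same spot at (1,1,0).
point = joint_attention_point(
    [(0, 0, 0), (2, 0, 0)],
    [(1, 1, 0), (-1, 1, 0)],
)
```

Because both example rays pass through (1, 1, 0), the solver recovers that point exactly; with noisy real gaze estimates it would instead return the best-fit compromise.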
Hyun Soo Park, a Ph.D. student of mechanical engineering at CMU, said one of the reasons for doing the research was to address the lack of understanding robots have for human social contexts.
“Artificial intelligence has a pretty good understanding of the surrounding environment, however, artificial systems are totally blind to social signals such as information from the facial expression, gaze direction, and body gestures,” Park said.
Park said the findings of the study can be used for robotic technology in search and rescue, in the medical field, and in the military. Park added that the research could also be applied to autism research.
"Children with autism show different behavioral patterns; they respond to social signals differently than non-autistic children," Park said. "We are developing a computational tool to understand the relationship between the behavior and the attention through our work."