Super-expressive robot ‘Eva’ can mimic eerie human facial expressions

A new autonomous robot called EVA can respond to and match the expressions of nearby humans.

Researchers at the Creative Machines Lab at Columbia Engineering have been working for five years to create EVA.

Hod Lipson, James and Sally Scapa Professor of Innovation (Mechanical Engineering) and director of the Creative Machines Lab, said: "The idea for EVA took shape a few years ago, when my students and I began to notice that the robots in our lab were staring back at us through plastic, googly eyes."

Lipson noted a similar trend in the supermarket, where he saw restocking robots wearing name badges and, in one case, a hand-knitted hat.

"People seemed to be humanizing their robotic colleagues by giving them eyes, an identity, or a name," he said.

"This made us wonder, if eyes and clothing work, why not make a robot that has a super-expressive and responsive human face?”

While it sounds like a simple idea, creating a convincing robot face is difficult, partly because robotic body parts tend to be made of stiff materials such as metal or hard plastic.

Adding to this is the fact that robot hardware – such as circuits, sensors and motors – is heavy and bulky.

The first phase of EVA began in Lipson's lab several years ago, when undergraduate Zanwar Faraj led a team of students in building the robot's physical "machinery".


EVA can express the six basic emotions of anger, disgust, fear, joy, sadness, and surprise, as well as more nuanced emotions, using cables and motors that pull on specific points of its face like artificial muscles.

Faraj said: "The greatest challenge in creating EVA was designing a system that was compact enough to fit inside the confines of a human skull while still being functional enough to produce a wide range of facial expressions."

EVA differs from the animatronic robots used in theme parks and films by using AI to "read" and mirror the expressions on nearby faces, a skill it learns by trial and error from watching videos of itself.

Boyuan Chen, Lipson's PhD student who led the software phase of the project, and his team filmed hours of footage of EVA making a series of random faces.

EVA then learned to pair muscle motion with the video footage of its own face.
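The trial-and-error process described above – issuing random motor commands, recording the resulting face, then learning the pairing between the two – can be sketched with a toy model. Everything here is an illustrative assumption (the dimensions, the linear "face mechanics", and the least-squares fit), not EVA's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

N_MOTORS = 6        # cable motors acting as artificial muscles (assumed count)
N_LANDMARKS = 12    # simplified face-appearance features (assumed count)

# Unknown "face mechanics": motor commands -> landmark displacements (toy linear model)
true_mechanics = rng.normal(size=(N_LANDMARKS, N_MOTORS))

def render_face(motor_commands):
    """Simulate the face appearance produced by a set of motor commands."""
    return true_mechanics @ motor_commands

# Phase 1: "babble" - make random faces and record command/appearance pairs,
# analogous to EVA filming hours of footage of itself making random expressions.
commands = rng.uniform(-1, 1, size=(500, N_MOTORS))
faces = np.array([render_face(c) for c in commands])

# Phase 2: fit an inverse model (least squares): face appearance -> motor commands
inverse_model, *_ = np.linalg.lstsq(faces, commands, rcond=None)

# Phase 3: mimic - observe a target expression and infer the commands
# needed to reproduce it on the robot's own face.
target_commands = rng.uniform(-1, 1, size=N_MOTORS)
observed_face = render_face(target_commands)
inferred_commands = observed_face @ inverse_model
reproduced_face = render_face(inferred_commands)

print(np.allclose(reproduced_face, observed_face, atol=1e-6))  # True
```

In this linear toy setting the inverse model recovers the commands exactly; the real system must handle nonlinear mechanics and raw camera images, which is why deep learning is used instead of least squares.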

EVA may be just a lab experiment for now, but the technology could have real-world applications in workplaces, hospitals, schools and homes.

"There is a limit to how much we humans can engage emotionally with cloud-based chatbots or disembodied smart-home speakers," said Lipson.

"Our brains seem to respond well to robots that have some kind of recognizable physical presence.”

Chen added: "Robots are intertwined in our lives in a growing number of ways, so building trust between humans and machines is increasingly important."

The research will be presented at the ICRA conference today (May 30).

