Discrete and Process Automation

Watch: Robot Skin Non-Verbally Displays the Robot’s Feelings

16 July 2018

The robot prototype expresses its "anger" with both its eyes and its skin, which turns spiky through fluidic actuators that are inflated under its skin, based on its "mood." Source: Cornell University

Cornell University researchers have developed a robot prototype that can express its "emotions" through visible changes in its skin. The skin covers a grid of texture units that change shape to reflect what the robot is feeling.

The inspiration for this robot doesn't come from human emotions. According to Guy Hoffman, assistant professor of mechanical and aerospace engineering and TEDx speaker, it comes instead from nature and animals.

"I've always felt that robots shouldn't just be modeled after humans or be copies of humans," said Hoffman. "We have a lot of interesting relationships with other species. Robots could be thought of as one of those 'other species,' not trying to copy what we do but interacting with us with their own language, tapping into our own instincts."

The robot's skin carries an array of two texture shapes: goosebumps and spikes. Which shapes are raised, and how, conveys what the robot is feeling. The actuation units are texture modules containing fluidic chambers, each connected to bumps of the same shape; inflating a chamber raises its bumps.
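The mood-to-texture mapping described above can be sketched in software. The following is a minimal illustration, not the Cornell implementation: the mood names, pressure values, and the `TextureCommand` type are all hypothetical assumptions chosen to show how a discrete "mood" could select which same-shaped chamber array to inflate and how strongly.

```python
# Hypothetical mood-to-texture mapping; all names and pressure values
# here are illustrative assumptions, not part of the Cornell design.
from dataclasses import dataclass


@dataclass
class TextureCommand:
    goosebump_kpa: float  # target inflation pressure for goosebump chambers
    spike_kpa: float      # target inflation pressure for spike chambers


# Each mood selects which bump array to inflate, and how hard.
MOOD_MAP = {
    "calm":    TextureCommand(goosebump_kpa=0.0, spike_kpa=0.0),
    "excited": TextureCommand(goosebump_kpa=8.0, spike_kpa=0.0),
    "angry":   TextureCommand(goosebump_kpa=0.0, spike_kpa=12.0),
}


def texture_for_mood(mood: str) -> TextureCommand:
    """Return target chamber pressures for a mood; unknown moods read as calm."""
    return MOOD_MAP.get(mood, MOOD_MAP["calm"])


if __name__ == "__main__":
    cmd = texture_for_mood("angry")
    print(f"spike chambers -> {cmd.spike_kpa} kPa")
```

In a real system the pressure targets would feed whatever pump or valve controller drives the fluidic chambers; the table-lookup structure simply makes the emotion-to-texture translation explicit.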

The researchers tried two actuation control systems during development, with the goal of minimizing both size and noise level.

"One of the challenges," Hoffman said, "is that a lot of shape-changing technologies are quite loud, due to the pumps involved, and these make them also quite bulky."

The robot skin has not yet been tested or applied, but the researchers say that proving the concept is the biggest step. The next step is to develop and test the technology.

A few challenges stand in the way of the skin's development. One is scaling the technology to fit a self-contained robot. Another is making it responsive enough to keep up with a robot's quick emotional changes. Despite these challenges, the researchers are excited about the robot skin.

"At the moment, most social robots express [their] internal state only by using facial expressions and gestures," the researchers said. "We believe that the integration of a texture-changing skin, combining both haptic [feel] and visual modalities, can thus significantly enhance the expressive spectrum of robots for social interaction."

The paper on this technology was covered by IEEE Spectrum.


