The Expressive Android Child

Editorials News | Nov-26-2018

Japan's affection for robots is well known. In 2011, researchers at Osaka University created Affetto, a 'child-type android'. At the time it was a taut-skinned doll with a moving mouth and eyes, and not especially realistic. Today, Affetto can convey emotion with skin that appears to contort naturally; it resembles a Japanese Chucky doll, albeit without a body.

In a new paper, the researchers behind Affetto describe their technique for upgrading this latter-day Pinocchio. When the original Affetto debuted seven years ago, many tech outlets were impressed by its realistic facial expressions, but the researchers stressed the importance of developing a face that was more lifelike still. A trio of researchers at Osaka University has now found a method for identifying and quantitatively evaluating facial movements on their child android's head.

Robots in Japan have long featured in advances in healthcare and industry, but producing humanlike expression in a robotic face remains a major challenge. Although the overall properties of such systems have been addressed, androids' facial expressions have not been examined in detail, for several reasons: the huge range and asymmetry of natural human facial movements, the restrictions of the materials used in android skin, and the intricate engineering and mathematics driving robots' movements.

The researchers have now devised a system that makes the second-generation Affetto more expressive. Their findings offer a path for androids to express greater ranges of emotion and, ultimately, to interact more deeply with humans. The team investigated 116 different facial points on Affetto to measure its three-dimensional movement. These facial points are underpinned by so-called deformation units; each unit comprises a set of mechanisms that create a distinctive facial contortion, such as the lowering or raising of part of a lip or eyelid. This allowed the team to develop a system for making the face look more natural through finer control of the synthetic skin. The researchers report that surface deformation was the main issue in controlling android faces: the movements of the soft facial skin create instability.
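The idea of deformation units composing into a full facial expression can be pictured with a small sketch. Everything here is illustrative: the class names, the linear-superposition model, and the random influence fields are assumptions for demonstration, not the authors' actual method; only the figure of 116 facial points comes from the paper.

```python
import numpy as np

N_POINTS = 116  # facial points measured on Affetto, per the paper

class DeformationUnit:
    """Hypothetical model of one deformation unit: a linear map from a
    single actuation command to 3-D displacements of the facial points
    it influences."""
    def __init__(self, influence):
        # influence: (N_POINTS, 3) array of displacement per unit command
        self.influence = np.asarray(influence, dtype=float)

    def displacement(self, command):
        # command in [0, 1]: how strongly this unit is actuated
        return command * self.influence

def compose_face(units, commands):
    """Sum the displacement fields of all actuated units to get the
    resulting deformation of the synthetic skin at each facial point."""
    total = np.zeros((N_POINTS, 3))
    for unit, command in zip(units, commands):
        total += unit.displacement(command)
    return total

# Toy example: two made-up units, one nominally raising an eyelid
# region and one pulling a lip corner, driven at different strengths.
rng = np.random.default_rng(0)
eyelid = DeformationUnit(rng.normal(0, 0.1, (N_POINTS, 3)))
lip = DeformationUnit(rng.normal(0, 0.1, (N_POINTS, 3)))
deformation = compose_face([eyelid, lip], [0.8, 0.3])
print(deformation.shape)  # one 3-D displacement per facial point
```

The linear-superposition assumption is the simplest possible choice; part of what makes real android skin hard to control, as the article notes, is precisely that soft materials do not deform this predictably.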
The researchers encountered various challenges in balancing the applied force and adjusting the synthetic skin, but they were able to employ their system to tune the deformation units for precise control of Affetto's facial surface motions. The latest version of Affetto also has an asymmetrical face, particularly around the eyes. The new Affetto is somewhat less unsettling than its rigid, gloss-faced predecessor, though it can still be disconcerting to look at.

By: Anuja Arora

Content: https://www.sciencedaily.com/releases/2018/11/181115104632.htm

