In the next ten years, as home information appliances spread rapidly, home robots equipped with AI hubs are expected to enter people's daily lives. Audio-based AI assistants such as Google Home and Siri are already widespread, but interacting with them gives users little sense of genuine interaction, so they are rarely used for anything beyond playing music. As for home robots, many people say they do not want a machine with no feelings or emotions in their homes. A home robot should therefore not only perform its tasks well, but also make people feel secure and trusting by reading their emotions through gestures and dialogue and responding accordingly. On the other hand, the emotional expression of the Bunraku puppet, a UNESCO Intangible Cultural Heritage, is often described as "the most beautiful motion in the world": together with the Tayū's narration and the shamisen music, structured by the rhythmic principle Jo-Ha-Kyū, the puppet performances are so affective that audiences are often moved to tears.
This research develops a communicative robot that draws on the Bunraku puppet's sophisticated motion expressions and the Japanese musical principle known as Jo-Ha-Kyū. We used both motion-capture data of Bunraku puppets and robot motion data designed by a professional expert to train a deep neural network that generates affective robot motions. (If you are interested in combining the traditional Japanese performing art of Bunraku with advanced technology, please click here for further information.)
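The training pipeline described above is not specified in detail here, so the following is only a minimal NumPy sketch of the general idea: a network learns frame-to-frame dynamics from motion data (represented as sequences of joint angles) and can then generate new motion by feeding its predictions back in. The synthetic sine-wave "motion", the network size, and the single-hidden-layer architecture are all illustrative assumptions, not the actual model from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for captured puppet / handmade robot motion:
# each row is one frame of joint angles (here 8 joints, 200 frames).
# A real pipeline would load motion-capture data instead.
T, J, H = 200, 8, 32
t = np.linspace(0, 4 * np.pi, T)[:, None]
motion = 0.5 * np.sin(t + np.arange(J))

# Train the network to predict the next frame from the current one.
X, Y = motion[:-1], motion[1:]

# One-hidden-layer network trained with plain gradient descent (MSE loss).
W1 = rng.normal(0, 0.1, (J, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, J)); b2 = np.zeros(J)
lr = 0.05

def forward(x):
    h = np.tanh(x @ W1 + b1)   # hidden activations
    return h, h @ W2 + b2      # next-frame prediction

losses = []
for _ in range(500):
    h, pred = forward(X)
    err = pred - Y
    losses.append(float((err ** 2).mean()))
    # Backpropagation through the two layers.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Generate a new motion sequence by feeding predictions back as input.
frame = motion[0]
generated = [frame]
for _ in range(49):
    _, nxt = forward(frame[None])
    frame = nxt[0]
    generated.append(frame)
generated = np.array(generated)   # 50 frames x 8 joint angles
```

In practice a recurrent or sequence model would be a more natural fit for motion data than this per-frame predictor; the sketch only shows the train-then-generate loop that any such motion-generation setup shares.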
Handmade robot motions designed by a professional expert (萩原佳明).
Affective robot motions generated by a deep neural network using both captured puppet motions and handmade robot motions as training data. (Yang Chen, Ran Dong, Dongsheng Cai, Shinobu Nakagawa, Tomonari Higaki, and Nobuyoshi Asai. "The Beauty of Breaking Rhythms: Affective Robot Motion Design Using Jo-Ha-Kyū of Bunraku Puppet." In Proceedings of SIGGRAPH '19 Talks, Los Angeles Convention Center, Los Angeles, CA, USA, July 28–August 1, 2019.)
JST Basic Research Program (ACT-I Information and Communication Technology, Information and Future), Research Subject “Human-Puppet Interaction: Affective Motion Design Using Bunraku Puppets”, Research Participant, 2,000,000 yen in 2019
The Telecommunications Advancement Foundation (Research Investigation Grant), Research Subject “Human-Puppet Interaction: Bunraku Puppets and Affective Motion Design Using Deep Learning”, Co-investigator, 1,500,000 yen in 2019
Graduate School of System and Information Engineering, University of Tsukuba (Young Researcher Development Program), Research Subject “Development of a Communicative Robot Using AI: Affective Robot Motion Design Based on Bunraku”, 200,000 yen in 2020