SCHEDULE

FULL DAY

Sunday, October 23rd, 2022

Time zone: JST (Japan Standard Time: UTC +9)

8:30 - 9:00

Registration

9:00 - 9:15

Welcome & Opening Speech

9:15 - 10:30

Session 1: Design Approach of Large-area Skin for Soft and Hard Robots

15 minutes each (talk + Q&A)

Electronic Skin for Robotics and Wearables

by Prof. Takao Someya

Abstract:

Human skin is a large-area, multi-point, multi-modal, stretchable sensor, and it has inspired the development of electronic skin for robots that simultaneously detects pressure and thermal distributions. By improving its conformability, the application of electronic skin has expanded from robots to next-generation wearables for humans, reaching a point where ultrathin semiconductor membranes can be laminated directly onto the skin. Such intimate and conformal integration of electronics with the human skin allows continuous, long-term monitoring of health conditions, enabling the personalization of medical care. The ultimate goal of electronic skin is to non-invasively measure human activities under natural conditions, enabling electronic skin and human skin to interactively reinforce each other. In this talk, I will review recent progress in stretchable thin-film electronics for robotics and next-generation medical wearables, and address open issues and future prospects of electronic skin.

Dense 3-Axis Sensors (uSkin) for Sensitive Grasping

by Prof. Alex Schmitz

Abstract:

I will provide an overview of the tactile sensors developed in our lab. The focus will be on uSkin, magnetic 3-axis tactile sensors in a thin, soft, durable package with minimal wiring. I will also discuss how they have been used for applications such as in-hand manipulation and grasp-stability assessment. Safety (with capacitive sensors) will also be briefly introduced.

Fabric-based Proximity/Contact Sensors Applied to Two-DOF Variable Stiffness Mechanism

by Prof. Shinichi Hirai

Abstract:

This talk presents fabric-based proximity/contact sensors and their application to a two-DOF variable stiffness mechanism. Fabric sensors can undergo both bending and extensional deformations, allowing us to use them as soft robot skins. Variable stiffness links, which consist of elastic tubes and a pneumatic system, change their stiffness according to external signals. Consequently, introducing fabric-based sensors to variable stiffness links contributes to safety in human-robot interaction.

One barrier to fabric-based proximity/contact sensors is interference among them. When fabric sensors are attached to multiple links of a variable stiffness mechanism, they may interfere with one another and fail to detect proximity/contact. This talk presents our approach to compensating for the interaction among sensor signals to cope with this interference. Experiments using a two-DOF variable stiffness mechanism show how the proposed approach works.
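
For intuition, cross-talk compensation of this kind is often posed as inverting a linear mixing model. The sketch below is a generic illustration, not the speaker's method; the matrix values and the compensate() helper are hypothetical.

    import numpy as np

    # Hypothetical linear cross-talk model: raw = C @ true, where the mixing
    # matrix C would be estimated in an offline calibration (identity if the
    # two fabric sensors did not interfere at all).
    C = np.array([[1.00, 0.35],
                  [0.28, 1.00]])  # two sensors on a two-DOF mechanism

    def compensate(raw: np.ndarray) -> np.ndarray:
        """Recover per-link proximity/contact signals from interfering readings."""
        return np.linalg.solve(C, raw)

    raw = np.array([0.80, 0.45])  # interfering raw measurements
    print(compensate(raw))        # decoupled per-link estimates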

What can a Robot's Skin Be? A Design Perspective for Human-Robot Interaction

by Prof. Guy Hoffman

Abstract:

Unlike biological skin, which can be a medium of expression, a robot’s skin is usually regarded as a passive and static separation between the body and the environment. In our research, we explore the design opportunities of a robot’s skin as a socially expressive medium, along with a flexible technical method for its embodiment. I will demonstrate the proposed design space with six texture-changing skin prototypes and discuss their expressive capacities.

High-Performance and Multi-functional Tomographic Tactile Sensors for Large-Scale Robotic Skin

by Prof. Shunsuke Yoshimoto

Abstract:

Electrical Impedance Tomography (EIT) is an imaging technology that reconstructs the impedance distribution of an arbitrary conductor from potential data measured under multiple excitation conditions, using electrodes located on the boundary of the conductor. Focusing on the design flexibility of EIT, the presenter has proposed various tomographic tactile sensors that demonstrate a flexible detector structure, high spatio-temporal performance, and multimodal sensing. In this talk, the presenter explains the principles and characteristics of the proposed tomographic approaches and discusses potential applications.
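
For readers new to EIT, the reconstruction step is commonly linearized: a sensitivity (Jacobian) matrix maps small conductivity changes to boundary-voltage changes, and a regularized least-squares solve inverts that map. The sketch below is a minimal, generic illustration, not the presenter's implementation; eit_reconstruct() and the array sizes are hypothetical, and the Jacobian is assumed to come from a separate forward model.

    import numpy as np

    def eit_reconstruct(J: np.ndarray, dv: np.ndarray, lam: float = 1e-2) -> np.ndarray:
        """One-step linearized EIT: solve min ||J ds - dv||^2 + lam ||ds||^2.

        J  : (n_measurements, n_pixels) sensitivity matrix from a forward model
        dv : boundary-voltage change between a reference and a loaded frame
        """
        A = J.T @ J + lam * np.eye(J.shape[1])  # Tikhonov-regularized normal equations
        return np.linalg.solve(A, J.T @ dv)     # flattened conductivity-change image

    # Toy example with random stand-ins for a real forward model and data:
    rng = np.random.default_rng(0)
    J = rng.standard_normal((208, 576))  # e.g. 16 electrodes -> 208 measurements, 24x24 pixels
    dv = rng.standard_normal(208)
    print(eit_reconstruct(J, dv).shape)  # (576,); reshape to (24, 24) to display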

10:30 - 11:00

Panel Discussion

moderated by Prof. Van Anh Ho

11:00 - 11:15

Coffee Break

11:15 - 12:15

Interactive Session 1

11:15 - 11:35

Lightning talks (2 mins each)

11:35 - 12:00

Poster and Research/Project Demonstration

12:00 - 13:00

Lunch Break + Demonstration

13:00 - 13:15

Opening Speech

13:15 - 14:30

Session 2: Processing Method for Large-scale Sensing

15 minutes each (talk + Q&A)

Neuromorphic Tactile Sensing

by Dr. Chiara Bartolozzi

Abstract:

Biological sensory systems have developed to best capture the properties of surrounding objects and environments that are useful for acting in the world. The physical properties of tactile receptors and the way neurons encode the characteristics of each stimulus allow our brain to make sense of the world and make appropriate decisions about how to behave.

When grasping a glass of water, our hand automatically adjusts the force used to stably hold the glass depending on its size, weight, roughness, slipperiness, and softness. This is done by a very efficient system that economizes on every bit of information, avoiding excessive energy consumption for each single action. As such, artificial systems have much to learn from biology in developing cheap solutions that can run on very small devices at minimal energy cost. This is especially true in robots equipped with large-scale robotic skin, where the activity of the thousands of sensors covering the body is usually sparse in space and time.

In this talk I will present the neuromorphic approach to tactile sensing for robots, from the design of mixed-mode subthreshold circuits for different transducers, to spike-based processing of their signals.
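
One concrete illustration of spike-based processing is a send-on-delta encoder: each taxel emits an ON/OFF event only when its signal has changed by more than a threshold since the last event, so an idle skin produces no data at all. The sketch below is a generic software caricature, not the speaker's subthreshold circuits; send_on_delta() and the threshold value are hypothetical.

    import numpy as np

    def send_on_delta(samples: np.ndarray, threshold: float = 0.05):
        """Encode a sampled taxel signal as sparse ON/OFF events."""
        events, ref = [], samples[0]
        for i, x in enumerate(samples[1:], start=1):
            while x - ref > threshold:   # rising past the threshold -> ON event(s)
                ref += threshold
                events.append((i, +1))
            while ref - x > threshold:   # falling past the threshold -> OFF event(s)
                ref -= threshold
                events.append((i, -1))
        return events

    t = np.linspace(0.0, 1.0, 200)
    pressure = np.clip(np.sin(2 * np.pi * t), 0.0, None)  # one press-and-release
    print(len(pressure), "samples ->", len(send_on_delta(pressure)), "events")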

High-Resolution Tactile Sensing Using Machine Learning

by Dr. Georg Martius

Abstract:

Machine learning lets us develop tactile sensors differently, because physical contact can be measured more indirectly than analytical data-processing methods would allow. In my talk, I will showcase two examples of this paradigm: a vision-based haptic sensor for all-around perception of contact forces on a fingertip, and a tactile skin using barometric measurement units. The latter is designed using a theory for super-resolution haptic sensing and achieves stunning performance, such as contact-localization accuracy up to 1000-fold finer than the sensor grid. This theory is also applicable to other physical sensing paradigms, and we hope that it will help in building new high-performance, large-scale sensing devices.
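
The core idea behind such super-resolution is that overlapping taxel responses vary smoothly with contact position, so a learned regressor can localize contact far more finely than the taxel pitch. The sketch below is a hypothetical toy, not the speaker's sensor or theory: read_skin(), the Gaussian response model, and all sizes are assumptions, and scikit-learn is assumed to be available.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    taxel_x = np.linspace(0.0, 1.0, 8)  # 8 taxels along a strip, pitch ~0.14

    def read_skin(contact_x: np.ndarray) -> np.ndarray:
        """Overlapping Gaussian taxel responses plus noise for point contacts."""
        r = np.exp(-((taxel_x - contact_x[:, None]) ** 2) / (2 * 0.08 ** 2))
        return r + 0.01 * rng.standard_normal(r.shape)

    x_train = rng.uniform(0.0, 1.0, 5000)
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
    model.fit(read_skin(x_train), x_train)  # learn position from 8 raw channels

    x_test = rng.uniform(0.0, 1.0, 1000)
    err = np.abs(model.predict(read_skin(x_test)) - x_test).mean()
    print(f"mean localization error: {err:.4f} vs. taxel pitch {1/7:.3f}")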

TBA

by Dr. Lucia Beccai

Integrative Tactile Skin Using Computation

by Dr. Hyosang Lee

Abstract:

Tactile skin is becoming a fundamental component of autonomous robots, letting them physically perceive their surrounding environments. Nonetheless, these sensors have stagnated because system-level integration demands material compatibility, fabrication simplicity, and robustness alongside high sensing performance. Biological skin offers a good example of achieving various functionalities using an overlapping receptive-field structure and cognitive processing. My research showcases a robotic tactile skin that mimics the human's efficient tactile-perception mechanism using piezoresistive materials and computation. This approach simplifies the sensor design, enabling integrative robotic skin for future applications.

Printed Electronic Skin With Learning Capability

by Prof. Ravinder Dahiya

Abstract:

Electronic skin (e-skin) in robotics and interactive systems is expected to behave similarly to our own skin in terms of sensation and perception. This means that e-skin should not only have a similar morphology but also similar information-processing capabilities to our own skin. This is important considering the highly distributed nature of tactile sensing and the limited communication bandwidth available for data transmission. This talk will present an artificial skin with tactile sensors, printed synaptic transistors, and peripheral circuits, developed to bestow the skin with bio-like learning capability. The presented e-skin system emulates several biological principles, including event-driven sensing, associative learning, and forgetting behaviour. The printed-electronics route followed here to develop the synaptic transistors is also an attractive feature of the presented work, particularly from the point of view of large-area electronics.
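
The associative-learning-with-forgetting behaviour mentioned above can be caricatured in software by a Hebbian weight update with passive decay. The sketch below is a loose analogy only, not the printed synaptic-transistor implementation; hebbian_step() and all constants are hypothetical.

    import numpy as np

    def hebbian_step(w, pre, post, lr=0.1, decay=0.01):
        """Strengthen co-active connections (learning); slowly decay all (forgetting)."""
        w = w + lr * np.outer(post, pre)  # associative (Hebbian) strengthening
        w = w * (1.0 - decay)             # passive decay models forgetting
        return np.clip(w, 0.0, 1.0)

    rng = np.random.default_rng(0)
    touch = (rng.random(16) > 0.7).astype(float)  # a sparse tactile event pattern
    w = np.zeros((4, 16))                         # 4 output neurons x 16 taxels
    for _ in range(20):                           # repeated pairing builds the association
        w = hebbian_step(w, touch, post=np.array([1.0, 0.0, 0.0, 0.0]))
    print(w[0].round(2))  # row 0 has learned (and will slowly forget) the pattern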

14:30 - 15:00

Panel Discussion

moderated by Prof. Perla Maiolino

15:00 - 15:15

Coffee Break

15:15 - 16:15

Session 3: Interaction and Control

15 minutes each (talk + Q&A)

TBA

by Prof. Giorgio Cannata

Sensing Tactile Contact Over Large, Soft Surfaces

by Dr. Katherine Kuchenbecker

Abstract:

Robots should be able to feel contacts across all of their body surfaces, not just at their fingertips. Furthermore, tactile sensors need to be soft to cushion contact and support the transmission of tangential force and torque. Today's robotic systems rarely have such sensing capabilities because artificial skin tends to be complex, bulky, rigid, delicate, and/or expensive. Taking inspiration from other successful sensor designs, my collaborators and I have created four families of soft sensors that can feel contact forces across their large surfaces. This talk will quickly mention our work on Insight (an all-around tactile sensor that uses vision and machine learning), ERTac (tactile sensing via electrical resistance tomography), and HERA (the Haptic Empathetic Robotic Animal). Then I will carefully describe HuggieBot 3.0, one version of the responsive human-sized hugging robot with visual and haptic perception that Alexis Block created for her doctorate. The pressure sensor and microphone inside HuggieBot's inflated torso chamber detect complementary streams of information about physical contact; the robot quickly and accurately perceives the user's actions during an embrace by processing these data streams with simple machine learning.

Electronic Skin for Robotics and Wearables

by Prof. Gordon Cheng

Abstract:

Whole-body tactile interaction with a humanoid robot requires a sophisticated sensory system; challenges range from covering large surface areas with a huge array of sensors to processing the large amount of resulting sensory data. This presentation includes our latest results on large-scale, multi-modal tactile e-skin and its use in interactions with a full-sized autonomous humanoid robot. I will show rich interactions in which the humanoid robot performs complex human-like tasks that are typically difficult for conventional robotic systems.

Building Caregiving Robots: A Tale of Three Sensing Modalities

by Dr. Tapomayukh Bhattacharjee

Abstract:

How do we build robots that can assist people with mobility limitations in activities of daily living? To successfully perform these activities, a robot needs to be able to physically interact with humans and objects in unstructured human environments over large contact surfaces. Multimodal sensing can enable a robot to rapidly infer properties of contact with its surroundings. This talk will showcase how a robot can use the interplay between force, thermal, and visual sensing modalities during manipulation to perceive properties of these physical interactions using data-driven methods and physics-based models. I will also touch upon some of our recent efforts in developing RCareWorld, a simulation platform for caregiving robots that enables realistic physical interactions across the whole robot arm with virtual human avatars built using clinical data.

16:15 - 16:45

Panel Discussion

moderated by Prof. Veronica Santos

16:45 - 17:00

Wrap-up

by organizers