Schedule

08:50 -- 09:00 | Workshop opening

09:00 -- 09:30 | Jeremy Fishel : Applications in Touch: Dexterity and Perception

09:30 -- 10:00 | Edward Adelson : The Virtues of Soft Fingers that Extract High Resolution Contact Geometry

10:00 -- 10:15 | Poster spotlight

10:15 -- 10:30 | Contributed talk - Christopher Atkeson : Optical Skin For Robots: Tactile Sensing and Whole-Body Vision

10:30 -- 11:00 | Poster session, demo and coffee break

11:00 -- 11:30 | Oliver Kroemer : Learning to Monitor and Adapt Manipulation Skills based on Tactile Events

11:30 -- 12:00 | Sergey Levine : End-to-End Learning of Perception and Control

12:00 -- 13:30 | Lunch break

13:30 -- 14:00 | Alberto Rodriguez : Real-Time Contact-Aware State Estimation

14:00 -- 14:30 | Robert Howe : Combining Tactile Sensing and Grasp Analysis to Predict Stability

14:30 -- 15:00 | Matei Ciocarlie : Accurate Contact Localization with Few Wires

15:00 -- 15:30 | Poster session, demo and coffee break

15:30 -- 16:00 | Veronica Santos : Experiential approaches to artificial haptic perception and decision-making

16:00 -- 16:30 | Robert Haschke : Tactile Sensors and Tactile Processing for Human Data Acquisition and Robot Grasping

16:30 -- 17:00 | Charles Kemp : Haptic Sensing for Assistive Robots

17:00 -- 18:00 | Panel discussion


Invited Speakers:

Jeremy Fishel (SynTouch Inc.)

  • Title: Applications in Touch: Dexterity and Perception
  • Abstract: As robots are increasingly expected to replicate and replace human behavior, they must increasingly behave like humans. This includes replicating the sense of touch. We have developed the BioTac, the world's first compliant tactile sensor capable of sensing the full range of forces, vibrations and temperatures the human fingertip can perceive. Its design incorporates fingerprints, a fingernail and heat generation, all of which have been shown to improve the sensitivity and functionality of the device. However, such a biomimetic sensor is just the admission price to a much more difficult set of challenges: developing intelligent reflexes to control movement in response to touch (perception for action) and useful exploratory movements that elicit tactile sensations to effectively identify objects by touch (action for perception). In this talk, we discuss current research and future applications of this technology, including characterizing how objects feel in a way that relates to human perception, using this information to drive novel tactile displays, reflexes that enable fragile grasping, and next-generation applications in telerobotic systems.
  • Bio: Jeremy Fishel is a co-founder and the Chief Technology Officer at SynTouch, which manufactures biomimetic tactile sensors and provides tactile sensing solutions for quantifying the perception of touch and improving the dexterity of robotic and prosthetic hands. Dr. Fishel received a B.S. in Mechanical Engineering (‘05) from California State University Long Beach, two M.S. degrees (Biomedical Engineering in ’07; Aerospace and Mechanical Engineering in ’09) from the University of Southern California, and a PhD from the Biomedical Engineering department at USC (’12) for his work on fluid-based tactile vibration sensing and the development of Bayesian exploration. He was recognized by Popular Mechanics as one of the 2013 Innovators of the Year and accepted as a delegate of the Academy of Achievement in 2014 under the personal recommendation of General David Petraeus. Under his leadership, SynTouch has been recognized as a Technology Pioneer by the World Economic Forum and an RBR50 company by the Robotics Business Review, and has received numerous awards for the company’s biomimetic tactile sensor (the BioTac®), which mimics the sensory capabilities of the human fingertip. Dr. Fishel’s professional interests are in pioneering applications in tactile perception and robotic dexterity using SynTouch’s core technology.


Edward Adelson (MIT)

  • Title: The Virtues of Soft Fingers that Extract High Resolution Contact Geometry
  • Abstract: People often think that the goal of a touch sensor is to measure patterns of local force. But one can argue that force (especially normal force) is less interesting than geometry. Measuring geometric aspects of the contact interaction helps you infer object identity, object pose, and material properties such as hardness and texture; it also lets you detect slip. People often think that compliant sensors, being soft and squishy, must provide poor information about contact geometry. However, optically based compliant sensors, including GelSight, can capture contact geometry with high resolution and high precision, and can measure both normal and tangential displacements. Indeed, soft fingers are needed not only to support robust grasping, but to allow the extraction of rich geometry-based information. I will describe a variety of tasks we have pursued using GelSight sensors.
  • Bio: Edward (Ted) Adelson is the John and Dorothy Wilson Professor of Vision Science at MIT, in the Department of Brain and Cognitive Sciences, and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL). He has made many contributions to the fields of human vision, machine vision, computer graphics and neuroscience. He has won multiple awards, and is a member of the National Academy of Sciences. He is now developing artificial touch sensors designed to match or exceed the capabilities of human touch.


Oliver Kroemer (USC)

  • Title: Learning to Monitor and Adapt Manipulation Skills based on Tactile Events
  • Abstract: Contact states are fundamental to manipulation tasks, as they determine which objects the robot's actions will directly affect. A change in the contact state, e.g., making, breaking, or slipping contacts, often corresponds to a subgoal of the task or an error depending on the context. In this talk, I will discuss methods for learning to detect these types of contact events using tactile sensing. I will also explain how the robot can use this contact information to monitor and adapt its manipulation skills in order to perform the tasks more robustly.
  • Bio: Oliver Kroemer is a postdoctoral researcher at the University of Southern California (USC), working with Gaurav S. Sukhatme in the Robotic Embedded Systems Lab (RESL). His research interests are in machine learning and robotics, with a focus on learning grasping and manipulation skills. He received his Master's and Bachelor's degrees in engineering from the University of Cambridge in 2008. He was a Ph.D. student at the Max Planck Institute for Intelligent Systems from 2009 to 2011. In 2014, Oliver defended his Ph.D. thesis at the Technische Universitaet Darmstadt. He was a finalist for the 2015 Georges Giralt Ph.D. Award for the best robotics Ph.D. thesis in Europe. In 2015, he first worked as a postdoctoral researcher at TU Darmstadt before starting his current position at USC.


Sergey Levine (UC Berkeley)

  • Title: End-to-End Learning of Perception and Control
  • Abstract: Estimation and perception have traditionally been considered as separate problems from decision making and control, particularly when dealing with complex, high-bandwidth sensory modalities such as vision and touch sensing. I will present several robotic learning results that demonstrate that, by coupling control and perception more closely using end-to-end training with high-capacity representations such as deep networks, we can obtain improved coordination and sensory feedback for robotic manipulation systems. I will present results that illustrate how grasping and object manipulation can be improved through end-to-end joint training of perception and control, show how predictive models can be constructed using just raw sensory inputs and then used for generating manipulation behaviors, and discuss how combining perception and control can lead to new methods for imitation learning. Finally, I will conclude by discussing future directions to combine tactile sensing with decision making and control in an end-to-end framework.
  • Bio: Sergey Levine received a BS and MS in Computer Science (2009) and a Ph.D. in Computer Science (2014), all from Stanford University. He joined the faculty of the Department of Electrical Engineering and Computer Sciences at UC Berkeley in fall 2016. His work focuses on machine learning for decision making and control, with an emphasis on deep learning and reinforcement learning algorithms. Applications of his work include autonomous robots and vehicles, as well as computer vision and graphics. His research includes developing algorithms for end-to-end training of deep neural network policies that combine perception and control, scalable algorithms for inverse reinforcement learning, deep reinforcement learning algorithms, and more.


Alberto Rodriguez (MIT)

  • Title: Real-Time Contact-Aware State Estimation
  • Abstract: My main goal in this talk is to motivate the need for contact sensing in robotic grasping and manipulation. I’ll start with a brief review of recent work by Team MIT-Princeton in the Amazon Robotics Challenge, and the lack of practical solutions that exploit feedback and contact sensing. Vision is capable of efficiently providing a global perspective on an object or scene. However, it is not sensitive to the high dynamic range of contact interactions such as contact/no-contact, sliding, slipping or grasping. It also falls short in cluttered scenes or tight workspaces, where the end-effector often occludes the object. Tactile sensing, on the other hand, provides accurate local information, which unfortunately is difficult to interpret and integrate into a useful global perspective. I’ll describe recent work on incorporating tactile and vision feedback using a popular framework for solving the SLAM problem -- iSAM (incremental smoothing and mapping) -- to provide fast and reliable estimates of manipulated objects. I’ll finish by describing ongoing efforts toward integrating tactile sensors and state estimators in the Amazon Robotics Challenge scenario.
  • Bio: Alberto Rodriguez is the Walter Henry Gale (1929) Career Development Professor in the Mechanical Engineering Department at MIT. Alberto graduated in Mathematics ('05) and Telecommunication Engineering ('06) from the Universitat Politecnica de Catalunya (UPC) in Barcelona, and earned his PhD in Robotics (’13) from the Robotics Institute at Carnegie Mellon University. He spent a year in the Locomotion group at MIT, and joined the faculty at MIT in 2014, where he started the Manipulation and Mechanisms Lab (MCube). Alberto received the Best Student Paper Awards at RSS 2011 and ICRA 2013, and was a Best Paper finalist at IROS 2016. His main research interests are in robotic manipulation, mechanical design, and automation.


Robert Howe (Harvard)

  • Title: Combining Tactile Sensing and Grasp Analysis to Predict Stability
  • Abstract: We are exploring the use of grasp analysis to understand the role of tactile sensor data in validating and controlling grasp stability. One result is a method to identify the spatial range of contact measurements that can guarantee grasp stability. This provides a quantitative relationship between stability prediction abilities and tactile sensor resolution. The analysis shows that limited spatial resolution leads to situations where the tactile data is uncorrelated with grasp stability. Another line of investigation is comparing grasp stability predictions derived from analytical models to those derived from machine learning methods. Preliminary results suggest that both approaches perform poorly when inadequate sensory information is available, which explains the limited success found in many recent publications. These results also suggest that model-based grasp stability assessment can provide adequate performance without the need to collect the extensive training data required for machine learning-based approaches.
  • Bio: Robert D. Howe is the Abbott and James Lawrence Professor of Engineering in the Harvard Paulson School of Engineering and Applied Sciences. Dr. Howe founded the Harvard BioRobotics Laboratory in 1990, which investigates the roles of sensing and mechanical design in motor control, in both humans and robots. His research interests center on manipulation, the sense of touch, and human-machine interfaces. A major focus is the development of image-guided and robotic surgical applications. Dr. Howe earned a bachelor's degree in physics from Reed College, then worked as a design engineer in the electronics industry in Silicon Valley. He received a doctoral degree in mechanical engineering from Stanford University in 1990, and then joined the faculty at Harvard. Among his honors are election as a Fellow of the IEEE and AIMBE, and best paper awards at mechanical engineering, robotics, and surgery conferences. Lab web site: http://biorobotics.harvard.edu/


Matei Ciocarlie (Columbia University)

  • Title: Accurate Contact Localization with Few Wires
  • Abstract: In this talk, I will present an overview of robotic tactile sensing research carried out in multiple collaborating labs at Columbia University. I will then present in detail our work on accurate contact detection and localization over complex, 3D geometry. As a general principle, we embed numerous sensors in a volume of soft material, then build a data-driven mapping from the rich data they produce to the variables of interest, such as contact location or indentation depth. We aim for manufacturing methods that can simplify future integration in robotic palms and fingers (e.g. low wire count, no rigid, flat substrates). We have so far demonstrated this general, data-driven approach with three different underlying transduction mechanisms (piezoresistance, optics, and MEMS strain gauges), and I will discuss the unique characteristics of each as well as the lessons learned.
  • Bio: Matei Ciocarlie is a faculty member at Columbia University. His main interest is in reliable robotic performance in unstructured, human environments, focusing on areas such as novel robotic hand designs and control, autonomous and human-in-the-loop mobile manipulation, shared autonomy, teleoperation, and assistive robotics. He is also interested in novel hand designs that combine mechanical and computational intelligence, and that make use of tactile, proprioceptive or range sensing in novel ways. Before joining the Mechanical Engineering faculty at Columbia, Matei was a Research Scientist and then Group Manager at Willow Garage, Inc., and then a Senior Research Scientist at Google, Inc. In these positions, Matei contributed to the development of the open-source Robot Operating System (ROS), and led research projects in areas such as hand design, manipulation under uncertainty, and assistive robotics.


Veronica J. Santos (UCLA)

  • Title: Experiential approaches to artificial haptic perception and decision-making
  • Abstract: In the first part of the talk, I will present a multimodal, microfluidic tactile sensor skin that was developed in collaboration with the University of Washington. The resistive-based sensor uses a liquid metal alloy encapsulated in a PDMS elastomer to measure normal and shear forces, and is highly sensitive to vibration. In the second part of the talk, I will discuss efforts to develop haptic perception and decision-making capabilities for robots using machine learning and reinforcement learning techniques. For the closure of a deformable ziplock bag, I will describe how robot actions are selected based on prior experiences, the current context, and a functional task goal in a resource-conscious manner. I will briefly describe new projects on the perception of motion of a handheld object to infer interactions with the environment, and the development of a multimodal model for locating objects buried in granular media.
  • Bio: Veronica J. Santos is an Associate Professor in the Mechanical and Aerospace Engineering Department at the University of California, Los Angeles, and Director of the UCLA Biomechatronics Lab. Dr. Santos received her B.S. in mechanical engineering with a music minor from the University of California at Berkeley (1999), was a Quality and R&D Engineer at Guidant Corporation, and received her M.S. and Ph.D. in mechanical engineering with a biometry minor from Cornell University (2007). While a postdoc at the University of Southern California, she contributed to the development of a biomimetic tactile sensor for prosthetic hands. From 2008 to 2014, Dr. Santos was an Assistant Professor in the Mechanical and Aerospace Engineering Program at Arizona State University. Her research interests include human hand biomechanics, human-machine systems, haptics, tactile sensors, machine learning, prosthetics, and robotics for grasp and manipulation. Dr. Santos was selected for an NSF CAREER Award (2010), two ASU Engineering Top 5% Teaching Awards (2012, 2013), and an ASU Young Investigator Award (2014), and was selected as an NAE Frontiers of Engineering Education Symposium participant (2010) and a Defense Science Study Group participant (2018-2019). She currently serves on the Editorial Boards for the ASME Journal of Mechanisms and Robotics and the IEEE International Conference on Robotics and Automation.


Robert Haschke (Bielefeld University)

  • Title: Tactile Sensors and Tactile Processing for Human Data Acquisition and Robot Grasping
  • Abstract: At Bielefeld University we have developed a variety of tactile sensors, ranging from large tactile arrays and 3D-shaped tactile fingertips to flexible fabric-based sensors. In the talk, I will introduce the sensor designs, propose a ROS toolbox for tactile data processing and visualization, and provide an overview of some applications for tactile-based grasping and manipulation, including tactile servoing, tactile surface exploration, slip detection, grasp stabilization, and prosthesis control.
  • Bio: Robert Haschke received his diploma and PhD in Computer Science from the University of Bielefeld, Germany, in 1999 and 2004, working on the theoretical analysis of oscillating recurrent neural networks. Since then, his work has focused on robotics, still employing neural methods wherever possible. Robert currently heads the Robotics Group within the Neuroinformatics Group at Bielefeld University, striving to enrich the dexterous manipulation skills of the group's two bimanual robot setups through interactive learning. His fields of research include neural networks, cognitive bimanual robotics, grasping and manipulation with multi-fingered dexterous hands, tactile sensing, and software integration.


Charles Kemp (Georgia Institute of Technology)

  • Title: Haptic Sensing for Assistive Robots
  • Abstract: Mobile manipulators with autonomous capabilities have the potential to provide 24/7 personalized assistance for people with disabilities. Haptic sensing could be valuable to future assistive robots in a number of ways. In this talk, I will present research on haptic sensing that we have conducted at the Healthcare Robotics Lab at Georgia Tech. I will describe our work on data-driven models of forces during robot-assisted dressing, feeding, shaving, and door opening. I will also give an overview of our research on whole-arm tactile sensing for reaching in clutter, including reaching locations around a person’s body. In addition, I will report on our recent work on thermal tactile sensing, which is well-suited for recognizing contact with people.
  • Bio: Charles C. Kemp (Charlie) is an Associate Professor at the Georgia Institute of Technology in the Department of Biomedical Engineering, with adjunct appointments in the School of Interactive Computing and the School of Electrical and Computer Engineering. He earned a doctorate in Electrical Engineering and Computer Science (2005), an MEng, and a BS, all from MIT. In 2007, he founded the Healthcare Robotics Lab ( http://healthcare-robotics.com ). His lab focuses on mobile robots for intelligent physical assistance in the context of healthcare. He has received a 3M Non-tenured Faculty Award, the Georgia Tech Research Corporation Robotics Award, a Google Faculty Research Award, and an NSF CAREER award. He was a Hesburgh Award Teaching Fellow in 2017. His research has been covered extensively by the popular media, including the New York Times, Technology Review, ABC, and CNN.