Schedule

July 1st, 2022

All Times in US Eastern Time (UTC-4)

8:20 - 8:30

Alessandro Roncone -- Welcome and Opening Remarks

8:30 - 9:00

Sami Haddadin -- Professor at TUM and founder of Franka Emika, presenting virtually

Title: From Cobots to the first certified Tactile Robot

Abstract: The development of robots that can learn to interact with the world and manipulate the objects in it has emerged as one of the greatest and so far largely unsolved challenges in robotics research. Drawing from our work on torque-controlled lightweight robots toward human-safe tactile robots that can manipulate, fly, or drive, I explain the technological quantum leaps that have recently changed this and pushed the boundaries. In particular, this progress was made possible by human-centered design, soft and force-sensitive control, contact reflexes, and model-based machine learning. By enabling human-robot coexistence, collaboration, and interaction for the first time, this robotic technology has already proven its transformative potential to traditional manufacturing around the globe through the introduction of the first certified tactile robot. Increasingly, this real-world breakthrough is now impacting professional services, domestic applications, medicine, and healthcare.

9:00 - 9:30

Matthew Gombolay -- Assistant Professor at the Georgia Institute of Technology, presenting in person

Title: Are Humans Amazing or…Not? Insights for Robot Learning from Human Demonstration.

Abstract: The field of Learning from Demonstration (LfD) has sought to enable humans to intuitively teach robots novel skills without requiring those humans to be proficient in computer programming or robotics. Years of research have shown that robots can be trained by humans in university laboratories. Yet, outside of these controlled experiments, can humans really transfer skills or knowledge to robots? Our research has shown both that (1) we cannot rely on models of “spherical humans” that are homogeneous and perfect, and (2) with the right representations, we can leverage human suboptimality and heterogeneity. In this talk, I will share the work we are doing in proxemic human-robot interaction, discuss the challenges of working with real humans, and convey principles for the role of the human in robot learning and reinforcement learning.

Break 10:00 - 10:30

10:30 - 11:00

Harold Soh -- Assistant Professor at the National University of Singapore, presenting in person

Title: Do I Trust My Robot to Give Me a PCR Test?

Abstract: In this informal talk, I'll give a brief overview of our recent work in developing robots that humans trust appropriately, that can communicate relevant information, and that have a sense of touch. We believe these are crucial elements for close-proximity human-robot interaction, yet much work remains. In the spirit of the workshop, we'll discuss some of the key challenges in these three areas and potential ways forward.

11:00 - 11:30

Angelique Taylor -- Assistant Professor at Cornell Tech, presenting virtually

Title: Robot-Centric Group Detection and Tracking System (REGROUP)

Abstract: To help the field of Human-Robot Interaction (HRI) transition from dyadic to group interaction with robots, new methods are needed for robots to sense and understand human team behavior. We introduce the Robot-Centric Group Detection and Tracking System (REGROUP), a new method that enables robots to detect and track groups of people from an ego-centric perspective using a crowd-aware, tracking-by-detection approach. Our system employs a novel technique that leverages person re-identification deep learning features to address the group data association problem. REGROUP is robust to real-world vision challenges such as occlusion, camera ego-motion, shadow, and varying illumination, and it runs in real time on real-world data. We show that REGROUP outperformed three group detection methods by up to 40% in precision and up to 18% in recall. We also show that REGROUP's group tracking method outperformed three state-of-the-art methods by up to 66% in tracking accuracy and 20% in tracking precision. We plan to publicly release our system to support HRI teaming research and development. We hope this work will enable the development of robots that can more effectively locate and perceive their teammates, particularly in uncertain, unstructured environments.

Lunch Break 12:00 - 1:30

1:30 - 2:00


Changliu Liu -- Assistant Professor at Carnegie Mellon University, presenting virtually

Title: Cross-platform adaptation for safe and consistent human-robot collaboration

Abstract: Human-robot interaction (HRI) is an important component in improving the flexibility of modern production lines. However, in real-world applications, the task (e.g., the conditions under which the robot needs to operate, such as the environmental lighting, the human subjects to interact with, and the hardware platforms) may vary, and it remains challenging to optimally and efficiently configure and adapt the robotic system across these changing tasks. To address this challenge, this work proposes a task-agnostic adaptable controller that can 1) adapt to different lighting conditions, 2) adapt to individual behaviors and ensure safety when interacting with different humans, and 3) enable easy transfer across robot platforms with different control interfaces. The proposed framework is tested on a human-robot handover task using the FANUC LR Mate 200iD/7L robot and the Kinova Gen3 robot. Experiments show that the proposed task-agnostic controller can achieve consistent performance across different tasks.

2:00 - 2:30

Zackory Erickson -- Assistant Professor at Carnegie Mellon University, presenting virtually

Title: Capacitive Proximity Servoing for Physically Assistive Robotics

Abstract: Accurately sensing the human body is a crucial yet challenging problem for robots that wish to physically interact with and assist people. Sensing the human body using visual features is a common and successful approach in many scenarios, yet visual occlusions are prevalent during close physical human-robot interaction (HRI). In this talk, I will introduce some of our recent work on capacitive sensing that overcomes many visual occlusions and enables robots to sense the human body and track human motion during physical HRI. Through a series of human studies, with both able-bodied participants and people with disabilities, I will show how capacitive servoing can benefit physical HRI and robotic caregiving.

Break 3:00 - 3:30

3:30 - 4:00

Andrea Bajcsy -- PhD Candidate at the University of California, Berkeley, presenting virtually

Title: Practical Safety in Close-Proximity Interaction

Abstract: When robots navigate and manipulate in close proximity with humans, they need to anticipate human actions. However, to stay safe, they also need to be ready for these models to be wrong. In this talk, I will share our latest work on enabling safe interaction in practical ways that avoid overly conservative robot behavior while still maintaining safety guarantees.

4:00 - 5:00

Panel Discussion

Siddhartha Srinivasa, Harold Soh, Changliu Liu, Zackory Erickson, Angelique Taylor & Matthew Gombolay