Perspectives on human-aware navigation
Workshop at ICSR 2019, Madrid
November 26, 2019
As robots become better at navigating dynamic environments, their operating areas increasingly include humans. Furthermore, a growing number of robots have a dedicated social task involving humans. With these developments, robots need to acquire skills for sharing a physical space with humans. This includes navigating in a human-friendly way and adhering to social rules. At a minimum, robots need to take the personal space of humans into account and respect social distances. Psychology has a broad discipline studying social distances, called proxemics, and it is relevant to investigate how robots can take our personal space into account. Several studies have investigated the relation between personal space and comfortable approach distances of a robot, with varying results. From a more technological perspective, several navigation algorithms have been developed that take humans into account.
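As a concrete (and deliberately simplified) illustration of proxemics, the following sketch classifies an interpersonal distance into the four zones commonly attributed to Hall's work. The zone boundaries are the widely cited approximate values; as the studies mentioned above show, real comfortable distances vary with context, so a robot would need adaptive thresholds rather than this fixed table.

```python
def proxemic_zone(distance_m: float) -> str:
    """Map a distance in metres to one of Hall's four proxemic zones.

    Boundaries are the commonly cited approximate values; real
    human-robot studies report context-dependent deviations.
    """
    if distance_m < 0.45:
        return "intimate"
    elif distance_m < 1.2:
        return "personal"
    elif distance_m < 3.6:
        return "social"
    else:
        return "public"

print(proxemic_zone(0.3))  # intimate
print(proxemic_zone(1.0))  # personal
print(proxemic_zone(2.0))  # social
```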
The main goal of this workshop is to bring researchers in the field together and allow them to share their views on the topic. Furthermore, several different perspectives on human-aware navigation will be highlighted to try to find common ground. These include a psychological perspective on proxemics and human-aware navigation, as well as a more technological perspective focusing on human-aware navigation algorithms. Additionally, there will be a focus on experimental research in which human-robot proxemics is investigated. Lastly, we aim to write an overview paper on the different perspectives of human-aware navigation together with the workshop attendees and to publish it in a relevant journal.
The target audience is researchers working on robot navigation in human environments. We welcome researchers with different backgrounds: engineering, psychology and anything in between. We also welcome researchers with different levels of expertise.
The workshop is of interest to researchers because it will show different perspectives on human-aware navigation and human-robot proxemics, with the aim of providing new insights. Furthermore, for each perspective we have invited speakers who are all prominent researchers in their field.
Home companion robots need to be able to support users in their daily living activities and to be socially adaptive. They should take users' individual preferences, environments and social contexts into account in order to behave in a socially acceptable manner, gain acceptance into the household and adapt to users' requirements. In this talk, I will share our findings from various human-robot proxemic studies in domestic environments, which suggest that participants' preferences for robot approach distances and orientations do change according to the social context of the robot's approaches.
Based on these findings, a proof-of-concept Context-aware Proxemics Planner was developed. It allows our robot to adapt its approach distances and orientations to the user's personal space, based on contextual information about the task, user and robot.
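The idea of a context-aware proxemics planner can be sketched as a lookup from context to an approach configuration. The contexts, distances and angles below are invented placeholders for illustration, not the values or the design of the planner described in the talk.

```python
# Hypothetical context table: the keys and numbers are illustrative
# placeholders, not results from the studies mentioned above.
APPROACH_PREFERENCES = {
    # (user posture, task): (approach distance in metres,
    #                        approach angle in degrees; 0 = frontal)
    ("seated", "handover"):   (0.6, 30),
    ("seated", "verbal"):     (1.0, 0),
    ("standing", "handover"): (0.8, 30),
    ("standing", "verbal"):   (1.2, 0),
}

def plan_approach(user_posture: str, task: str):
    """Return (distance, angle) for the given context, with a
    conservative default for unknown contexts."""
    return APPROACH_PREFERENCES.get((user_posture, task), (1.2, 0))

print(plan_approach("seated", "handover"))  # (0.6, 30)
```

A real planner would of course learn or adapt such preferences per user rather than hard-coding them.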
Developing robot behavior that appears natural to human observers in dynamically changing environments requires detailed knowledge of how humans interact with their environments. But this knowledge is hard to come by, as many human processes are automatic and subconscious. Furthermore, the knowledge that is available is often piecemeal and qualitative, or worse still, it is unclear what knowledge is needed in the first place.
In this presentation I will show how we approached this problem for human-robot proxemics: by developing novel experimental paradigms that combine subjective evaluations with objective psychophysical measurements, we can test and validate models for human-robot proxemics. In doing so, we and others have learned that Hall’s classical model of personal space is too simple for interacting dynamically with robots. People respect much larger or smaller distances depending on the type of interaction, the type of robot and the activity of the human interactant. Other experiments have shown that in multi-party settings individual personal spaces do not simply add up. By validating models for human-robot proxemics with well-designed experiments, it is possible to develop quantitative models that are as close as possible to “natural” behavior.
Human-aware navigation has received considerable attention in the last decade. Since the first studies, new technologies to detect, model and represent humans have become available, as have techniques to model the space surrounding robots and humans’ personal space. Despite these great advances, it is good practice to consider emerging technologies that could be used in the field of human-aware navigation, as well as widely known technologies that might currently be underused. The talk will focus on two technologies that could be considered underused in the context of human-aware navigation: automated task planners and graph neural networks. Automated planning techniques span a wide range of algorithms that aim to find a suitable sequence of actions that modifies the world such that a goal condition is satisfied. Perhaps because of the additional expertise and computational resources required, only a few works have used task planning for human-aware navigation. Graph neural networks (GNNs), the second technology covered in the talk, are a family of machine learning algorithms that can learn from graph-structured data. Because of the complex and structured nature of the data involved, GNNs are a useful tool for performing regression and classification in human-robot interaction. The talk will provide examples of applications to spark discussion and new research directions.
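To make the GNN idea concrete, the toy example below (not from the talk) runs one message-passing step over a small scene graph in which nodes could represent the robot and nearby people, and edges spatial relations between them. All feature and weight values are arbitrary placeholders; real GNNs learn the weights from data.

```python
import numpy as np

# Adjacency with self-loops: node 0 = robot, nodes 1-2 = humans.
A = np.array([[1, 1, 1],
              [1, 1, 0],
              [1, 0, 1]], dtype=float)

# Row-normalise so each node averages over its neighbourhood.
A_hat = A / A.sum(axis=1, keepdims=True)

# Per-node input features (arbitrary placeholders, e.g. speed, heading).
X = np.array([[0.0, 1.0],    # robot
              [1.0, 0.5],    # human 1
              [0.5, 0.2]])   # human 2

# "Learned" weight matrix (placeholder values).
W = np.array([[0.5, -0.2],
              [0.1,  0.8]])

# One GNN layer: aggregate neighbour features, transform, apply ReLU.
H = np.maximum(A_hat @ X @ W, 0.0)
print(H.shape)  # (3, 2)
```

Stacking several such layers lets information propagate across the graph, which is the basic mechanism behind the GNN-based regression and classification applications the talk refers to.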
We claim that navigation in human environments can be viewed as a cooperative activity, especially in constrained situations. Humans concurrently aid and comply with each other while moving in a shared space. Cooperation helps pedestrians efficiently reach their own goals while respecting conventions such as the personal space of others.
To match human efficiency, a robot needs to predict human intentions and trajectories and plan its own trajectory accordingly in the same shared space. In this work, we present a reactive navigation planner that is able to plan such cooperative trajectories. Sometimes it is even necessary to influence the other person, or even force them to act in a certain way.
Using robust social constraints, potential resource conflicts, the compatibility of human and robot motion directions, and proxemics, our planner is able to replicate human-like navigation behavior not only in open spaces but also in confined areas. Besides adapting the robot's trajectory, the planner can also proactively propose co-navigation solutions by jointly computing human and robot trajectories within the same optimization framework. We demonstrate the richness and performance of the cooperative planner with simulated and real-world experiments on multiple interactive navigation scenarios.
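A highly simplified sketch of the kind of objective such a planner might minimise (not the actual optimization framework from the talk) combines progress towards the robot's goal with a penalty for intruding on a person's personal space. The Gaussian penalty and all constants below are illustrative assumptions.

```python
import math

def navigation_cost(robot_xy, goal_xy, human_xy,
                    social_weight=2.0, sigma=0.8):
    """Illustrative cost: distance to goal plus a Gaussian
    personal-space penalty around the human. All parameters are
    placeholder values, not tuned or taken from the literature."""
    rx, ry = robot_xy
    gx, gy = goal_xy
    hx, hy = human_xy
    goal_cost = math.hypot(gx - rx, gy - ry)
    d_human = math.hypot(hx - rx, hy - ry)
    # Penalty is largest when the robot is on top of the human and
    # decays smoothly with distance.
    social_cost = social_weight * math.exp(-(d_human ** 2) / (2 * sigma ** 2))
    return goal_cost + social_cost

# Comparing candidate positions: a small lateral offset around the
# human costs slightly more path length but less socially.
print(navigation_cost((0, 0), (4, 0), (1, 0)))  # heading straight at the human
print(navigation_cost((0, 1), (4, 0), (1, 0)))  # laterally offset
```

In a cooperative planner such terms would be evaluated jointly over human and robot trajectories; this sketch only shows the trade-off for a single robot position.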
As a stranger approaches us, there comes a point where we start to feel uncomfortable and intruded upon. Our sense that another person is inappropriately far away or close by can be conceived of as a personal space. Although many factors determining the size of personal space have been identified, some important properties, such as the shape, regulation and maintenance of personal space, have rarely been investigated, probably because experimentally varying these properties is particularly challenging.
Recent developments in virtual reality have enabled us to study these properties of personal space in virtual encounters. Subjects are immersed in virtual environments and engage in social interactions with a virtual person. Studying personal space in such VR settings allows us to control for many confounding variables and to manipulate stimuli relatively easily, while maintaining external validity.
In our studies, we have found that personal space has an approximately round shape in real and in virtual environments. Furthermore, personal space is strongly regulated in response to social cues (e.g. angry vs. happy facial expression) and may be enlarged in encounters with uncanny virtual people, such as puppets. Considering our findings, we suggest that personal space is best characterized as a psychological field: an extension of the self which enables the person to interact with the surrounding social environment. This field is highly flexible and adaptable to personal and situational constraints within social interactions.
In daily life, we need to be able to efficiently navigate our social environment. Our brain has developed a plethora of mechanisms that allow smooth social interactions with others, enable understanding of others’ behaviors, and support prediction of what others are going to do next. At the dawn of a new era in which robots might soon be among us in our homes and offices, one needs to ask whether (or when) our brain uses similar mechanisms towards robots. In our research, we examine which factors in human-robot interaction lead to the activation of mechanisms of social cognition and to the attribution of intentionality to interaction partners. We use methods of cognitive neuroscience and experimental psychology in naturalistic protocols in which humans interact with the humanoid robot iCub. Here, I will present the results of several experiments in which we examined the impact of various parameters of robot social behavior on the mechanisms of social cognition. We examined whether mutual gaze, gaze-contingent robot behavior, or the human-likeness of movements influence social attunement. Our results show an interesting interaction between the more “social” aspects of robot behavior and fundamental processes of human cognition.
Date: November 26, 2019
09:30 - 10:00 Coffee and welcome, short introduction of the topics and the program
10:00 - 11:00 Topic 1: Experimental approach on human-robot proxemics
11:00 - 11:15 Coffee break
11:15 - 12:15 Topic 2: Technological perspective on human-aware navigation algorithms
12:15 - 13:30 Lunch break
13:30 - 14:30 Topic 3: Psychological perspective on proxemics, navigation, mental models and social cognition
14:30 - 14:45 Coffee break
14:45 - 16:00 Interactive session: group discussion and start writing structure of overview paper
16:00 - 16:30 Closing the workshop + drinks
The number of elderly people living at home is increasing, and this trend is expected to continue. The Multimodal Elderly Care Systems (MECS) project addresses this through the user-centered design of robotic systems and the development of adaptive technology. Part of this will be to demonstrate that both performance and privacy can be improved by placing sensors such as cameras on a robot companion rather than mounting them permanently in a home.
This project is funded by the Research Council of Norway, IKTPLUSS, under grant agreement 247697.
The FAST project (new Frontiers in Autonomous Systems Technology) aims to develop mobile robots with increased flexibility, enabling them to operate in changing environments. In order to be more mobile and flexible, the robots need so-called ‘semantic world models’. ‘Semantic’ means that a robot can give meaning to observations and recognize situations and objects, enabling it to react more appropriately to specific situations. Robots will become more and more integrated into people’s lives, and people will increasingly interact and engage with robots. Therefore, interaction with people is a key component of the semantic world model. The project is the result of a TU/e High Tech Systems Center survey carried out among Dutch high-tech industries to identify robotics-related challenges and problems for which TU/e’s expertise is required.
The project is funded by Eindhoven University of Technology and the companies Lely Industries, Vanderlande Industries, ExRobotics (ImProvia), Diversey and Rademaker.