Brian Scassellati is a Professor of Computer Science, Cognitive Science, and Mechanical Engineering at Yale University and Director of the NSF Expedition on Socially Assistive Robotics. His research focuses on building embodied computational models of human social behavior, especially the developmental progression of early social skills. Using computational modeling and socially interactive robots, his research evaluates models of how infants acquire social skills and assists in the diagnosis and quantification of disorders of social development (such as autism). His other interests include humanoid robots, human-robot interaction, artificial intelligence, machine perception, and social learning.
Dr. Scassellati received his Ph.D. in Computer Science from the Massachusetts Institute of Technology in 2001. His dissertation work (Foundations for a Theory of Mind for a Humanoid Robot), conducted with Rodney Brooks, used models drawn from developmental psychology to build a primitive system that allowed robots to understand people. His work at MIT focused mainly on two well-known humanoid robots named Cog and Kismet. He also holds a Master of Engineering in Computer Science and Electrical Engineering (1995), and Bachelor's degrees in Computer Science and Electrical Engineering (1995) and Brain and Cognitive Science (1995), all from MIT.
Dr. Scassellati’s research in social robotics and assistive robotics has been recognized within the robotics community, the cognitive science community, and the broader scientific community. He was named an Alfred P. Sloan Fellow in 2007 and received an NSF CAREER award in 2003. His work has received five best-paper awards. He was the chairman of the IEEE Autonomous Mental Development Technical Committee from 2006 to 2007, the program chair of the IEEE International Conference on Development and Learning (ICDL) in both 2007 and 2008, the program chair of the IEEE/ACM International Conference on Human-Robot Interaction (HRI) in 2009, and chair of the Annual Meeting of the Cognitive Science Society in 2014.
TITLE: ....
ABSTRACT: ...
Prof. Raphaëlle N. Roy (PhD, Habil.) is Professor of neuroergonomics and physiological computing at ISAE-SUPAERO, University of Toulouse, France. She leads interdisciplinary research at the crossroads of cognitive science, neuroscience, machine learning, and human-machine interaction (HMI). Her main research focus is investigating how to better characterize operators’ mental states in order to enhance HMI and improve safety and performance. To this end, she develops methods to extract and classify relevant features from physiological data. Associate editor of the journal Frontiers in Neuroergonomics and co-founder and vice-president of the French BCI association CORTICO, she has authored more than 50 publications, recently published a public database for the passive BCI community, and organized the first passive BCI competition.
TITLE: Electrophysiological markers for adaptive interaction with robots
ABSTRACT: Electrophysiological signals offer valuable information for human monitoring, providing objective, real-time measurements of mental states. Indeed, thanks to wearable systems that record, for instance, cerebral activity (EEG), cardiac activity (ECG or PPG), or electrodermal activity (EDA or GSR), several mental states can be objectively evaluated during interaction with a complex or even critical system. This talk will give examples of the estimation of cognitive states such as mental workload and mental fatigue during interaction with robots and drones. Preliminary work on closing the loop by adapting the task and the interface to the user's state will also be presented, along with recommendations regarding several steps of the design and implementation of such a loop.
Stefan Karl-Heinz Ehrlich received his Bachelor’s degree in electrical engineering from the Baden-Wuerttemberg Cooperative State University and his Master’s degree and PhD in electrical engineering and computer science from the Technical University of Munich, Germany. In his doctorate, he worked on passive brain-computer interfaces (BCIs) for augmenting human-robot interaction in the form of online assessment and adaptation of robotic systems during interactions with humans. Stefan has also contributed to research on easy-to-use wearable EEG-based neurotechnology and on music-based closed-loop neurofeedback BCIs for affect regulation. As a visiting scientist in the Simonyan Lab at Harvard Medical School, he focuses on developing and implementing BCIs for the treatment of focal dystonia using non-invasive neurofeedback and real-time transcranial neuromodulation.