National Science Foundation (NSF), Mind, Machine and Motor Nexus (M3X): "NSF-SNSF: VR-HRC: Virtual Reality-based Multi-Human-Multi-Robot Collaboration in Industrial Environments" (Award Number: 2523599), 9/15/2025 - 8/31/2028, PI: Maria Kyrarini, Total amount: $442,383
National Science Foundation (NSF), Disability and Rehabilitation Engineering (DARE): "Collaborative Research: DARE: A Personalized Assistive Robotic System that assesses Cognitive Fatigue in Persons with Paralysis" (Award Number: 2226165), 10/1/2022 - 9/30/2025, PI: Maria Kyrarini, Total amount: $277,388 (includes REU support of $12,358 and STEM Access for Persons with Disabilities (STEM-APWD) of $43,413)
National Science Foundation (NSF), Cyber-Human Systems (CHS): "WORKSHOP: Doctoral Consortium at the PETRA 2020 Conference" (Award Number: 2022456), 3/31/2020 - 3/31/2021, co-PI: Maria Kyrarini (PI: Fillia Makedon), Total amount: $29,814
School of Engineering's Kuehler Undergraduate Research Award (student: Shreya Chandragiri): "An Intelligent and Interactive Robot Pet", 6/1/2025 - 10/1/2025, PI: Maria Kyrarini, Total amount: $9,479
School of Engineering's Kuehler Undergraduate Research Award (student: Maya Murphy): "Ergonomic Human-Robot Handovers using Surface Electromyogram (sEMG) Sensors", 6/1/2024 - 10/1/2024, PI: Maria Kyrarini, Co-PI: Davoudi, Total amount: $8,475
Faculty-mentored Undergraduate Research Support (2FURS): "A Knowledge-Based Task Planning Approach for Robot Multi-Object Manipulation", 11/1/2024 - 6/30/2025, PI: Maria Kyrarini, Total amount: $1,000
University Research Grant: "Control Framework for a Socially Assistive Robot - Funded by SCU University Research Grant", 1/1/2024 – 12/31/2024, PI: Maria Kyrarini, Total amount: $5,000
School of Engineering's Kuehler Undergraduate Research Award (student: Matt Tognotti): "Personalization in Human-Robot Interaction", 6/1/2023 - 10/1/2023, PI: Maria Kyrarini, Total amount: $8,825
Faculty Student Research Assistant Program (FSRAP): "Developing an Interactive Game with a Small Humanoid Robot", 1/9/2023 - 6/30/2024, PI: Maria Kyrarini, Total amount: $1,000
University Research Grant: "Intelligent Hands-free Multimodal Interface for Human-Robot Interaction", 6/16/2022 - 6/30/2023, PI: Maria Kyrarini, Total amount: $5,600
NSF-SNSF: VR-HRC: Virtual Reality-based Multi-Human-Multi-Robot Collaboration in Industrial Environments - Funded by National Science Foundation - NSF: 2025 - 2028 (Estimated)
My Role: Principal Investigator (September 2025 - Present)
Objective: This joint National Science Foundation - Swiss National Science Foundation (NSF-SNSF) project aims to advance research on multi-human-multi-cobot collaboration in industrial settings. Collaborative robots (cobots) are increasingly being deployed in factories to assist workers with repetitive tasks or heavy lifting. Typically equipped with a graphical user interface and simulation software, these cobots are designed to be easily programmed by non-experts. However, while programming a single robot may be straightforward, coordinating multiple cobots to collaborate effectively with human workers in a factory environment presents challenges. During assembly processes, these robots often work alongside humans, helping to lift heavy components or providing necessary tools and materials as required. To teach cobots their tasks effectively, human operators must have a solid understanding of three-dimensional spatial processes; at the same time, the cobots must learn to interpret and adapt to human actions within their workspace. To address these challenges, this project introduces a Virtual Reality (VR) framework that facilitates collaboration between humans and cobots. The system aims to empower multiple users to interact using hand gestures, eye gaze, and speech within a physics-based VR simulation environment, thereby simplifying the process of teaching robots industrial tasks. The collaboration between Santa Clara University in the USA and the Dalle Molle Institute for Artificial Intelligence of the Scuola Universitaria Professionale della Svizzera Italiana in Switzerland will foster knowledge transfer and provide student researchers with cultural understanding, ultimately strengthening international research collaborations. The project will develop intent detection methods that leverage multimodal human demonstrations to automatically generate actions for multiple cobots. It involves creating an innovative physics-based VR environment tailored to multi-cobot factory settings and employing generative AI techniques to model, generalize, and streamline embodied reasoning between humans and multiple cobots. Additionally, the method incorporates human-in-the-loop approaches to refine the robots' behavior. The project encompasses three key thrusts:
1. Developing a VR interface for multi-human-multi-cobot collaboration, which will facilitate the collection of a multimodal, multi-person dataset.
2. Establishing an intelligent framework that advances research in embodied reasoning and human-in-the-loop methodologies for the refinement of multi-cobot behavior.
3. Evaluating the proposed methods and systems through user studies, including a small pilot study with physical robots in real-world settings.
Collaborative Research: DARE: A Personalized Assistive Robotic System that assesses Cognitive Fatigue in Persons with Paralysis - Funded by National Science Foundation - NSF: 2022 - 2026 (Estimated)
My Role: Principal Investigator (October 2022 - Present)
Objective: With advancements in robotics and artificial intelligence, assistive robotic systems have the potential to provide support and care to people with Spinal Cord Injury (SCI). As robots become as widespread as today's mobile phones, assistive robots can play a significant role in assisting persons with disabilities at home, improving independence and everyday quality of life. The objective of this project is to design and develop an end-to-end personalized assistive robotic system, called iRCSA (Intelligent Robotic Cooperation for Safe Assistance), that recognizes, assesses, and responds to a human's cognitive fatigue during human-robot cooperation. The system focuses on human-robot cooperative tasks in which a human with SCI and a robot cooperate on daily tasks (e.g., cooking). Students who have experienced SCI will be involved in every stage of the project to ensure the acceptability and usability of the proposed system. In addition to the significant impact of this research on improving life independence for persons with disabilities, the project includes the development of new university courses on assistive technologies and summer school programs for K-12 students, so that students gain knowledge of robotics and assistive technologies for their prospective studies in Science, Technology, Engineering, and Math (STEM).
Related Publications: [1][2][3][4][5][6][7][8]
Videos: [1]
Collaborative Cooking
PFI:BIC: iWork, a Modular Multi-Sensing Adaptive Robot-Based Service for Vocational Assessment, Personalized Worker Training and Rehabilitation - Funded by National Science Foundation - NSF: 2017 - 2021
My Role: Postdoctoral Research Fellow (August 2019 - July 2021)
Objective: Automation, foreign competition, and the increasing use of robots to replace human jobs underscore the need for a major shift in vocational training practices toward training for intelligent manufacturing environments, the so-called "Industry 4.0". In particular, vocational safety training using the latest robotic and other technologies is imperative, as thousands of workers lose their jobs or die on the job each year due to accidents, unforeseen injuries, and a lack of appropriate assessment and training. The objective of this project is to build a smart robot-based vocational assessment and intervention system that assesses the physical, cognitive, and collaboration skills of an industry worker while he/she performs manufacturing tasks in a simulated industry setting, collaborating with a robot on a task. The data collected and analyzed come from sensors, wearables, and explicit user feedback, measuring worker movements, eye gaze, errors made, performance delays, human-robot interactions, physiological metrics, and other measures, depending on the task.
Collaborative Assembly (Source)
CHS: Large: Collaborative Research: Computational Science for Improving Assessment of Executive Function in Children - Funded by National Science Foundation - NSF: 2017 - 2021
My Role: Postdoctoral Research Fellow (August 2019 - July 2021)
Objective: The identification of cognitive impairments in early childhood provides the best opportunity for successful remedial intervention, because brain plasticity diminishes with age. Attention deficit hyperactivity disorder (ADHD) is a psychiatric neurodevelopmental disorder that is difficult to diagnose or to distinguish from other disorders. Symptoms include inattention, hyperactivity, and impulsivity, all of which often result in poor performance in school and persist later in life. In this project, an interdisciplinary team of computer and neurocognitive scientists will develop and implement transformative computational approaches to evaluate the cognitive profiles of young children and to address these issues. The project will take advantage of both physical and computer-based exercises already in place in 300 schools in the United States, involving thousands of children, many of whom have been diagnosed with ADHD or other learning disabilities. Project outcomes will have important implications for a child's success in school, self-image, and future employment and community functioning. The PIs will discover new knowledge about the role of physical exercise in cognitive training, including correlations between individual metrics and degree of improvement over time. They will identify important new metrics and correlations currently unknown to cognitive scientists, which will have broad impact on other application domains as well. The PIs will also develop an interdisciplinary course on computational cognitive science and one on user interfaces for neurocognitive experts. Find out more about the Activate Test of Embodied Cognition (ATEC) system.
The ATEC System (Source)
MobILe Physical Human-Robot Interaction for Independent Living (in German: Physische Mensch-Roboter-Interaktion für ein selbstbestimmtes Leben) - Funded by the German Federal Ministry of Education and Research - BMBF: 2017 - 2020
My Role: Research Associate (July 2017 - December 2018)
Objective: The goal of the MobILe project is the research and realization of basic robot skills, both with and without direct physical contact between robot and human. To control the robot in three-dimensional space, the user employs head and eye movements, which are recorded via a headset with motion sensors or glasses with an eye tracker and electrooculography. Augmented reality (e.g., visual representations of the robot's intended actions) is used for interaction with the user. The user-centered interaction design minimizes loss of attention, and a safety system with redundancies ensures functional safety. For basic skills involving physical contact, new control strategies ensure that the interaction continues until a clear override command is issued.
Robot Learning from People with Quadriplegia (Source)
MeRoSy Human-Robot Synergy (in German: Mensch-Roboter Synergie) - Funded by the German Federal Ministry of Education and Research - BMBF: 2015 - 2018
My Role: Research Associate (June 2015 - December 2018) & Project Manager (April 2017 - December 2018)
Objective: The MeRoSy project researches and implements alternative input methods for controlling robots via head movements. The assistance system under development is based on machine learning methods, with the aim of learning to solve new tasks or adapting existing solutions to new constraints. To this end, human problem-solving behavior is analyzed and simulated on the basis of event sequences. Building on existing solutions and the evaluation of human interventions, the robot's capabilities are extended in an evolutionary manner. The special requirements for data protection and data security are taken into account when collecting data on humans and their environment.
Collaborative Assembly (Source)
CORBYS Cognitive Control Framework for Robotic Systems - Funded by EU FP7-ICT: 2011 - 2015
My Role: M.Sc. Thesis Student (September 2013 - July 2014) & Research Assistant (January 2014 - August 2014 & November 2014 - January 2015)
Objective: The focus of the CORBYS project is on robotic systems that have a symbiotic relationship with humans. These systems have to cope with highly dynamic environments, as humans are demanding, curious, and often act unpredictably. CORBYS will design and implement a cognitive robot control architecture that allows the integration of 1) high-level cognitive control modules, 2) a semantically driven self-awareness module, and 3) a cognitive framework for anticipation of, and synergy with, human behavior. These modules will be supported by an advanced multi-sensor system to facilitate dynamic environment perception, enabling the adaptation of robot behavior to the user's variable requirements.
Related Publications: [1]
The CORBYS system (Source)