About
HUME's vision of the future is one where people are removed from harsh, extreme environments and replaced by teams of smart robots able to do the 'dirty and dangerous' jobs, collaborating seamlessly with each other and with the human operators and experts on-shore. In this new world, remote data collection, fusion and interpretation become central, together with the ability to generate transparent, safe, actionable decisions from these data. We are developing a coherent framework that enables humans and machines to work seamlessly as a team by establishing and maintaining a single shared view of the world and of each other's intents through transparent interaction, robust to highly dynamic and unpredictable maritime environments. The HUME project's ambitious research programme will address fundamental questions in machine-machine and human-machine collaboration, robot perception, and explainable autonomy and AI.
The Prosperity Partnership would build on a 20-year strategic relationship between SeeByte and HWU: SeeByte was originally a spin-out of Heriot-Watt University in 2001 and is now a world leader in maritime autonomy. This grant would facilitate a shift to lower-TRL (Technology Readiness Level) research and development, providing seeding for early-stage research that can have a broader, longer-term and more disruptive impact. The proposed work aims to establish a durable model through which SeeByte and HWU can remain connected, fostering long-term research relationships on projects of interest as they emerge in this rapidly changing field.
Key Challenges and Objectives
Objective 1 (Team): Enhance teaming through research and development of robotic collaboration algorithms for human-machine, machine-machine and human-machine-machine interfaces, establishing shared situational awareness: a common understanding of the task at hand, the environment, and the operator's needs.
Objective 2 (Perceive): Enhance robotic perception through advanced machine-learning methods for semantic tagging of the environment and real-time estimation of sensor performance, with a view to feeding these into the enhanced human-machine-machine teaming.
Objective 3 (Explain): Investigate methods to explain the reasoning behind robotic perception and autonomy, providing greater transparency, which will in turn increase operator confidence and ultimately adoption.