NeuRobots 2025
Workshop on Neuromorphic Perception for Real World Robotics
In conjunction with IROS 2025
Hangzhou
October 24th in the morning, Room 102A
Neuromorphic perception offers low-power, low-latency sensing through sparse encoding and inherent information compression. For robots, such perception can increase reactivity in unconstrained, dynamic environments (e.g. avoiding obstacles and grasping moving objects), improve mobility by reducing power requirements and reliance on off-board GPU servers, and integrate with biologically inspired artificial intelligence. In the visual domain, event-based sensors eliminate motion blur and data redundancy, enabling accurate state estimation of fast-moving robots and objects and thereby more precise action. Their high dynamic range makes robots more robust and reliable in challenging lighting conditions, and their reduced data output in a stationary environment lowers the demand on computational resources and battery power, further extending the technology's capabilities.
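To make the sparse-encoding argument above concrete, the sketch below models an event stream as (timestamp, x, y, polarity) tuples, the typical output of an event camera, and compares its data volume against dense frames covering the same interval. The sensor resolution, event count, and frame rate here are illustrative assumptions, not measurements from any specific device.

```python
import numpy as np

# Hypothetical event stream: each event is (timestamp_us, x, y, polarity).
# The field layout follows the common DVS convention; the concrete numbers
# (5,000 events in 100 ms, 346x260 resolution) are assumptions for illustration.
rng = np.random.default_rng(0)
n_events = 5_000  # events fired over a 100 ms window in a mostly static scene
events = np.zeros(
    n_events, dtype=[("t", "u8"), ("x", "u2"), ("y", "u2"), ("p", "i1")]
)
events["t"] = np.sort(rng.integers(0, 100_000, n_events))  # microseconds
events["x"] = rng.integers(0, 346, n_events)   # pixel column
events["y"] = rng.integers(0, 260, n_events)   # pixel row
events["p"] = rng.choice([-1, 1], n_events)    # brightness decrease / increase

# Dense baseline: a 30 fps, 8-bit grayscale camera captures ~3 frames in 100 ms.
frame_bytes = 346 * 260 * 3   # full frames, regardless of scene motion
event_bytes = events.nbytes   # 13 bytes per event with this dtype

print(f"frames: {frame_bytes} bytes, events: {event_bytes} bytes")
```

In a mostly static scene the event representation carries a fraction of the frame data, which is exactly the redundancy reduction the paragraph above credits for the power and bandwidth savings; in highly dynamic scenes the gap narrows, since the event rate scales with scene activity.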
This workshop focuses on real-world applications and demonstrations of event-based cameras and neuromorphic sensors, including perception pipelines and data processing, that go beyond solutions tested only in simulation. The potential of neuromorphic sensors in robotics has motivated a growing body of literature, but work that actually demonstrates the advantages of neuromorphic perception over traditional perception in real-world scenarios remains scarce. We encourage the participation and discussion of real-world robotic implementations, task-dependent applications, hybrid systems (combining neuromorphic and traditional sensors, as well as comparisons between them), and online, real-time algorithms demonstrated in real-world conditions. During the workshop we will discuss current experimental trends, difficulties, and general solutions for achieving real-world neuromorphic perception for robots.
Full-length conference papers that have been accepted at IROS or another related conference or journal. Please use this form to submit your paper and proof of acceptance.
2–4 page extended abstracts (excluding references). Please submit your abstract via OpenReview.
A video of the demo with a 1-page description (include a link to the video in your description). Please submit your demo via OpenReview.
Abstracts and demos will be peer-reviewed to ensure a consistent quality standard.
Important note regarding submissions via OpenReview: New profiles created without an institutional email will go through a moderation process that can take up to two weeks. New profiles created with an institutional email will be activated automatically.
This workshop aims to foster the growth of event-based research by gathering researchers in the field and enhancing communication between academia and industry, with the goal of discovering new, bleeding-edge neuromorphic technologies.
Representations for event-based data
Model-based algorithms
Learning-based algorithms
Spiking neural networks
Bio-inspired computational methods
Event-based spatio-temporal feature extraction
Learning methodologies with event data
ASIC and FPGA-based implementations
Novel circuitry designs
Benchmarking and characterisation of event-based cameras
Submission Deadline
August 24, 2025 (AoE)
Decision to Authors
September 21, 2025
Date of the Workshop
October 24, 2025
08:30 - 08:45: Introduction: Organizing Chair
08:45 - 09:05: Rising Star Speaker 1: Luna Gava
09:05 - 09:25: Rising Star Speaker 2: Suman Ghosh
09:25 - 09:45: Debate/Interactive Session with Audience
09:45 - 10:15: Accepted paper Spotlight Talks
10:15 - 11:00: Coffee break, Poster and Demo Session
11:00 - 11:20: Speaker 3: Guido De Croon
11:20 - 11:40: Speaker 4: Mina Khoei
11:40 - 12:00: Speaker 5: Jörg Conradt
12:00 - 12:20: Debate/Interactive Session with Audience
12:20 - 12:30: Closing
University of Tübingen
Italian Institute of Technology
University and ETH Zürich
University of Lübeck
Czech Technical University in Prague
University of Manchester
Jingyue Zhao
Qiyuan Lab