Call for Abstracts and Posters
FORMAT
Authors are invited to submit extended abstracts in PDF format, following the standard IEEE conference template, by the deadline of October 1st, 2025. Papers already accepted at IROS 2025 may also be presented at the workshop upon submission.
We look forward to your innovative contributions and to facilitating engaging discussions at the workshop.
SUBMISSION INSTRUCTIONS
Manuscripts should be submitted via email to z.wang82@lancaster.ac.uk, with the subject line in the form "[IROS2025 SAT] + Poster Submission + [Your Paper Title]". Each manuscript will be reviewed by at least two reviewers and selected based on novelty, contribution, relevance, technical clarity, and presentation.
A curated selection of submissions will be invited to deliver brief lightning talks during the workshop; the remaining accepted contributions will be presented as posters or live demonstrations. Accepted 2-4 page extended abstracts will be published on the workshop's website. In recognition of outstanding contributions, the workshop will present Best Paper Awards, sponsored by the RAS Technical Committee on Collaborative Automation for Flexible Manufacturing (CAFM), to the top abstracts as determined by peer review.
Please note that the provided poster boards measure 950 mm in width and 2340 mm in height [TBD]. We recommend preparing your poster in A0 size, portrait orientation, to ensure it fits the board. Authors are responsible for printing their posters and bringing them to the workshop. In addition to poster presentations, we encourage live demonstrations or videos to accompany your presentation. Authors requiring additional resources such as extra space, power, or specific equipment for their demos should contact the organizers well in advance to arrange these needs.
Submission Open: 1st August 2025
Submission Deadline: 1st October 2025
Acceptance Notification: 8th October 2025
All deadlines are at 23:59 Anywhere on Earth time.
TOPICS OF INTEREST
Topics of interest include, but are not limited to:
AI-driven intent prediction and shared autonomy (e.g., EMG/EEG-based decoding to anticipate operator goals).
Cognitive-state estimation and adaptive assistance for real-time workload reduction.
Digital-twin teleoperation frameworks that fuse physics simulation with live data for predictive control.
Explainable and trustworthy teleoperation AI to enhance user confidence and regulatory acceptance.
Learning from demonstration, imitation, and reinforcement learning tailored to remote manipulation.
Gaze-, speech-, and gesture-based multimodal command interfaces for hands-free control.
Adaptive impedance and variable-stiffness control for safe contact under uncertain dynamics.
Augmented-reality and mixed-reality overlays to enhance depth perception and situational awareness.
Multisensory feedback illusions (temperature, vibration, auditory, olfactory) for richer embodiment.
Operator-team and multi-operator collaboration models for complex multi-robot missions.
Swarm and continuum-robot teleoperation in confined or unstructured environments.
Resilient teleoperation under degraded or intermittent communications (store-and-forward, predictive buffering).
Human-robot co-learning and lifelong adaptation in long-duration deployments (space, subsea, nuclear).
Fatigue-aware scheduling and ergonomic interface design for extended shifts.
Tele-rehabilitation and assistive-living applications leveraging remote haptics and exoskeletons.
Inclusive and accessible teleoperation interfaces for users with disabilities.
Standardized benchmarks, open datasets, and reproducibility protocols for HIO teleoperation.
Ethical, legal, and socio-technical perspectives on autonomy delegation and human oversight.
Large-language-model (LLM) assistants for natural-language task scripting and tutor-style guidance.
DISCLAIMER
Authors retain the copyright to their submissions, meaning they keep full control over the work (e.g., the right to republish it at conferences or in journals, reuse it, and distribute it).