FEMM Hub at UKOMAIN 2025: Exploring Multimodal Data for Manufacturing and End of Life
The Future Electrical Machines Manufacturing (FEMM) Hub was represented at the UKOMAIN 2025 Multimodal Workshop, held at the Barbican Centre in London this September. UKOMAIN brings together researchers working at the interface of computer vision, audio, text and sensor data to explore how multimodal approaches can address complex problems in real-world domains. The workshop was an ideal forum to share FEMM Hub research on action recognition, multimodal inspection and the challenges of ambiguity in manufacturing data.
Two of our researchers presented posters highlighting ongoing FEMM Hub work on electric machine manufacture and end-of-life pathways. We are delighted that Research Assistant Minoru Dhananjaya Jayakody Arachchige was awarded Best Poster at the workshop for his contribution on multimodal inspection of end-of-life components. Alongside this success, our team used the event to represent the Hub more broadly and to connect our activities to the wider multimodal AI community.
⸻
Decoding Ambiguity in Human Actions for Manufacturing
PhD researcher Junxi Zhang presented new work on human action recognition in electric machine manufacturing processes. His research examines how ambiguity arises when different manual operations appear visually similar or overlap in subtle ways, making them difficult for machine learning models to classify reliably. By gathering multimodal data from processes such as thermal crimping during electric machine production and manual disassembly at end of life, his work investigates how combining vision, depth, lidar and wearable sensor data can reduce uncertainty and improve recognition accuracy.
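To illustrate the general idea of combining sensor streams to resolve ambiguous actions, the sketch below shows a simple late-fusion classifier. It is not the model used in this research; the modality feature sizes, the fusion strategy and the number of action classes are illustrative assumptions only.

```python
# Minimal late-fusion sketch for multimodal action recognition.
# NOT the FEMM Hub model: feature dimensions, fusion strategy and the
# number of action classes are illustrative assumptions.
import torch
import torch.nn as nn

class LateFusionActionClassifier(nn.Module):
    def __init__(self, num_actions: int = 5, embed_dim: int = 64):
        super().__init__()
        # One lightweight encoder per modality (assumed pooled feature sizes).
        self.rgb_enc = nn.Linear(512, embed_dim)    # pooled video features
        self.depth_enc = nn.Linear(256, embed_dim)  # pooled depth features
        self.lidar_enc = nn.Linear(128, embed_dim)  # pooled point-cloud features
        self.imu_enc = nn.Linear(64, embed_dim)     # wearable sensor features
        # Fuse by concatenating modality embeddings, then classify.
        self.head = nn.Sequential(
            nn.ReLU(),
            nn.Linear(4 * embed_dim, num_actions),
        )

    def forward(self, rgb, depth, lidar, imu):
        fused = torch.cat(
            [self.rgb_enc(rgb), self.depth_enc(depth),
             self.lidar_enc(lidar), self.imu_enc(imu)], dim=-1)
        return self.head(fused)  # logits over candidate actions

# Example: one batch of pre-pooled features from each sensor stream.
model = LateFusionActionClassifier()
logits = model(torch.randn(8, 512), torch.randn(8, 256),
               torch.randn(8, 128), torch.randn(8, 64))
print(logits.shape)  # torch.Size([8, 5])
```

The point of the sketch is simply that evidence from several sensors is pooled before a decision is made, so actions that look alike in one modality can still be separated by another.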
This research contributes to FEMM Hub objectives by addressing the need for robust monitoring of human activities in both production and disassembly contexts. Reliable action recognition can be used to inform operator training systems, provide feedback on task execution, and underpin assembly and disassembly tracking platforms that capture knowledge from human practice and transfer it into future automation.
⸻
Multimodal Inspection of End-of-Life Components
Research Assistant Minoru Dhananjaya Jayakody Arachchige presented a poster on multimodal inspection of electric machine components at end of life. His work explores how sensory data can be collected and interpreted in ways that mimic the exploratory behaviour of skilled human inspectors. A direct perception approach, using fixed multimodal rigs, can provide consistent views of components, while an active perception approach allows the system to adjust position or viewpoint to uncover subtle defects, much as a person would move around a part to judge its condition.
This work also investigates how large language models can be used to integrate multimodal sensor inputs and generate meaningful feedback, observations and decision support. By combining data from vision, depth, acoustic and non-destructive sensors within an LLM framework, inspection systems can produce explanations and recommendations that are both technically useful and accessible to human operators. This approach demonstrates how cyber-physical inspection methods can evolve into sensorimotor-style systems that improve decision making around the reuse, remanufacture or recycling of electric machine components. Minoru’s recognition with the Best Poster Award highlights both the quality of this work and its relevance to the challenges of end-of-life manufacturing.
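As a rough illustration of how sensor evidence might be packaged for a language model to reason over, the sketch below turns hypothetical inspection readings into a structured prompt and asks for a reuse, remanufacture or recycle recommendation. The sensor fields, thresholds and the `ask_llm` callable are stand-ins, not the system described in the poster.

```python
# Illustrative sketch of turning multimodal inspection readings into an
# LLM prompt for decision support. The sensor fields and the `ask_llm`
# callable are hypothetical stand-ins, not the FEMM Hub system.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class InspectionReadings:
    visual_defects: List[str]       # e.g. findings from an image model
    surface_deviation_mm: float     # e.g. from a depth / 3D scan
    acoustic_anomaly_score: float   # e.g. from an acoustic / NDT test

def build_prompt(part_id: str, r: InspectionReadings) -> str:
    """Summarise the sensor evidence as structured text for the LLM."""
    return (
        f"Component {part_id} end-of-life inspection evidence:\n"
        f"- Visual defects: {', '.join(r.visual_defects) or 'none detected'}\n"
        f"- Max surface deviation: {r.surface_deviation_mm:.2f} mm\n"
        f"- Acoustic anomaly score (0-1): {r.acoustic_anomaly_score:.2f}\n"
        "Recommend one of: reuse, remanufacture, recycle. "
        "Explain the reasoning in plain language for an operator."
    )

def recommend(part_id: str, readings: InspectionReadings,
              ask_llm: Callable[[str], str]) -> str:
    # `ask_llm` wraps whichever LLM service is available.
    return ask_llm(build_prompt(part_id, readings))

# Example with a placeholder "LLM" that simply echoes the prompt.
demo = recommend("stator-042",
                 InspectionReadings(["winding discolouration"], 0.35, 0.12),
                 ask_llm=lambda prompt: f"[LLM response to]\n{prompt}")
print(demo)
```

Structuring the evidence in this way keeps the LLM's output grounded in the measurements while still letting it produce operator-friendly explanations.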
⸻
By contributing to UKOMAIN 2025, the FEMM Hub demonstrated its commitment to applying multimodal AI and inspection techniques to the manufacturing and end-of-life challenges of electric machines. We were able to showcase work from our researchers, highlight the role of multimodal datasets in addressing ambiguity, and connect with a wider community working on the future of multimodal intelligence. With the award for Best Poster underscoring the impact of our research, these activities help strengthen the Hub’s position at the forefront of digital and sustainable manufacturing.
For more information about the FEMM Hub’s work, or to explore opportunities for collaboration, please contact the team at l.j.farnsworth@sheffield.ac.uk.