I am currently a postdoc at CNRS (LIRMM, IDH team), working with Prof. Abderrahmane Kheddar. I received my Ph.D. from City University of Hong Kong, where my advisors were Prof. Jia Pan and Prof. Yajing Shen. Before coming to CityU, I received my B.S. from North China Electric Power University in 2016 and studied at Xi'an Jiaotong University for two years, where my advisor was Prof. Yugang Duan.
My research interests lie in developing soft tactile sensors and relevant algorithms to help robots better perceive and interact with the physical world.
Email: youcan.yan@lirmm.fr | Google Scholar Page
2025-06-09: I serve as a Topic Coordinator for "Computational Multimodal Sensing and Perception for Robotic Systems," a research topic featured in Frontiers in Robotics and AI.
2025-05-26: I gave a talk at the Institute for Regenerative Medicine and Biotherapy.
2025-04-11: I gave a talk at Inria Defrost Team.
2025-04-10: I serve as an Associate Editor for the 2025 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2025).
2024-11-19: Our recent work on self-decoupled robotic skin has been published in Nature Machine Intelligence and is highlighted in a News & Views article (also featured by CNRS Informatics).
2024-09-26: I gave a talk at Tsinghua Embodied AI Lab.
2024-07-19: I am co-organizing the RSS Workshop 2024 (Noosphere: Tactile Sensing for General Purpose Robot Learning).
2023-08-23: I will serve as an Associate Editor for the 2023 IEEE International Conference on Robotics and Biomimetics (ROBIO 2023).
2023-04-28: I was selected as an RSS Pioneer 2023.
2022-10-14: I received the Outstanding Research Thesis Award (one of 8 awardees at CityU in 2022).
2022-06-07: I have successfully defended my PhD thesis.
2021-12-28: Our research project on soft magnetic skin has been showcased in the inaugural exhibition of HKU Innovation Wing Two.
2021-09-01: I have been awarded the Institutional Research Tuition Scholarship for 2021/9 - 2022/8 at CityU.
2021-08-16: I received the Outstanding Academic Performance Award for 2020/9 - 2021/8 at CityU.
2021-03-10: Our recent work on soft robotic skin has been published in Science Robotics and featured by WIRED Magazine, CityU research stories, and other news outlets.
2024 Nature Machine Intelligence “A soft skin with self-decoupled three-axis force-sensing taxels”, Youcan Yan*, Ahmed Zermane, Jia Pan, Abderrahmane Kheddar*. (link) (SharedIt) (NMI News & Views)
2022 RA-L+ICRA “Tactile Super-resolution Model for Soft Magnetic Skin”, Youcan Yan, Yajing Shen, Chaoyang Song, Jia Pan*. (link) (video)
2021 Science Robotics “Soft Magnetic Skin for Super-resolution Tactile Sensing with Force Self-decoupling”, Youcan Yan, Zhe Hu, Zhengbao Yang, Wenzhen Yuan, Chaoyang Song, Jia Pan*, Yajing Shen*. (link) (video) (ESI Highly Cited Paper, featured by WIRED Magazine and other news outlets)
2021 RA-L+ICRA “Fast Localization and Segmentation of Tissue Abnormalities by Autonomous Robotic Palpation”, Youcan Yan, Jia Pan*. (link) (video)
2021 Advanced Intelligent Systems “Surface Texture Recognition by Deep Learning‐Enhanced Tactile Sensing”, Youcan Yan, Zhe Hu, Yajing Shen*, Jia Pan*. (link) (video)
2018 ICMME “A Finite Element Method of Modelling and Designing of an ACCC Conductor”, Youcan Yan, Yugang Duan*. (link)
We develop a theoretical super-resolution model for our soft magnetic skin, achieving a 15-fold improvement in localization accuracy (from 6 mm to 0.4 mm). Unlike existing super-resolution methods that rely on overlapping signals from neighboring taxels, our model relies only on the local information of a single taxel. It can therefore detect multi-point contact applied to neighboring taxels and work properly even when some of the taxels near the contact position are damaged (or unavailable).
This video shows the online estimation of force location and magnitude for objects of different shapes and sizes using the proposed super-resolution model.
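To illustrate the single-taxel idea, here is a toy 1-D sketch: given the flux measured by one taxel, the contact position is recovered by inverting a forward model via least squares over candidate positions. The dipole-like forward model and all numbers below are my own assumptions for illustration, not the model from the paper.

```python
import numpy as np

def flux_model(pos, taxel=0.0):
    """Hypothetical dipole-like forward model: tangential and normal
    flux components measured at `taxel` for a contact at `pos` (mm)."""
    d = pos - taxel
    return np.array([d / (1 + d**2) ** 2.5, 1.0 / (1 + d**2) ** 1.5])

def localize(b_meas, search=np.linspace(-3, 3, 6001)):
    """Invert the forward model: pick the candidate position whose
    predicted flux best matches the measurement (least squares)."""
    errs = [np.sum((flux_model(p) - b_meas) ** 2) for p in search]
    return search[int(np.argmin(errs))]

# Contact at 0.73 mm is recovered at sub-taxel (0.001 mm grid) resolution
est = localize(flux_model(0.73))
```

Because the normal component fixes the distance and the tangential component fixes the side, the inversion here has a unique minimum; the real three-axis model generalizes this to 2-D contact locations.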
We design a soft tactile sensor that measures normal and shear forces in a decoupled manner and can localize the contact position with an accuracy finer than its physical resolution.
By mounting our sensor at the fingertip of a robotic gripper, we show that robots can accomplish challenging tasks such as stably grasping fragile objects under external disturbance and threading a needle via teleoperation.
We propose an approach that simultaneously localizes and segments hard inclusions (artificial tumors) in artificial tissue via autonomous robotic palpation with a tactile sensor.
Our method proves robust and efficient in both simulation and experiments, providing new insight into the fast detection of tissue abnormalities during robot-assisted minimally invasive surgery (RMIS) and potentially benefiting related surgical tasks such as tumor removal.
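The core intuition behind palpation-based inclusion detection can be sketched as follows: estimate a local stiffness at each palpated point and segment outliers. This is a generic illustration with made-up numbers, not the segmentation algorithm from the paper.

```python
import numpy as np

def stiffness_map(forces, depths):
    # Local stiffness estimate (N/mm) at each palpated grid point
    return forces / depths

def segment_inclusions(stiff, k=2.0):
    # Flag points whose stiffness exceeds mean + k * std of the map
    thresh = stiff.mean() + k * stiff.std()
    return stiff > thresh

# Toy 5x5 tissue patch with one stiff inclusion at the center
depths = np.full((5, 5), 2.0)   # indentation depth (mm)
forces = np.full((5, 5), 1.0)   # reaction force on soft tissue (N)
forces[2, 2] = 4.0              # harder response over the inclusion
mask = segment_inclusions(stiffness_map(forces, depths))
```

In the actual work the robot chooses where to palpate next autonomously, rather than exhaustively probing a fixed grid as in this toy example.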
We report a novel texture recognition method based on a finger-shaped soft tactile sensor and a bi-directional LSTM (Long Short-Term Memory) model with an attention mechanism, achieving recognition accuracies of 97% for Braille characters and 99% for 60 types of fabrics. These results demonstrate the effectiveness of our method in surface texture recognition and its potential benefit to applications such as Braille reading for visually impaired people and defect detection in the textile industry.
The video shows the real-time recognition of the Braille poem Dreams with our tactile sensor mounted on a robot arm and on a human fingertip, respectively.
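The attention mechanism over the recurrent hidden states amounts to a learned soft-weighted pooling of the tactile time series. Below is a minimal NumPy sketch of that pooling step; the shapes, the attention parameter `w`, and the random data are my own illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def attention_pool(h, w):
    """h: (T, D) hidden states over T time steps; w: (D,) attention vector.
    Returns the softmax attention weights and the pooled context vector."""
    scores = h @ w                            # (T,) unnormalized relevance
    alpha = np.exp(scores - scores.max())     # numerically stable softmax
    alpha /= alpha.sum()
    return alpha, alpha @ h                   # weights (T,), context (D,)

rng = np.random.default_rng(0)
h = rng.standard_normal((20, 8))   # e.g. 20 time steps of 8-D tactile features
w = rng.standard_normal(8)
alpha, ctx = attention_pool(h, w)
```

The context vector would then be fed to a classifier head; the attention weights indicate which instants of the sliding contact contributed most to the texture decision.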
Co-organizer of RSS Pioneers Workshop 2024
Co-organizer of RSS 2024 Workshop (Noosphere: Tactile Sensing for General Purpose Robot Learning)
Mentor of Inclusion@RSS 2024
Associate Editor of ROBIO 2023, IROS 2025
Journal Reviewer: Nature Communications, T-RO, TMECH, ToH, TIM, RAM, RA-L, IJCAS, etc.
Conference Reviewer: RSS, ICRA, IROS, RoboSoft, AIM