AdMaLL: Advanced 3D Mapping and Long-term Localisation with Semantic Understanding and Terrain Assessment for Robot Navigation in GPS-denied Environments

Dr. Kevin Li Sun

Principal Investigator

Dr. Jaime Pulido Fentanes

Co-PI (Industrial)

Dr. Zhi Yan

Advisory Board

Marwan Taher

working on multi-sensor perception and mapping

Ahmed Abbas

working on robot localisation

Christopher Wild

working on terrain assessment

Jiahui Liu

working on anomalous object detection

Weihang Zhang

working on visual-based navigation

Zhicheng Zhou

working on lidar-based global localisation

News - September 2021 - IROS presentation of the V-T&R navigation paper

Paper: [PDF]

News - July 2021 - Field testing on West Wittering beach

We have tested our mapping, localisation, and navigation system in a Mars-like environment (West Wittering beach).

News - April 2021 - Submap and MCL testing

In this video, the multi-sensor mapper and NDT-MCL localisation are tested on the Sheffield campus. Failure cases were identified and fixed; the system is now ready for field testing.


Quarterly Deliverable - Q4 [PDF]


News - February 2021 - Annual review video is released

The annual review demo videos have been delivered to the FAIR-SPACE Hub.

News - January 2021 - Map-based relocalisation is integrated

We have integrated the NDT map representation and NDT-MCL (Monte Carlo Localisation), with a submap implementation, on the Jaguar platform. The robot is able to re-localise (in 6-DOF) within a previously built map, robustly and smoothly, in large-scale unstructured environments.

Task 2.2 and the major part of Task 2.3 (NDT map implementation and NDT-MCL integration) are complete. Note that the semantic mapping component of Task 2.3 is still under development.
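For intuition, here is a minimal sketch of one NDT-MCL update cycle: particles are propagated by odometry, weighted by scoring the current lidar scan against the Gaussians of the NDT map, and resampled when the particle set degenerates. The 2D pose, the `lookup` helper, and the noise scales are illustrative simplifications (the integrated system is 6-DOF), not the project's actual code.

```python
import numpy as np

# Illustrative NDT-MCL update step (2D for brevity; the deployed system is 6-DOF).
# `ndt_map.lookup(point)` is a hypothetical helper returning the map cell
# (with .mean and .cov attributes) containing the point, or None outside the map.

def ndt_likelihood(scan_xy, pose, ndt_map):
    """Score a lidar scan against an NDT map: each point is evaluated under
    the Gaussian of the map cell it falls into."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    pts = scan_xy @ np.array([[c, s], [-s, c]]) + np.array([x, y])  # scan -> map frame
    score = 0.0
    for p in pts:
        cell = ndt_map.lookup(p)
        if cell is not None:
            d = p - cell.mean
            score += np.exp(-0.5 * d @ np.linalg.solve(cell.cov, d))
    return score

def mcl_update(particles, odom_delta, scan_xy, ndt_map, rng):
    """One predict-weight-resample cycle of Monte Carlo Localisation."""
    n = len(particles)
    # Predict: propagate each particle by odometry plus Gaussian noise
    # (a real motion model composes the delta in each particle's own frame).
    particles = particles + odom_delta + rng.normal(scale=[0.05, 0.05, 0.02],
                                                    size=particles.shape)
    # Weight: NDT likelihood of the current scan at each particle pose.
    w = np.array([ndt_likelihood(scan_xy, p, ndt_map) for p in particles])
    w /= w.sum() + 1e-12
    # Systematic resampling when the effective sample size degenerates.
    if 1.0 / np.sum(w ** 2) < 0.5 * n:
        u = (rng.random() + np.arange(n)) / n
        idx = np.minimum(np.searchsorted(np.cumsum(w), u), n - 1)
        particles, w = particles[idx], np.full(n, 1.0 / n)
    return particles, w
```

The submap machinery sits on top of this loop: the filter is simply rebased onto whichever local NDT submap the pose estimate currently falls in, which keeps the likelihood evaluation bounded in large-scale maps.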

Quarterly Deliverable - Q3 [PDF]

News - November 2020 - Visual-based navigation stack is released

We released the navigation stack for long-term visual navigation. Our approach can operate from day to night using a single map and can repeat complex manoeuvres.

GitHub: https://github.com/kevinlisun/jaguar_nav
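As a sketch of the appearance-only repeat idea: the robot matches live image features to those stored at the current taught keyframe and steers to cancel the horizontal pixel offset, so no metric map is needed. The ORB/BFMatcher choice, the fixed gain, and the function below are illustrative assumptions, not the released jaguar_nav implementation.

```python
import numpy as np
import cv2

# Illustrative repeat-phase step for appearance-only visual teach-and-repeat.
# The taught path is assumed to be stored as (keypoints, descriptors) per keyframe.

orb = cv2.ORB_create(1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def repeat_step(frame_gray, taught_kps, taught_desc, gain=0.002):
    """Return an angular-velocity command that cancels the median horizontal
    offset between live features and the current taught keyframe's features."""
    kps, desc = orb.detectAndCompute(frame_gray, None)
    if desc is None or taught_desc is None:
        return 0.0  # no features detected: keep current heading
    matches = matcher.match(desc, taught_desc)
    if not matches:
        return 0.0
    # Horizontal offsets of matched features encode the lateral/heading error
    # relative to the taught traversal; the median rejects outlier matches.
    offsets = [kps[m.queryIdx].pt[0] - taught_kps[m.trainIdx].pt[0]
               for m in matches]
    return -gain * float(np.median(offsets))
```

Day-to-night operation with a single map then hinges on the feature matching remaining reliable under illumination change, rather than on this control loop itself.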

News - September 2020 - Multi-sensor mapping demo

Tasks 1.1, 1.2, and 2.1 (multi-sensor installation, calibration, and multi-sensor mapping) are complete.

Quarterly Deliverable - Q2 [PDF]


News - September 2020 - Second field test and data collection

Visual-based navigation over various terrains. The second field test took place at Longshaw National Trust, Sheffield, UK.

News - August 2020 - First field test

Visual-based navigation in unstructured environments. The first field test took place at Tankersley, Sheffield, UK.

News - August 2020 - New demo of visual navigation

Low-level control and teleoperation are implemented, and the visual navigation stack is integrated with the ZED camera and the Jaguar robot. Autonomous visual teach and repeat (monocular and appearance-only) is demonstrated at Romanwharf, Lincoln.


News - July 2020 - New publication

Our EU multi-sensor robocar dataset paper has been accepted at IROS 2020.

The dataset and source code have been released: [Paper] [GitHub].

News - June 2020 - CAD model is finalised

The hardware design of the multi-sensor perception system is finalised. Our perception system consists of an Ouster64 lidar, a ZED2 stereo camera, and an Xsens 750 IMU.

Quarterly Deliverable - Q1 [PDF]

News - January 2020 - New publication

Our robot localisation research was presented at ICRA 2020.

The proposed approach is able to localise the robot using a single lidar sensor within 2 seconds on average in a large-scale environment (0.5 km²). [Presentation]