Research at College of Engineering (COE), China Agricultural University (CAU):
1. Hardware Design for Agricultural Robots
In our lab, we design and build various agricultural robots for different fields and tasks. Our robots serve on farms and in greenhouses, fruit orchards and livestock houses for precision spraying, fertilization, weeding, monitoring, and breeding tasks.
We design and build robots for local and international companies, research institutes and farmers.
2. Deep Learning based Robot Vision for Agricultural Robots
We focus on applying deep learning based algorithms to the perception of plants, weeds and livestock. We work on both the implementation of novel methods and the deployment of these algorithms on our robots. With the perception algorithms, our robots can detect, segment and track various plants and weeds, as well as detect failures while executing spraying or weeding actions.
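As an illustration of one standard building block in such detection pipelines (not our lab's actual code), the sketch below implements greedy non-maximum suppression in NumPy, which merges overlapping box predictions for the same plant or weed before tracking:

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy non-maximum suppression over [x1, y1, x2, y2] boxes.
    Returns indices of kept boxes, highest score first."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        # Intersection of the current top-scoring box with the rest
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        # Drop boxes that overlap the kept one too strongly
        order = rest[iou <= iou_thresh]
    return keep

# Two heavily overlapping detections of one weed plus a separate plant
boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [50, 50, 60, 60]], float)
scores = np.array([0.9, 0.8, 0.7])
print(nms(boxes, scores))  # keeps the 0.9 box and the non-overlapping one
```

In practice a library routine such as torchvision's `ops.nms` would be used; this version only shows the logic.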
3. Localization, Mapping and Autonomous Navigation for Agricultural Robots
We work on the Simultaneous Localization and Mapping (SLAM) problem for agricultural robots in challenging semi-structured agricultural environments using cameras and Lidars. Our robots navigate through farms and orchards autonomously based on GNSS and Lidar maps.
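A basic ingredient of GNSS-based row following is the signed cross-track error, i.e. how far the robot sits to the side of the current waypoint segment. The sketch below (illustrative only; waypoints are assumed to already be in a local planar east/north frame in metres) computes it with a 2D cross product:

```python
import math

def crosstrack_error(pos, wp_a, wp_b):
    """Signed lateral offset of `pos` from the segment wp_a -> wp_b,
    all points given as (east, north) in metres.
    Positive means the robot is left of the path direction."""
    ax, ay = wp_a
    bx, by = wp_b
    px, py = pos
    dx, dy = bx - ax, by - ay
    seg_len = math.hypot(dx, dy)
    # 2D cross product of the path direction with the vector to the robot,
    # normalized by segment length to get a distance in metres
    return (dx * (py - ay) - dy * (px - ax)) / seg_len

# Robot 1 m to the left (west) of a northbound crop row
err = crosstrack_error((-1.0, 5.0), (0.0, 0.0), (0.0, 10.0))
print(err)  # 1.0
```

A steering controller (e.g. Stanley or pure pursuit) would then turn this error into a wheel-angle correction.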
4. Robot Audition
We work on sound source localization and mapping problems using one or more microphone arrays.
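The standard front end for microphone-array localization is estimating the time difference of arrival (TDOA) between microphone pairs. A minimal NumPy sketch of GCC-PHAT (generalized cross-correlation with phase transform), a common choice for this step, is shown below; the synthetic signal and sample rate are illustrative:

```python
import numpy as np

def gcc_phat(sig, ref, fs):
    """Estimate the time delay of `sig` relative to `ref` (seconds)
    via GCC-PHAT: cross-power spectrum whitened to keep phase only."""
    n = len(sig) + len(ref)            # zero-pad to avoid circular wrap-around
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-15             # PHAT weighting
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    # Re-centre so index 0 corresponds to lag -max_shift
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs

# Synthetic check: a noise burst delayed by 25 samples at 16 kHz
fs = 16000
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
delay = 25
y = np.concatenate((np.zeros(delay), x[:-delay]))
tau = gcc_phat(y, x, fs)
print(round(tau * fs))  # 25
```

Given TDOAs from several microphone pairs and the array geometry, the source bearing (and, with motion or multiple arrays, the 3D position) can then be triangulated.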
Research at Australian Centre for Field Robotics (ACFR), The University of Sydney:
1. Intelligent Perception Systems for Agriculture and the Environment
We were involved in several confidential, industry-funded projects; the details are IP sensitive.
Some of these projects were reported in the media:
Research at Centre for Autonomous Systems (CAS), University of Technology Sydney (UTS):
1. Towards Real-Time 3D Sound Sources Mapping with Linear Microphone Arrays
Sound source mapping using Kinect 360 and PS3 Eye sensors.
Supplementary video attachment for our paper: Daobilige Su, Teresa Vidal-Calleja and Jaime Valls Miro, "Towards Real-Time 3D Sound Sources Mapping with Linear Microphone Arrays", in 2017 IEEE International Conference on Robotics and Automation (ICRA 2017), 2017.
An open-source implementation and experimental data are available at: https://github.com/daobilige-su/SSM_LinearArray
2. 3D Sound Source Localisation Using a Handheld PS3 Eye Sensor
Supplementary video attachment for our paper: Daobilige Su, Teresa Vidal-Calleja and Jaime Valls Miro, "Split Conditional Independent Mapping for Sound Source Localisation with Inverse-Depth parametrisation", in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016).