Welcome to my webpage. I'm a Senior Research Fellow working in Computer Vision at the Queensland University of Technology (QUT). I have a passion for bridging the gap between computer vision and machine learning research and its application to real-world problems.

I'm affiliated with the ARC Centre of Excellence for Robotic Vision (ACRV) and work in the area of Applied Image Classification. I am currently leading QUT's involvement in a Horticulture Innovation Australia project on "Vision Systems, sensing and sensor networks to manage risks and increase productivity in vegetable production systems" and previously worked on the Strategic Investment in Farm Robotics (SIFR) project. These two projects examine ways that robotic vision can automate the detection and identification of plants and crops for agriculture. Other application areas include the automation of environmental monitoring and industrial applications.

Some areas of particular interest for me are the use of deep learning methods to learn features (global and local), efficient deep learning approaches, local feature modelling and session variability modelling.


Our paper on "Automating Analysis of Vegetation with Computer Vision: Cover Estimates and Classification" has been accepted to Ecology and Evolution! Congratulations to everyone involved!

We will be organising a workshop at ICRA 2018 on Robotic Vision and Action in Agriculture; paper submissions close soon.

We have a new book chapter and two new journal papers!
  • Overview of a Mechatronic Design for a Weed-Management Robotic System (Robotics and Mechatronics for Agriculture, CRC Press)
  • A Rapidly Deployable Classification System using Visual Data for the Application of Precision Weed Management (Computers and Electronics in Agriculture)
  • Efficacy of Mechanical Weeding Tools: a study into alternative weed management strategies enabled by robots (IEEE RA-L)
Our journal paper for AgBot II, "Robot for Weed Species Plant-Specific Management", was recently accepted to the Journal of Field Robotics!

Our paper "Peduncle Detection of Sweet Pepper for Autonomous Crop Harvesting - Combined Colour and 3D Information" was a finalist for the Best Automation Paper award at ICRA 2017. Congratulations to everyone involved especially Dr Inkyu Sa and Dr Chris Lehnert!

Our group has four papers accepted to ICRA 2017: three RA-L papers and one regular paper:
  • Mixtures of Lightweight Deep Convolutional Neural Networks: applied to agricultural robotics (IEEE RA-L)
  • Peduncle Detection of Sweet Pepper for Autonomous Crop Harvesting - Combined Colour and 3D Information (IEEE RA-L)
  • Autonomous Sweet Pepper Harvesting for Protected Cropping Systems (IEEE RA-L)
  • Towards Unsupervised Weed Scouting for Agricultural Robotics
Best Paper Award at DICTA 2016 for "Exploiting Temporal Information for DCNN-Based Fine-Grained Object Classification"! Congratulations to everyone involved, especially ZongYuan Ge! The paper can be found here.

"DeepFruits: a fruit detection system using deep neural networks" has been published (August, 2016), a version of the paper can be found here.

Our paper "Sweet Pepper Pose Detection and Grasping for Automated Crop Harvesting" was a finalist for the 2016 ICRA Best Automation Paper Award. Congratulations to everyone involved! We also presented "Visual Detection of Occluded Crop: for automated harvesting" at ICRA 2016 in Stockholm, Sweden. You can see an example of this system working on real farm data here.

Code for mixtures of deep convolutional neural networks (MixDCNN) is available on github here.
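For readers curious about the idea behind MixDCNN before diving into the repository, the sketch below illustrates the core combination step: each component network's class scores are weighted by an "occupation probability" derived from that component's own confidence (the softmax of its best class score), so more confident components dominate the mixture. This is an illustrative toy in NumPy, not the released implementation; the function name and shapes are my own assumptions.

```python
import numpy as np

def mixdcnn_combine(component_logits):
    """Combine per-component DCNN class scores, MixDCNN-style.

    component_logits: array-like of shape (K, num_classes), one row of
    class scores per component network.

    Each component's mixture weight (occupation probability) is the
    softmax over components of its most confident class score.
    Illustrative sketch only -- not the authors' released code.
    """
    logits = np.asarray(component_logits, dtype=float)
    best = logits.max(axis=1)             # each component's top class score
    weights = np.exp(best - best.max())   # numerically stable softmax
    weights /= weights.sum()              # occupation probabilities, sum to 1
    return weights @ logits               # confidence-weighted class scores

# toy example: two components, three classes; the first component is
# more confident, so it dominates the combined scores
combined = mixdcnn_combine([[2.0, 0.1, 0.3],
                            [0.5, 1.0, 0.2]])
```

In the toy example the first component's strong score for class 0 outweighs the second component's preference for class 1, so the mixture predicts class 0.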