The project started during my master's studies at UT-Cluj. The aim was to fly a quadcopter autonomously in a known environment. In my previous project I used a low-cost robot, but this time I went with a ready-to-fly platform, the Parrot AR.Drone 2.0. The drone is designed for mobile applications and games, but it is also well suited for academic purposes.
The AR.Drone had some specific features that were decisive for this project:
- Two cameras (front- and bottom-facing), an ultrasonic sensor, and a WiFi connection
- Low payload (not a requirement for us)
- An easily programmable on-board microcontroller (also not a requirement for us)
We were using the Robot Operating System (ROS), which has drivers for the drone: I used the AutonomyLab driver together with OpenCV 2.7 for image processing. As a novice roboticist, it is easier to start with a simple task to get familiar with the system and its functionality, so I created a small program that makes the drone follow a symbol or tag. The drone detects tags on board, and the movement commands are computed from the reported tag positions. This test program was simply a way to get familiar with the AR.Drone driver; the follower program can be found at the bottom of the page.
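The follower logic can be reduced to a mapping from the detected tag position to a velocity command. A minimal sketch of such a mapping is shown below; it assumes the driver reports the tag centre in 0..1000 image coordinates (as the AutonomyLab driver does in its navdata) and an estimated tag distance. The gains, the target distance, and the `follow_tag` helper itself are illustrative, not the original implementation.

```python
def follow_tag(tag_x, tag_y, tag_distance,
               target_distance=1500, k_yaw=0.002, k_z=0.002, k_x=0.0003):
    """Map a detected tag position to a velocity command.

    tag_x, tag_y: tag centre in the driver's 0..1000 image coordinates
    tag_distance: estimated distance to the tag (units as reported
                  by the driver; values here are illustrative)
    Returns (vx, vz, yaw_rate), e.g. to fill a geometry_msgs/Twist.
    """
    err_x = tag_x - 500           # horizontal offset from image centre
    err_y = tag_y - 500           # vertical offset from image centre
    err_d = tag_distance - target_distance
    yaw_rate = -k_yaw * err_x     # turn toward the tag
    vz = -k_z * err_y             # climb/descend to centre the tag
    vx = k_x * err_d              # close the distance gap
    return vx, vz, yaw_rate
```

In a ROS node, the returned tuple would be published as a Twist message on `cmd_vel` each time a navdata message with a detected tag arrives.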
Flying in indoor & outdoor corridors:
Next, we wanted to fly autonomously along corridors. The drone detects low-dimensional features in the video feed to create a closed-loop controller that guides it through the hallway. We used the vanishing point as the feature, tracked with an Extended Kalman Filter (EKF). More about the EKF and the vanishing point can be found on this tutorial page.
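To illustrate the perception side, a common way to estimate the vanishing point is to extract line segments (e.g. with Canny edge detection followed by a probabilistic Hough transform) and find the point closest to all of them in a least-squares sense. The sketch below shows only that least-squares step, using plain NumPy; it is a simplified stand-in for the actual pipeline, not the original code.

```python
import numpy as np

def vanishing_point(segments):
    """Estimate the vanishing point as the least-squares intersection
    of line segments given as (x1, y1, x2, y2) tuples, e.g. the output
    of cv2.HoughLinesP.

    Each segment defines a line n . p = c with unit normal n; stacking
    one such equation per segment gives an overdetermined linear system.
    """
    A, b = [], []
    for x1, y1, x2, y2 in segments:
        dx, dy = x2 - x1, y2 - y1
        n = np.array([-dy, dx], dtype=float)
        n /= np.linalg.norm(n)                       # unit normal
        A.append(n)
        b.append(n @ np.array([x1, y1], dtype=float))
    p, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return p  # (x, y) of the estimated vanishing point
```

In the full system, this raw estimate would be fed as the measurement into the EKF, which smooths out frame-to-frame jitter in the detected point.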
Outdoor railway track follower
We further developed the initial idea of using vanishing points for flight control. A vanishing point appears where parallel 3D lines converge when projected onto a 2D image, so the next obvious scenario was to apply the method to rail tracks.
Because rail tracks are noisier than corridors, we compared different image pre-processing methods for edge detection. We also extended the control strategy with a PD controller over the drone's yaw angle. The improved perception was tested on both simulated and real video feeds, while the controller was tested in simulation: we used Gazebo to evaluate it on straight and curved rail tracks.
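The PD part of the control strategy can be sketched as follows: the horizontal offset of the vanishing point from the image centre is the error, and the yaw-rate command is a weighted sum of the error and its finite-difference derivative. The class name, gains, and image width below are illustrative assumptions, not the values used in the project.

```python
class YawPD:
    """PD controller driving the yaw rate so the vanishing point
    stays at the horizontal centre of the image (gains illustrative)."""

    def __init__(self, kp=0.004, kd=0.001, img_width=640):
        self.kp, self.kd = kp, kd
        self.cx = img_width / 2.0   # horizontal image centre in pixels
        self.prev_err = 0.0

    def step(self, vp_x, dt):
        """vp_x: vanishing-point x coordinate; dt: time step in seconds."""
        err = self.cx - vp_x                  # pixels off-centre
        derr = (err - self.prev_err) / dt     # finite-difference derivative
        self.prev_err = err
        return self.kp * err + self.kd * derr
```

The derivative term damps the oscillation that a pure proportional controller tends to produce when the track curves and the vanishing point sweeps across the image.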
Use-case application of Environmental Constraint Exploitation for robotic surface treatment
Motion Generation With Contact-Based Environmental Constraints (Thesis link)
Human-like grasping from piles leveraging granular Environmental Constraints
Reactive motion planning with contact events
Assistive robotics with POMDP online solver
Inverted rotary pendulum controlled with an optimistic planning algorithm
Vision-based autonomous navigation for railway inspection with a UAV
UAV sensor noises and system identification
Pick and place MATLAB application with a Melfa RV-2AJ arm
Five-bar mechanism with MATLAB UI and control for drawing