Last updated on: December 05, 2020 | Completed up to Stage 1
To incorporate autonomy into our UAV, we have laid out the following milestones. Each milestone is to be achieved first on our simulation test-bed and then on the hardware. The stages shown in non-bold font have already been achieved by the team.
Manual control (Stage 0)
SLAM Z (Stage 1)
Hover at a set height (Stage 1.1)
Hover with a set heading (Stage 1.2)
Linear-Z mapping (Stage 1.3)
Camera Calibration (Stage 1.3.0)
Color threshold estimation tool (Stage 1.3.1)
Color identification (Stage 1.3.2)
Rotation-Z mapping (Stage 1.4)
Z mapping (Stage 1.5)
SLAM X (Stage 2)
SLAM Y (Stage 3)
Complete SLAM (Stage 4)
Adding thermal imaging camera (Stage 5)
Side notes:
1. In SLAM Z, the vehicle moves linearly along the local Z-axis and rotates about the Z-axis. With these two degrees of freedom, the vehicle must map the landmarks in the environment around it. Once the mapping is done, the vehicle lands at the origin; the user then asks the vehicle to hover at a particular landmark, and the vehicle is expected to use the map to generate set-points that reach the goal in the most efficient manner.
To implement Stage 1.3, we have constructed a stack of cylinders around the vehicle in the following manner (a color-identification sketch for Stages 1.3.1 and 1.3.2 follows these notes):
Figure: Cylinder stack for linear Z mapping (Stage 1.3)
Figure: Cylinder stack for linear + rotational Z mapping (Stages 1.4 and 1.5)
2. Similarly, SLAM X and SLAM Y mean implementing the SLAM algorithm while giving the vehicle the freedom to move along, and rotate about, the X-axis and Y-axis respectively.
3. We have also begun building the basic hardware structure.
4. For the SLAM algorithm itself, we are using the g2o framework to implement a graph-based SLAM approach (a toy illustration of the underlying graph optimization follows these notes).
5. A simpler algorithm, DBSCAN (a density-based clustering algorithm), is also implemented to optimize the obtained graph (see the sketch after these notes).
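Below is a minimal sketch of the color-identification step (Stages 1.3.1 and 1.3.2), assuming OpenCV and a BGR camera frame. The HSV ranges and color names are illustrative placeholders, not our calibrated thresholds from the estimation tool.

```python
import cv2
import numpy as np

# Hypothetical HSV thresholds for the cylinder colors (placeholders only;
# Stage 1.3.1's estimation tool would supply the real values).
COLOR_RANGES = {
    "red":   ((0, 120, 70),   (10, 255, 255)),
    "green": ((40, 70, 70),   (80, 255, 255)),
    "blue":  ((100, 150, 50), (140, 255, 255)),
}

def identify_color(frame_bgr):
    """Return the color whose threshold mask covers the most pixels."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    best_color, best_count = None, 0
    for name, (lo, hi) in COLOR_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        count = cv2.countNonZero(mask)
        if count > best_count:
            best_color, best_count = name, count
    return best_color

# Example: identify_color(cv2.imread("frame.png"))
```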
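The toy example below illustrates the kind of graph optimization g2o performs for us, reduced to the 1D (Z-only) case so the problem stays linear; it is not the g2o API. Nodes are poses, edges are relative measurements (here, three odometry edges plus one loop closure that slightly contradicts them), and the solver minimizes the sum of squared edge residuals.

```python
import numpy as np

# Edges: (i, j, measured z_j - z_i). The odometry edges sum to 3.0 m,
# but the loop closure back to pose 0 insists the vehicle is at 3.2 m.
edges = [(0, 1, 1.0), (1, 2, 1.1), (2, 3, 0.9), (3, 0, -3.2)]
n = 4

# Normal equations H z = b of:  minimize sum((z_j - z_i - m)^2 over edges)
H = np.zeros((n, n))
b = np.zeros(n)
for i, j, m in edges:
    H[i, i] += 1.0; H[j, j] += 1.0
    H[i, j] -= 1.0; H[j, i] -= 1.0
    b[i] -= m
    b[j] += m

H[0, 0] += 1e6            # anchor the first pose at z = 0
z = np.linalg.solve(H, b)
print(z.round(3))          # -> [0. 1.05 2.2 3.15]
# The 0.2 m of disagreement is spread evenly, 0.05 m per edge.
```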
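Finally, a sketch of how DBSCAN (here via scikit-learn) could condense repeated landmark detections into single map entries. The (z, yaw) detections and the eps/min_samples values are illustrative assumptions, not our actual data or parameters.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical raw detections, one (z in m, yaw in rad) row per sighting.
detections = np.array([
    [1.00, 0.10], [1.02, 0.12], [0.98, 0.09],   # landmark A, seen thrice
    [2.51, 1.55], [2.49, 1.57],                  # landmark B, seen twice
    [4.00, 3.00],                                # stray false detection
])

labels = DBSCAN(eps=0.1, min_samples=2).fit_predict(detections)

# Average each cluster into one landmark; DBSCAN labels outliers as -1.
landmarks = [detections[labels == k].mean(axis=0)
             for k in set(labels) if k != -1]
print(landmarks)   # two landmarks; the false detection is discarded
```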