Project: Drone to the Rescue

Learning Objectives

The project is meant to let you integrate concepts learned throughout the semester: representing and tracking cyber and physical state, managing sensing noise, issuing control commands, executing motion planning, applying transformations across coordinate systems, implementing with ROS libraries and infrastructure, and testing and debugging with simulation support.

Important due dates:

Regular submission: April 24th, 5PM

Early submission due date for extra credit: April 21st, 5PM.

This is an individual project. Sharing code or designs with other students is not allowed.

Challenge

For this project we will design a drone module to help rescue a lost hiker, who is wandering aimlessly around the map.

Luckily, the hiker has a radio that continuously reports their location to a nearby tower (which happens to be the same tower from the previous Lab!), so our drone can approach the rescue area and rescue the hiker with our existing path planning capabilities.

However, rescuing the hiker brings additional challenges:

  • Our drone can only access the tower signal while it is flying above 20m. When it flies below 20m, the tower signal can no longer reach the drone. This is problematic for executing the rescue because the hiker can move 2 meters every second. So, once you reach the target area, you need to explore it following a search pattern (of your choice) while still avoiding existing obstacles.
  • The hiker will see the drone and stop moving when the drone flies within 5 meters of them. Furthermore, when the hiker stops moving, they will also ping the drone (through a line-of-sight-only radio) with a message containing their exact location (in world coordinates). The drone should then simply descend to that location to execute the rescue.
  • The drone can only execute the rescue when there are at least 3 meters of free space in every direction. Attempting to land for a rescue under tighter conditions will not result in a successful rescue, so your rescue needs to account for enough open space.
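The 3-meter clearance requirement can be checked against the map before attempting to land. Below is a minimal sketch, assuming the map is available as a 2D occupancy grid of 0/1 cells with a known resolution in meters per cell (the actual /map message format in the simulator may differ):

```python
import math

def landing_zone_clear(grid, cx, cy, resolution, clearance=3.0):
    """Return True if every cell within `clearance` meters of cell (cx, cy)
    is free. `grid` is a 2D list where 0 = free and 1 = obstacle, and
    `resolution` is meters per cell; both are assumptions about the map format.
    """
    r_cells = int(math.ceil(clearance / resolution))
    for dy in range(-r_cells, r_cells + 1):
        for dx in range(-r_cells, r_cells + 1):
            # Only cells whose center lies inside the clearance circle matter.
            if math.hypot(dx, dy) * resolution > clearance:
                continue
            x, y = cx + dx, cy + dy
            # Cells outside the map count as blocked.
            if not (0 <= y < len(grid) and 0 <= x < len(grid[0])):
                return False
            if grid[y][x] != 0:
                return False
    return True
```

An analogous check (or a conservative altitude bound) would be needed for vertical clearance as well, since the constraint is "in every direction."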

A successful rescue consists of landing within 0.5m of the hiker. The sooner you rescue the hiker, the better! Below is a diagram showing you the challenges.

Before getting started

We will be using an updated version of the labsim simulator. To get started, update the simulator:

$ cd ~/labsim
$ git pull
$ catkin build

We will *not* be using the tower node; instead, we will be using a new hiker node. Start by removing the tower node from the launch file and adding the following line:

<node name="hiker_node" pkg="flightcontroller" type="hiker.py" output="screen" />

System Architecture

The architecture of the system is shown below. The blue boxes are nodes that have been given to you or nodes which you have completed in previous labs. The red boxes are nodes you will need to complete.

The system works as follows. A new hiker node publishes the hiker's position on the following topics:

  • tower/hiker/position: The hiker's position in the tower frame, available when the drone is above 20m
  • hiker/exact/position: The hiker's position in the world frame, available when the drone is within 5m of the hiker


To start, you will need to slightly modify your tower_to_map node (labeled as the "Tower Frame to World Frame" node in the graph above) to subscribe to /tower/hiker/position and transform it from the tower frame to the world frame. Then publish the hiker's position in the world frame on the topic /hiker/position. Keep in mind that since the tower has not moved since we last saw it in Lab 8, the transformation remains the same; only the messages passing through have changed.
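As a reminder of what that transformation involves, here is a minimal, ROS-free sketch of the tower-to-world math, assuming the tower pose is a fixed translation plus a yaw rotation. The pose values below are placeholders for illustration; your actual values come from Lab 8:

```python
import math

# Placeholder tower pose in the world frame (assumed values, NOT the real ones).
TOWER_POS = (10.0, 5.0, 0.0)   # translation of the tower frame origin
TOWER_YAW = math.pi / 4.0      # rotation of the tower frame about z

def tower_to_world(p):
    """Transform a point (x, y, z) from the tower frame into the world frame:
    rotate about z by the tower's yaw, then translate by the tower's position."""
    x, y, z = p
    c, s = math.cos(TOWER_YAW), math.sin(TOWER_YAW)
    wx = c * x - s * y + TOWER_POS[0]
    wy = s * x + c * y + TOWER_POS[1]
    wz = z + TOWER_POS[2]
    return (wx, wy, wz)
```

In the node itself, this function would be called inside the /tower/hiker/position subscriber callback, and its result published on /hiker/position.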


To rescue the hiker, you need to develop a specialized rescue node. Some of the topics at your disposal are:

  • /uav/sensors/gps: The drone's position in the world frame.
  • /hiker/position: The hiker's position in the world frame, when the drone is above 20m.
  • /hiker/exact/position: The hiker's position in the world frame, when the drone is within a 5m sphere of the hiker (i.e., Euclidean distance in x, y, and z).
  • /map: The map you are working in.

The rescue node can publish to the following topics:

  • /uav/input/goal: A goal position to which the planner will compute a trajectory using your A* implementation.
  • /cancel/trajectory: This topic cancels the current trajectory. This could come in handy when, for example, the drone's trajectory and the hiker diverge and you need to stop the current trajectory and identify a new one.
  • /uav/input/position: A position the drone will move to in a straight line using the position PID controller. Remember, this flies directly to the position and does not follow any trajectory that avoids obstacles.
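One way to organize rescue.py around these inputs and outputs is as a small state machine over mission stages. Below is a minimal, ROS-free sketch; the state names, the 5m threshold, and the transition conditions are illustrative, and in a real node the observations would come from topic callbacks:

```python
# Mission stages; names are illustrative, not prescribed by the assignment.
APPROACH, SEARCH, INTERCEPT, LAND = "APPROACH", "SEARCH", "INTERCEPT", "LAND"

class RescueStateMachine:
    def __init__(self):
        self.state = APPROACH

    def step(self, dist_to_target, exact_fix, landing_clear):
        """Advance the mission state from simple observations:
        dist_to_target -- distance to the (last known) hiker position,
        exact_fix      -- True once /hiker/exact/position has been received,
        landing_clear  -- True once the 3m clearance check passes."""
        if self.state == APPROACH and dist_to_target < 5.0:
            self.state = SEARCH       # reached the area; begin the sweep
        elif self.state == SEARCH and exact_fix:
            self.state = INTERCEPT    # hiker pinged its exact location
        elif self.state == INTERCEPT and landing_clear:
            self.state = LAND         # clearance satisfied; descend
        return self.state
```

Each state then maps to a different output channel: APPROACH and SEARCH publish to /uav/input/goal, while INTERCEPT and LAND can use /uav/input/position for the final straight-line descent.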


To facilitate the rescue, we have made some changes to the planner and the visualizer nodes.

Enhanced planner. If the planner node gets a new goal, it will replan the trajectory from the current drone position to the new goal. We have also added the topic /cancel/trajectory, which allows you to cancel the current trajectory by publishing an Empty message. Finally, the planner will now perform the trajectory at the height set by the Z component of /uav/input/goal (planning is still over the xy plane, but at the specified altitude).
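Because every new goal triggers a full replan, you may want to rate-limit how often you publish to /uav/input/goal. Below is one possible ROS-free sketch; the distance and time thresholds are illustrative and should be tuned against your planner's actual replan cost:

```python
import math

class GoalThrottle:
    """Decide whether a new goal is worth republishing: only when it has
    moved far enough AND enough time has passed since the last replan."""

    def __init__(self, min_dist=2.0, min_interval=3.0):
        self.min_dist = min_dist          # meters the goal must move (assumed)
        self.min_interval = min_interval  # seconds between replans (assumed)
        self.last_goal = None
        self.last_time = None

    def should_send(self, goal, now):
        # Always send the very first goal.
        if self.last_goal is None:
            self.last_goal, self.last_time = goal, now
            return True
        moved = math.hypot(goal[0] - self.last_goal[0],
                           goal[1] - self.last_goal[1])
        if moved >= self.min_dist and now - self.last_time >= self.min_interval:
            self.last_goal, self.last_time = goal, now
            return True
        return False
```

In the node, `should_send` would gate the publish call inside the /hiker/position callback, with `now` taken from the ROS clock.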

Enhanced visualizer. The visualizer node now subscribes to the /hiker/position and the /hiker/exact/position. When the tower_to_map node publishes the /hiker/position (world frame), the visualizer will display the hiker as a black dot on the map. When the drone moves below 20m and loses signal to the tower, the visualizer will display the last known hiker position as a black dot. When the drone is within 5m of the hiker, and the hiker starts publishing on the topic /hiker/exact/position, the visualizer will display the exact position of the hiker using a blue dot.

Tips:

  • The rescue.py node is all yours. There are many different strategies for a successful rescue; just make sure to define your strategy before you start coding, as you may not need all the topics at your disposal, or may only need them selectively.
  • Define a state machine representing the mission stages (approach, exploration, rescue, ...) to decompose the system functionality. This will help you manage the complexity of the rescue.
  • Reuse as much of the existing codebase as you can; that is why we packaged it into nodes, classes, and functions.
  • Be selective in your use of the planner. Planning a trajectory is expensive. If the goal is updated too often, you will find that your drone spends more time replanning the trajectory than actually following it.
  • Think about what coverage pattern gives you the best chance to catch up with the hiker sooner, and also think about when you should give up the search and start over by reconnecting with the tower.
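As one example of a coverage pattern, here is a simple lawnmower (boustrophedon) sweep over a rectangular search area. This is a plain-Python sketch, not the required pattern; the row spacing would typically be chosen relative to the 5m detection radius so adjacent passes overlap:

```python
def lawnmower_waypoints(x_min, x_max, y_min, y_max, spacing):
    """Generate waypoints sweeping a rectangle in alternating directions.
    `spacing` is the gap between passes; keep it at or below twice the
    detection radius so no strip of the area is missed."""
    waypoints, going_right = [], True
    y = y_min
    while y <= y_max:
        row = [(x_min, y), (x_max, y)]
        # Alternate sweep direction each row to avoid dead transits.
        waypoints.extend(row if going_right else row[::-1])
        going_right = not going_right
        y += spacing
    return waypoints
```

Each waypoint would then be fed to /uav/input/goal (or /uav/input/position, if the area is known to be obstacle-free) one at a time, advancing when the drone gets close, and abandoning the sweep as soon as /hiker/exact/position arrives.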

Delivery and Assessment

Upload 2 items into Collab:

A. A recorded 4-minute video (in mp4 format, under 40MB) by Friday, April 24th, 5PM (unless going for the extra credit -- see below). The video must cover the following items; times are approximate, and the rest of the format is up to you.

1) (5 seconds) Introductions

2) (60 seconds) Explain the overall structure of your rescue.py node

3) (30 Seconds) Explain your selected search pattern using a diagram and other supporting material

4) (120 seconds) Showcase your approach by recording live sessions of the drone rescuing the hiker on three custom maps:

i) an easy one with few obstacles,

ii) a medium one with sparse obstacles that impede direct access to the hiker,

iii) a hard one with many obstacles and few landing spots, but still with feasible paths to the hiker

5) (25 seconds) Describe the most exciting and hardest parts of the project

(Video formatting tip: Dylan recommends using https://handbrake.fr/ to optimize video size with the Very Fast 720p30 preset (1280x720 resolution, 30 frames per second).)

B. All your code in the /src directory and your maps (as a zipped file)


Brief rubric guideline for A and B

  • ~10% for submitting the required materials
  • ~10% for the quality of the explanations of the rescue node structure and for the code structure itself
  • ~10% for the explanation and sophistication of the search pattern
  • ~50% for providing custom maps that abide by the given constraints and for handling those maps successfully
  • ~10% for identifying the fun and hard parts of the project
  • If code and demo do not match, no credit will be given
  • Late submissions will be accepted for 50% of the project grade until April 26th 5PM

Extra Credit

To be considered for a 2% extra credit, upload your presentation to Collab by Tuesday April 21st at 5PM.

We will select the best 10 assignments for the extra credit and ask you to showcase yours live for the class in a Zoom session on April 23rd at 2PM CST (our regular class time). We will let everyone attending the Zoom session vote for the best in the class, to be inducted into the class Hall of Fame :)

Historical and non-relevant side note unless you enjoy retro video games

The problem statement is inspired in part by a game called "Choplifter" that I used to play on the Atari 6000 a few years ago; it looked something like this (not retro, just plain old):