The project is meant to let you integrate concepts learned throughout the semester: representing and tracking cyber and physical state, managing sensing noise, providing control commands, executing motion planning, applying transformations across coordinate systems, leveraging ROS libraries and infrastructure, and testing and debugging with simulation support.
Regular submission: April 24th, 5PM
Early submission due date for extra credit: April 21st, 5PM.
This is an individual project. Sharing of code or design with teammates is not allowed.
For this project we will design a drone module to help rescue a lost hiker who is wandering aimlessly around the map.
Luckily, the hiker has a radio that continuously reports their location to a nearby tower (which happens to be the same tower from the previous lab!), so our drone can approach the rescue area and rescue the hiker with our existing path planning capabilities.
However, rescuing the hiker brings additional challenges:
A successful rescue consists of landing within 0.5m of the hiker. The sooner you rescue the hiker, the better! Below is a diagram showing the challenges.
We will be using an updated version of the labsim simulator. To get started, update the simulator:
$ cd ~/labsim
$ git pull
$ catkin build
We will *not* be using the tower node; instead, we will be using a new hiker node. Start by removing the tower node from the launch file and adding the following line:
<node name="hiker_node" pkg="flightcontroller" type="hiker.py" output="screen" />
The architecture of the system is shown below. The blue boxes are nodes that have been given to you or nodes which you have completed in previous labs. The red boxes are nodes you will need to complete.
The system works as follows. A new hiker node publishes the hiker's position on the following topics:
/tower/hiker/position: The hiker's position in the tower frame, available when the drone is above 20m.
/hiker/exact/position: The hiker's position in the world frame, available when the drone is within 5m of the hiker.

To start, you will need to slightly modify your tower_to_map node (labeled as the "Tower Frame to World Frame" node in the graph above) to subscribe to /tower/hiker/position and transform the position from the tower frame to the world frame, then publish the hiker's position in the world frame on the topic /hiker/position. Keep in mind that since the tower has not moved since we last saw it in Lab 8, the transformation remains the same; only the messages passing through it have changed.
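To make the change concrete, below is a minimal sketch of what the updated node could look like. The geometry_msgs/Vector3 message type, the node name, and the identity transform values are assumptions for illustration only; reuse the actual message type and the tower-to-world rotation and translation from your Lab 8 node.

#!/usr/bin/env python
# tower_to_map.py -- sketch only; message type and transform values are assumptions
import rospy
import numpy as np
from geometry_msgs.msg import Vector3

class TowerToMap:
    def __init__(self):
        # Reuse the tower->world transform you computed in Lab 8.
        # The identity rotation and zero translation here are placeholders.
        self.rotation = np.identity(3)
        self.translation = np.zeros(3)
        self.pub = rospy.Publisher("/hiker/position", Vector3, queue_size=1)
        rospy.Subscriber("/tower/hiker/position", Vector3, self.hiker_callback)

    def hiker_callback(self, msg):
        # Transform the hiker's position from the tower frame into the world frame.
        tower_point = np.array([msg.x, msg.y, msg.z])
        world_point = self.rotation.dot(tower_point) + self.translation
        self.pub.publish(Vector3(world_point[0], world_point[1], world_point[2]))

if __name__ == "__main__":
    rospy.init_node("tower_to_map_node")
    TowerToMap()
    rospy.spin()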
To rescue the hiker, you need to develop a specialized rescue node. Some of the topics at your disposal are:
/uav/sensors/gps: The drone's position in the world frame.
/hiker/position: The hiker's position in the world frame, available when the drone is above 20m.
/hiker/exact/position: The hiker's position in the world frame, available when the drone is within a 5m sphere around the hiker, i.e., 5m Euclidean distance over x, y, and z.
/map: The map you are working in.

The rescue node has the following topics it can publish to:
/uav/input/goal: A goal position to which the planner will compute a trajectory using your A* implementation.
/cancel/trajectory: Cancels the current trajectory. This can come in handy when, for example, the trajectory of the drone and the hiker diverge and you need to stop the current trajectory and identify a new one.
/uav/input/position: A position the drone will fly to in a straight line using the position PID controller. Remember, this flies directly to the position and does not follow any trajectory that avoids obstacles.
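As a starting point, the rescue node's inputs and outputs could be wired up roughly as in the sketch below; it only stores the latest data and sets up the publishers. The message types (Vector3 for positions and goals, OccupancyGrid for /map) and names are assumptions; check the existing labsim nodes for the actual types.

#!/usr/bin/env python
# rescue.py (I/O wiring only) -- sketch; message types are assumptions
import rospy
from std_msgs.msg import Empty
from geometry_msgs.msg import Vector3
from nav_msgs.msg import OccupancyGrid

class RescueNode:
    def __init__(self):
        self.drone_position = None
        self.hiker_position = None        # last position heard via the tower (drone above 20m)
        self.hiker_exact_position = None  # only available within 5m of the hiker
        self.map = None
        # Outputs
        self.goal_pub = rospy.Publisher("/uav/input/goal", Vector3, queue_size=1)
        self.cancel_pub = rospy.Publisher("/cancel/trajectory", Empty, queue_size=1)
        self.position_pub = rospy.Publisher("/uav/input/position", Vector3, queue_size=1)
        # Inputs
        rospy.Subscriber("/uav/sensors/gps", Vector3, self.gps_callback)
        rospy.Subscriber("/hiker/position", Vector3, self.hiker_callback)
        rospy.Subscriber("/hiker/exact/position", Vector3, self.hiker_exact_callback)
        rospy.Subscriber("/map", OccupancyGrid, self.map_callback)

    def gps_callback(self, msg):
        self.drone_position = msg

    def hiker_callback(self, msg):
        self.hiker_position = msg

    def hiker_exact_callback(self, msg):
        self.hiker_exact_position = msg

    def map_callback(self, msg):
        self.map = msg

if __name__ == "__main__":
    rospy.init_node("rescue_node")
    RescueNode()
    rospy.spin()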
To facilitate the rescue, we have made some changes to the planner and the visualizer nodes.

Enhanced planner. If the planner node gets a new goal, it will replan the trajectory from the current drone position to the new goal. We have also added the topic /cancel/trajectory, which allows you to cancel the current trajectory by publishing an Empty message. The final change is that the planner will now perform the trajectory at the height set by the Z component of the topic /uav/input/goal (planning is still over the xy plane, but at the specified altitude).
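For instance, when the hiker has drifted away from the goal you are currently flying to, you could cancel the active trajectory and request a fresh plan at your chosen search altitude. A hedged snippet, continuing the RescueNode sketch above (the method name and the 25m altitude are arbitrary choices, not part of the assignment):

    def replan_to_hiker(self, search_altitude=25.0):
        # Nothing to plan toward until we have heard the hiker at least once.
        if self.hiker_position is None:
            return
        # Stop following the current trajectory...
        self.cancel_pub.publish(Empty())
        # ...then ask the planner for a new trajectory at the chosen altitude;
        # planning is still over the xy plane, but at this z.
        self.goal_pub.publish(Vector3(self.hiker_position.x, self.hiker_position.y, search_altitude))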
Enhanced visualizer. The visualizer node now subscribes to /hiker/position and /hiker/exact/position. When the tower_to_map node publishes /hiker/position (world frame), the visualizer displays the hiker as a black dot on the map. When the drone moves below 20m and loses the signal from the tower, the visualizer displays the last known hiker position as a black dot. When the drone is within 5m of the hiker and the hiker node starts publishing on /hiker/exact/position, the visualizer displays the exact position of the hiker as a blue dot.
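One purely illustrative strategy, continuing the RescueNode sketch above, is to keep replanning toward the last known hiker position while the tower signal is available and switch to a straight-line approach once the exact position arrives. The timer rate, altitude, and structure are assumptions, not a required design; you would call start() after constructing the node.

    def start(self):
        # Re-evaluate the situation a couple of times per second (rate is an arbitrary choice).
        rospy.Timer(rospy.Duration(0.5), self.decide)

    def decide(self, _event):
        if self.drone_position is None:
            return
        if self.hiker_exact_position is not None:
            # Within 5m of the hiker: fly straight onto the exact position to land within 0.5m
            # (remember this ignores obstacles, so only do it when the way is clear).
            self.position_pub.publish(self.hiker_exact_position)
        elif self.hiker_position is not None:
            # Tower signal available: replan toward the last known hiker position.
            # (In practice you would only replan when the hiker has moved noticeably.)
            self.replan_to_hiker(search_altitude=25.0)
        # else: no hiker information yet -- fall back to your chosen search pattern.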
The rescue.py node is all yours. There are many different strategies to succeed with the rescue; just make sure to define your strategy before you start coding, as you may not need all the topics at your disposal, or may only need them selectively.

A. A recorded 4-minute video (in mp4 format, under 40MB) by Friday, April 24th, 5PM (unless going for the extra credit -- see below). The video must cover the following items; the times are approximate, and the rest of the format is up to you.
1) (5 seconds) Introductions
2) (60 seconds) Explain the overall structure of your rescue.py node
3) (30 seconds) Explain your selected search pattern using a diagram and other supporting material
4) (120 seconds) Showcase your approach by recording live sessions of the drone rescuing the hiker on three custom maps:
i) an easy one with few obstacles,
ii) a medium one with sparse obstacles that impede direct access to the hiker,
iii) a hard one with many obstacles and only a few landing spots, but still with feasible paths to the hiker
5) (25 seconds) Describe the most exciting and hardest parts of the project
(Video formatting tip: Dylan recommends using https://handbrake.fr/ to optimize video size with the Very Fast 720p30 preset (1280x720 resolution, 30 frames per second).)
B. All your code in the /src directory and your maps (as a zipped file)
Brief rubric guideline for A and B
To be considered for a 2% extra credit, upload your presentation to Collab by Tuesday April 21st at 5PM.
We will select the best 10 assignments for the extra credit and ask you to showcase your work live for the class in a Zoom session on April 23rd at 2PM CST (our regular class time). We will let everyone attending the Zoom session vote for the best in the class, to be inducted into the class World of Fame :)
The problem statement is inspired in part by a game called "Choplifter" that I used to play on the Atari 6000 a few years ago; it looked something like this (not retro, just plain old).