In this lab you will implement an Extended Kalman Filter (EKF) to localize your robot within a global coordinate frame. You will leverage a top-down camera as your main source of exteroceptive sensing.
The goals of lab 03 are to design an Extended Kalman Filter and use it for online and offline robot localization experiments. The corresponding lab 03 subgoals are listed below:
Top layer addition
Camera setup
EKF Design
Offline localization experiments
Online localization experiments
There are three deliverables for lab 03 which must be submitted on Brightspace.
Lab 03 Video - that documents at least 2 different online localization experiments.
Lab 03 Report - that documents all four subgoals.
Lab 03 Code - your extended_kalman_filter.py file.
The new base code can be found here on our repo on GitHub. There are several key changes to the base code.
robot_gui.py - The code now brings in a video stream, and the video feed will be displayed in your GUI. You can comment out the display code if it runs slowly on your computer, but keep it enabled at first for debugging.
extended_kalman_filter.py - This file is where you will add most of your code. Functions are created for you to fill in. Add more functions as desired. At the bottom you will notice a function for offline EKF that loads in your data and iterates over it for state estimation.
calibration_step_1.py - This file can be used for obtaining images for camera calibration. Change it as you see fit after reading the OpenCV tutorials.
calibration_step_2.py - This file can be used for obtaining camera calibration parameters. Change it as you see fit after reading the OpenCV tutorials.
robot.py - This file was added purely to remove some circular dependencies. The Robot class was moved to this file from robot_python_code.py.
data_handling.py - There are some additional plotting and data loading functions.
You need to add the top layer of the robot. Your wonderful TA has cut and drilled them for you.
Step 1: Pick up your top layer, and 6 hex risers from Prof. Clark. They should be ready Friday, Feb 13th. Unfortunately MakerSpace time conflicts slowed us down.
Step 2: Remove the six screws that attach the second layer of your robot to the existing brass risers. Screw the new risers from Prof. Clark onto the brass risers in place of those screws, then use the screws to attach the top layer to the new risers. See image below. Note, you won't get the lidar until the next lab.
For this lab, you will need to use your web cam to provide global position measurements of your robot.
Step 1: Read up on camera calibration. The OpenCV write up isn't too bad. See link here.
Step 2: Find a black and white checker pattern online and print a copy of it. Make sure it is at least 8.5 x 11 inches. Measure the actual dimensions of each square. Run the OpenCV calibration routine for your webcam. Note the camera_matrix and distance_coefficients. You will need those for step 3.
Step 3: Read up on Aruco tags. The OpenCV write up is linked here. Print one tag and put it on the top of your robot. The Aruco tags need the camera calibration parameters camera_matrix and distance_coefficients from step 2. You can update the parameters in your lab 3 code parameters.py file. The ones in the repo are for another camera model.
Step 4: Then run the robot_gui.py file. Make sure you can see the (distorted) camera view in the GUI. If not, check that the camera_ID parameter in the parameters.py file is set to your camera (e.g. 0, 1, 2). Your Aruco measurements should be printed to the terminal in units of cm. Point your camera at the top of your robot and watch the values change. Ensure they make sense; use a ruler. You may want to adjust your camera matrix elements by hand (it improved things greatly for the Professor).
Step 5: Characterize the camera sensor measurements - i.e. place the camera at the height you want to use for collecting EKF experiment data. Place the robot in a minimum of 5 locations and measure the actual positions and orientations. Compare these with the values output by the Aruco tag measurements. From this data, extract the camera x, y, theta variance parameter(s) that you will need for your Extended Kalman Filter correction step, i.e. to put in your Q_t matrix.
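The Step 5 variance extraction can be sketched as follows. The measurement pairs below are hypothetical placeholders; substitute your own 5+ hand-measured poses and the corresponding Aruco readings.

```python
import numpy as np

# Hypothetical characterization data: true poses measured by hand (ruler and
# protractor) and the corresponding Aruco tag readings, in cm and radians.
# Replace with your own measurements (minimum 5 locations per the lab).
true_poses = np.array([[10.0, 20.0, 0.00],
                       [30.0, 20.0, 0.50],
                       [50.0, 40.0, 1.00],
                       [70.0, 60.0, 1.57],
                       [90.0, 80.0, 3.00]])
aruco_poses = np.array([[10.4, 19.6, 0.03],
                        [29.5, 20.6, 0.46],
                        [50.6, 39.5, 1.05],
                        [69.4, 60.4, 1.53],
                        [90.5, 79.6, 3.04]])

errors = aruco_poses - true_poses

# Sample variance of the measurement error in each state (x, y, theta).
# These become the diagonal entries of Q_t in the EKF correction step.
var_x, var_y, var_theta = errors.var(axis=0, ddof=1)
Q_t = np.diag([var_x, var_y, var_theta])
print(Q_t)
```

A diagonal Q_t assumes the x, y, and theta measurement errors are uncorrelated; if your characterization data suggests otherwise, use the full sample covariance of the errors instead.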
For this section of work, you will design and implement your EKF to estimate states x, y, theta of your robot.
Step 1: PREDICTION - Leveraging the motion model work from Lab 2, design a Prediction step for your Extended Kalman Filter. Specifically decide on the following:
Control input - What is your u_t vector?
Transition Function - What is your g(x_tm1, u_t) function?
Jacobian G_x,t - Derive what all of your partial derivatives should be.
Jacobian G_u,t - Derive what all of your partial derivatives should be.
Covariance R_t - Determine what each element of this matrix should be. Are all elements constants? Functions of other variables?
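One possible prediction step, assuming u_t = [v, w] (linear and angular velocity) and a unicycle transition function, is sketched below. Your Lab 2 motion model may differ; adapt g, both Jacobians, and the noise model to your own design. Here R_t is built by mapping a control-noise covariance M_t into state space through G_u,t, which is one common choice, not the only valid one.

```python
import numpy as np

def ekf_predict(mu, Sigma, u_t, M_t, dt):
    """Sketch of an EKF prediction step for state mu = [x, y, theta].

    Assumes u_t = [v, w] and a unicycle transition g(x_tm1, u_t);
    M_t is the control-noise covariance (variances of v and w).
    """
    x, y, theta = mu
    v, w = u_t

    # Transition function g(x_tm1, u_t)
    mu_bar = np.array([x + v * np.cos(theta) * dt,
                       y + v * np.sin(theta) * dt,
                       theta + w * dt])

    # Jacobian G_x,t = dg/dx (partials w.r.t. x, y, theta)
    G_x = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                    [0.0, 1.0,  v * np.cos(theta) * dt],
                    [0.0, 0.0,  1.0]])

    # Jacobian G_u,t = dg/du (partials w.r.t. v, w)
    G_u = np.array([[np.cos(theta) * dt, 0.0],
                    [np.sin(theta) * dt, 0.0],
                    [0.0,                dt]])

    # R_t expressed in state space via the control Jacobian
    R_t = G_u @ M_t @ G_u.T
    Sigma_bar = G_x @ Sigma @ G_x.T + R_t
    return mu_bar, Sigma_bar
```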
Step 2: CORRECTION - Leveraging the camera characterization work from the camera setup section, design a Correction step for your Extended Kalman Filter. Specifically decide on the following:
Measurement - What is your z_t vector?
Measurement Function - What is your h(x_t) function?
Jacobian H_x,t - Derive what all of your partial derivatives should be.
Covariance Q_t - Determine what each element of this matrix should be. Are all elements constants? Functions of other variables?
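A correction step under one simple design choice is sketched below: the overhead camera measures the full state directly, so z_t = [x, y, theta] plus noise, h(x_t) = x_t, and H_x,t = I. If your measurement function differs, the Jacobian and innovation change accordingly. Note the angle wrapping on the heading innovation, which matters whenever theta crosses the +/- pi boundary.

```python
import numpy as np

def wrap_angle(a):
    """Wrap an angle to [-pi, pi) so heading innovations stay small."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def ekf_correct(mu_bar, Sigma_bar, z_t, Q_t):
    """Sketch of an EKF correction step assuming h(x_t) = x_t, H_x,t = I.

    z_t is the Aruco measurement [x, y, theta]; Q_t comes from the
    camera characterization in the camera setup section.
    """
    H = np.eye(3)
    innovation = z_t - H @ mu_bar
    innovation[2] = wrap_angle(innovation[2])  # theta is an angle

    S = H @ Sigma_bar @ H.T + Q_t              # innovation covariance
    K = Sigma_bar @ H.T @ np.linalg.inv(S)     # Kalman gain
    mu = mu_bar + K @ innovation
    Sigma = (np.eye(3) - K @ H) @ Sigma_bar
    return mu, Sigma
```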
Step 3: IMPLEMENTATION - Now implement your EKF prediction and correction steps in your fork of the lab 3 repo code, in the file extended_kalman_filter.py. Try to use the existing function names so TAs and Professors can read your code quickly when helping out. Feel free to add more functions and variables.
Now you can test your EKF with real data logged from the robot.
Step 1: Pick an environment. Determine how much space you need, what floor type to use, and how you will mount your camera. Determine how much space will be covered by trajectories. Note that the robot path may leave the overhead camera view for some trials (and only the prediction step should kick in).
Step 2: Design a "simple" trajectory for preliminary testing that makes it easy to debug your filter, and one or more for more complex testing.
Step 3: Drive the "simple" trajectory, being sure to log the robot data as well as some form of truth measurements (think about this, there are several ways to accomplish this - none of them perfect).
Step 4: Run a script (e.g. the bottom function of the extended_kalman_filter.py file) that loads your data file and runs your EKF over it. First, set your EKF to run and test only the prediction step of the filter (e.g. modify the code so corrections never happen). Do the state estimates look good? Can you see the confidence ellipses grow? Then add in your correction step. Does performance improve? Note that it might be better to visualize a scaled-up confidence ellipse so it's easy to see even when covariance matrix elements are small.
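A scaled confidence ellipse of the kind Step 4 mentions can be generated from the x-y block of the state covariance; one sketch is below (the function name and the scale parameter are illustrative, not part of the repo code). Plot the returned point arrays with matplotlib's plt.plot.

```python
import numpy as np

def confidence_ellipse_points(mu, Sigma, n_std=3.0, scale=1.0, n=50):
    """Boundary points of the x-y confidence ellipse for covariance Sigma.

    n_std sets the ellipse size in standard deviations; scale is an extra
    visual multiplier so the ellipse stays visible even when covariance
    matrix elements are small. Returns (xs, ys) arrays for plotting.
    """
    cov = Sigma[:2, :2]                   # x-y block of the state covariance
    vals, vecs = np.linalg.eigh(cov)      # principal axis lengths/directions
    t = np.linspace(0, 2 * np.pi, n)
    circle = np.stack([np.cos(t), np.sin(t)])
    # Stretch a unit circle by the per-axis standard deviations, then rotate
    pts = vecs @ (np.sqrt(np.maximum(vals, 0.0))[:, None] * circle)
    pts = mu[:2, None] + n_std * scale * pts
    return pts[0], pts[1]
```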
Step 5: Test your code against these conditions:
Unknown start vs known start: Try the EKF against one of your data files, but modify the hard coded "known" start pose to be at different poses in the workspace. Be sure your initial covariance matrix for the state estimate accommodates the distance between the initial guess and the actual pose of the robot. A good filter will be robust to bad guesses of the initial pose.
In-Frame vs Out-of-Frame: Have parts of your robot paths move in and out of the overhead camera's view. Outside of the camera frame, correction steps should be infeasible. What happens to the covariance matrix associated with the state estimate during those times?
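The in-frame vs out-of-frame behavior amounts to gating the correction on measurement availability. The sketch below is a self-contained example of one full control cycle; it assumes a unicycle prediction model, h(x) = x with H = I, and that your camera code hands the filter z_t = None when no tag is detected (match that convention to however your own code signals a missed detection).

```python
import numpy as np

def ekf_step(mu, Sigma, u_t, z_t, R_t, Q_t, dt):
    """One control cycle: always predict, correct only if z_t exists."""
    x, y, th = mu
    v, w = u_t
    # Prediction (unicycle model -- substitute your Lab 2 model)
    mu = np.array([x + v * np.cos(th) * dt,
                   y + v * np.sin(th) * dt,
                   th + w * dt])
    G = np.array([[1.0, 0.0, -v * np.sin(th) * dt],
                  [0.0, 1.0,  v * np.cos(th) * dt],
                  [0.0, 0.0,  1.0]])
    Sigma = G @ Sigma @ G.T + R_t
    # Correction runs only when the tag is visible to the camera
    if z_t is not None:
        K = Sigma @ np.linalg.inv(Sigma + Q_t)   # h(x) = x, H = I assumed
        mu = mu + K @ (z_t - mu)
        Sigma = (np.eye(3) - K) @ Sigma
    return mu, Sigma
```

With z_t = None every cycle, the state covariance only accumulates process noise, so its trace grows monotonically; the first valid measurement after re-entering the frame should shrink it again. That is exactly the behavior the question above asks you to observe.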
Step 6: Plot your trajectories to be included in your report. Be sure to plot estimated and true states when possible. Include confidence ellipses. Error plots may also be useful. Be able to describe your EKF performance using these plots in your report.
Now that you are confident your EKF is working, let's run it in real time.
Step 1: Start with the robot in view of the camera. The EKF state estimate of the robot (and confidence ellipse) should be visible on the central pane of the GUI. Drive the robot around and be sure the state estimate updates properly on the GUI. Note that the Robot class has a member self.extended_kalman_filter, which should call your code once per robot control cycle.
Step 2: Design a trajectory that pushes the limits of your EKF and can be documented in both video and report plots. Drive the trajectory and log state estimates, truth data, 3rd person video, and screen capture video of your GUI (e.g. with QuickTime).
Step 3: Create a picture-on-picture video that shows both your 3rd person video and the GUI screen captured video. iMovie might work. You will upload this to Brightspace as a deliverable.
Write a formal report, using IEEE format as mentioned on the website lab schedule page. For this lab, be sure to include all plots mentioned above, and more plots if you think they are significant. Assume the reader knows about robotics, but that they are not familiar with our class. Sections should include:
Abstract - This section should provide an overview of the work. There should be a minimum of one sentence that informs the reader about the motivation, method, experiment, and results. Be sure to provide at least two significant quantifiable results that a reader may find interesting.
Introduction - Use at least one paragraph to describe the motivation for the EKF. Use a paragraph to provide an overview of each section of the paper. E.g. “The method section will describe mathematical details of the motion model developed, after which the experimental design section will detail how the model was validated.”.
Method - Describe your EKF design with mathematical equations. Start with what is given to you - i.e. introduce the robot (with an image) and discuss the inputs to your EKF. All design decisions from section 2 should be included. Be sure to define all new variables in text. Equations should not have words, just Greek-letter variables with number and letter subscripts. Number all equations.
Experiment design - Use images, photos, figures to explain the experimental setup and how physical measurements were taken for testing the EKF.
Experimental results - Show all your plots here. Discuss assumptions, problems, successes, etc. Label all plots. Each plot should be referred to at least once in your text discussions.
Conclusion - Present a high level understanding of the performance of your EKF when estimating robot poses. Highlight low and high performance aspects of your robot/EKF system as a whole. Quantify performance claims.
References - Cite key references. You may want to do a little research to cite key textbooks and prior EKF implementations.