Today
For Next Time
Slides linked at the bottom of this page.
I have included two scripts to help you get a better sense of how keypoint matching and the SIFT descriptor work in practice. Try running the following demos (make sure you have pulled from upstream first).
rosrun computer_vision_examples match_keypoints.py
First, make sure you understand what the visualization is showing. Next, characterize how the matches change as you move the sliders around. Note that if you want to recompute the matches after using the slider bars, you need to click on the main image window.
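If you'd like to see the underlying idea in code, here is a minimal sketch of a SIFT-based matching pipeline with a ratio test. This is not the code in match_keypoints.py; the image filenames and the 0.75 threshold are placeholders, and the demo's sliders presumably control thresholds along these lines.

import cv2

# Minimal sketch of SIFT keypoint matching with a ratio test.
# Filenames are placeholders; on older OpenCV builds use
# cv2.xfeatures2d.SIFT_create() instead of cv2.SIFT_create().
img1 = cv2.imread('frame1.png', cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread('frame2.png', cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, desc1 = sift.detectAndCompute(img1, None)
kp2, desc2 = sift.detectAndCompute(img2, None)

# For each descriptor in image 1, find its two nearest neighbors in image 2
# and keep the match only if the best one is clearly better than the runner-up.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(desc1, desc2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

vis = cv2.drawMatches(img1, kp1, img2, kp2, good, None)
cv2.imshow('matches', vis)
cv2.waitKey(0)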
rosrun computer_vision_examples visualize_sift.py
This visualization shows the SIFT descriptor that we just covered. The only thing it doesn't do is rotate the descriptor relative to the dominant orientation. Draw in the left pane by clicking and dragging, and make sure you understand why the SIFT descriptor changes the way that it does. Note that in order to reset the sketch, you need to hit the spacebar.
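As a companion to the visualization, here is a small sketch of how you could pull a single 128-dimensional SIFT descriptor out of OpenCV at a hand-picked location; the image filename, keypoint position, and support size below are made up for illustration.

import cv2

# Compute the SIFT descriptor at one hand-picked keypoint.
# The image path and keypoint parameters are placeholders.
img = cv2.imread('sketch.png', cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
# A keypoint centered at pixel (100, 100) with a 32-pixel support region.
kp = cv2.KeyPoint(100, 100, 32)
_, desc = sift.compute(img, [kp])

print(desc.shape)             # (1, 128): 4 x 4 spatial cells x 8 orientation bins
print(desc.reshape(4, 4, 8))  # one 8-bin gradient-orientation histogram per cell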
We've seen ROS bag a few times in this course. Handling images with ROS bag can be a bit tricky. You've already seen this in the context of the scaffolded mini-lab, but here is a step-by-step guide to using ROS bags with images.
To get started, connect to a Neato and then run this command:
rosbag record -a
This will create a bag file with all of the topics currently being published. After about 10 seconds, hit control-c. You will see that you have a bag file on your computer that has a record of all of the ROS messages that were sent during that period. Do an ls -l
on your file to see the size. How much data is being used by this ROS bag? How much data per second?
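Running rosbag info on the file will also report its duration and total size. If you'd rather compute the data rate yourself, here is a small sketch using the rosbag Python API; the bag filename is a placeholder for your own recording.

import os
import rosbag

# Estimate the average data rate of a recording.
# 'my_recording.bag' is a placeholder for your own bag file.
path = 'my_recording.bag'
bag = rosbag.Bag(path)

size_mb = os.path.getsize(path) / 1.0e6
duration = bag.get_end_time() - bag.get_start_time()
bag.close()

print('total size: %.1f MB' % size_mb)
print('duration:   %.1f s' % duration)
print('data rate:  %.2f MB/s' % (size_mb / duration))

The same snippet works unchanged on the compressed-only bag you will record next.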
The key to being able to store sensibly sized bag files is to save only compressed images from the robot. In order to do this, run the following command.
rosbag record -a -x "/camera/image_raw|/camera/image_raw/theora.*|/camera/image_raw/compressedDepth.*"
This command will tell rosbag to record all topics except the ones that match the regular expression above. If you are unfamiliar with regular expressions, you should check out these pages: the Wikipedia article on regular expressions and the syntax for using them in Python.
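To see what the exclusion pattern actually does, here is a quick illustration using Python's re module. The anchors are added explicitly here under the assumption that rosbag matches the pattern against the entire topic name.

import re

# Which topics does the exclusion pattern catch?
exclude = re.compile(
    r"^(/camera/image_raw"
    r"|/camera/image_raw/theora.*"
    r"|/camera/image_raw/compressedDepth.*)$")

for topic in ['/camera/image_raw',
              '/camera/image_raw/compressed',
              '/camera/image_raw/theora',
              '/scan']:
    status = 'excluded' if exclude.match(topic) else 'recorded'
    print('%-32s %s' % (topic, status))

The raw and theora streams get dropped, while /camera/image_raw/compressed (and everything else, such as /scan) is still recorded.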
After about 30 seconds, hit control-c. Compare the size of this bag file with the one you got previously. How much data is recorded per second now?
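If you want to double-check which topics made it into the new bag (beyond what rosbag info prints), here is a sketch using the rosbag Python API; the filename is again a placeholder.

import rosbag

# List the topics stored in the bag to confirm that only the compressed
# image stream was recorded.
bag = rosbag.Bag('my_compressed_recording.bag')
info = bag.get_type_and_topic_info()
for topic, details in sorted(info.topics.items()):
    print('%-40s %-35s %d msgs' % (topic, details.msg_type, details.message_count))
bag.close()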
Playing back your bag file
Before you play back your bag file, you should disconnect from the robot. Remember, the whole point is to replay the robot's experience, not to overlay past robot sensory data on the present sensory data.
If you started roscore
as part of the Neato bringup, you will need to restart roscore now.
In order to play back your bag file all you need to do is run:
rosbag play --clock path-to-bag-file-here
If you pop up rqt_gui and add an image visualization, you should be able to see the image data from your bag file.
rosrun rqt_gui rqt_gui
Now all is well and good, except for the fact that you will not be able to process the compressed image data in your Python programs (there is support for this in C++, but not in Python). To get around this, you can run a special node that simply decompresses the image data for you.
rosrun image_transport republish compressed in:=/camera/image_raw raw out:=/camera/image_raw
That's it! Now you can run any nodes you want and you will be able to process the data just as if it were being broadcast live from the robot.
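For example, here is a minimal sketch of a node that consumes the republished images with cv_bridge and displays them with OpenCV; the node name and window title are arbitrary.

import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image


class ImageViewer(object):
    """Display the raw images coming out of the republish node."""

    def __init__(self):
        rospy.init_node('image_viewer')   # node name is arbitrary
        self.bridge = CvBridge()
        self.frame = None
        rospy.Subscriber('/camera/image_raw', Image, self.on_image)

    def on_image(self, msg):
        # Convert the ROS Image message into an OpenCV BGR array.
        self.frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')

    def run(self):
        while not rospy.is_shutdown():
            if self.frame is not None:
                cv2.imshow('replayed image', self.frame)
            cv2.waitKey(5)


if __name__ == '__main__':
    ImageViewer().run()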