Warmup

For the warmup project we will be implementing simple reactive behaviors on the Neato.

Learning Goals:

    • Gain familiarity with ROS
    • Brush up on Python
    • Learn about processing data from the laser range finder
    • Learn to program robot behaviors using reactive control strategies
    • Learn about finite-state robot control

Project Format:

Future projects will be done in teams; however, for the first project I am asking that every student turn in their own assignment. You will be turning in your project (both code and writeup) via Github. Later in the course you will have a lot of freedom to choose a particular project topic that interests you and your team; however, for this first project I want everyone to do something very similar to what I have outlined below. I have included several extensions to the basic project that I hope will keep students who are coming in with more background knowledge engaged.

The Project:

Your goal in this project will be to program the Neato to execute at least two of the three behaviors below. You are encouraged to be as creative as possible in this assignment. If you want to substitute another behavior for one of the following, just let me know! Also, each of these behaviors has both a fairly straightforward implementation and a more sophisticated one; see the going beyond sections for more information on the more sophisticated approaches. You should be spending about nine hours on this assignment, so if you find yourself breezing through the required portions I recommend that you push yourself a bit further! The flip side is that if you find yourself stuck or having a difficult time making progress, please send me an e-mail so we can chat about what you are finding difficult.

Code Structure:

Your code should be placed within a ROS package called warmup_project. If you want to structure your code with more than one package, make sure to document the additional packages in your project writeup.

Neato Lidar Diagram:

This diagram should help you with the project. It shows the angles for the laser range data coming from the Neato and how they map onto the Neato's physical layout.

Wall Following:

For this behavior your goal will be to pilot the Neato near a wall (e.g. using the teleoperation keyboard node... or just carry it!) and have the Neato move forward while aligning its direction of motion to be parallel to the nearest wall.

To get started let's draw a simple picture of the situation.

Building upon this simple picture, fill out what you can measure from your robot's sensors. What is the "goal" of your controller?

Some hints:

    • Draw lots of pictures. Make sure you understand the geometry of the problem.
    • A fairly straightforward way to attack the problem is by using proportional control (a minimal sketch follows these hints). If you want to do something more sophisticated you may want to look into PID control (see the going beyond section).
    • Sometimes various laser range measurements might not be present on every scan. In the diagram above I selected two specific laser measurements to make the problem easier; however, you should not limit yourself to just these measurements. You will probably want to make your code robust by using multiple measurements (for redundancy).
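
To make the proportional-control hint concrete, here is a minimal sketch (a starting point, not a finished solution). It assumes a wall on the robot's left, a scan indexed in degrees with 0 straight ahead and 90 to the left (check this against the lidar diagram), that 0.0 marks an invalid reading, and that the /scan and /cmd_vel topic names and gain values match your setup.

    #!/usr/bin/env python
    """Minimal proportional wall-follower sketch."""
    import rospy
    from sensor_msgs.msg import LaserScan
    from geometry_msgs.msg import Twist

    class WallFollower(object):
        def __init__(self):
            rospy.init_node('wall_follower')
            self.pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
            rospy.Subscriber('/scan', LaserScan, self.on_scan)
            self.k_p = 0.5            # proportional gain (tune experimentally)
            self.forward_speed = 0.1  # m/s

        def on_scan(self, scan):
            # The rays at 45 and 135 degrees both point at a wall on the left.
            # When the robot is parallel to the wall the two distances match.
            front_left = scan.ranges[45]
            back_left = scan.ranges[135]
            cmd = Twist()
            cmd.linear.x = self.forward_speed
            if front_left > 0.0 and back_left > 0.0:  # assume 0.0 means no valid return
                # Positive error means the nose is angled away from the wall,
                # so turn left (positive angular.z) to re-align, and vice versa.
                error = front_left - back_left
                cmd.angular.z = self.k_p * error
            self.pub.publish(cmd)

        def run(self):
            rospy.spin()

    if __name__ == '__main__':
        WallFollower().run()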

Going beyond (some suggestions, but feel free to be creative):

    • Use a PID controller (a minimal PID sketch appears after this list).
    • Allow the user to specify a target distance between the robot center and the wall. Your code should follow the wall at the specified distance. You may find a finite-state controller to be a useful way to attack this problem (where one state is wall following and the other is adjusting the distance to the wall).
    • Handle 90 degree turns gracefully (either by continuing across the gap or following the turn and continuing to do wall following).
    • We will be learning about this later in the course, but if you want you can look into using OpenCV's Hough Transform to do wall detection.
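
If you do try the PID route, a minimal controller class might look like the sketch below; the gains and the use of wall-clock time deltas are assumptions you would tune for your robot.

    import time

    class PID(object):
        """Minimal PID controller sketch; kp, ki, and kd must be tuned."""
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.last_error = None
            self.last_time = None

        def update(self, error):
            now = time.time()
            if self.last_time is None:
                # First call: no integral or derivative history yet.
                self.last_error, self.last_time = error, now
                return self.kp * error
            dt = max(now - self.last_time, 1e-6)
            self.integral += error * dt
            derivative = (error - self.last_error) / dt
            self.last_error, self.last_time = error, now
            return self.kp * error + self.ki * self.integral + self.kd * derivative

Inside your laser callback you could then compute the steering command as, say, cmd.angular.z = pid.update(front_left - back_left).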

Person Following:

Pretend your Neato is your robot pet and get it to follow you around! The intended behavior is that you can walk in front of the Neato and it will follow your movements while maintaining a specified following distance.

Hints:

    • One way to think about this problem is that the Neato is attempting to keep the closest large object at a specified distance, directly in front of the robot.
    • As in wall following, you may find proportional control to be a useful strategy.
    • There are many ways to figure out where the person is. A simple approach is to calculate the center of mass of the laser measurements that fall within a prescribed box relative to the robot. This diagram should help clear things up, and a minimal sketch of the center-of-mass approach appears below:
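
In this sketch, the box dimensions, topic names, gains, target following distance, and the assumptions that the scan is indexed in degrees (0 straight ahead) and that 0.0 marks an invalid reading are all things to verify and tune; treat it as a starting point rather than a finished behavior.

    #!/usr/bin/env python
    """Minimal center-of-mass person-follower sketch."""
    import math
    import rospy
    from sensor_msgs.msg import LaserScan
    from geometry_msgs.msg import Twist

    class PersonFollower(object):
        def __init__(self):
            rospy.init_node('person_follower')
            self.pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
            rospy.Subscriber('/scan', LaserScan, self.on_scan)
            self.target_distance = 0.8  # desired following distance in meters
            self.k_linear = 0.5
            self.k_angular = 1.0

        def on_scan(self, scan):
            xs, ys = [], []
            for deg, r in enumerate(scan.ranges):
                if r == 0.0:                 # skip invalid readings
                    continue
                theta = math.radians(deg)
                x = r * math.cos(theta)      # forward in the robot frame
                y = r * math.sin(theta)      # left in the robot frame
                # Keep only points inside a box roughly in front of the robot.
                if 0.2 < x < 1.5 and abs(y) < 0.5:
                    xs.append(x)
                    ys.append(y)
            cmd = Twist()
            if xs:
                cx = sum(xs) / len(xs)       # center of mass of the box points
                cy = sum(ys) / len(ys)
                dist = math.sqrt(cx ** 2 + cy ** 2)
                # Drive toward the center of mass, holding the target distance.
                cmd.linear.x = self.k_linear * (dist - self.target_distance)
                cmd.angular.z = self.k_angular * math.atan2(cy, cx)
            self.pub.publish(cmd)

        def run(self):
            rospy.spin()

    if __name__ == '__main__':
        PersonFollower().run()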

Going Beyond:

    • The center of mass approach fails in a number of cases, for example when a large non-person object is within the person tracking region. Can you modify your code to handle this case? One strategy is to follow only objects that are moving within the person tracking region.

Obstacle Avoidance:

For this part you should program the Neato to move forward while reactively avoiding obstacles that block its path. One way to solve the problem is to think of a force constantly pulling the robot forward while nearby obstacles (as detected by the laser range finder) exert repellent forces on the robot. The magnitude of the repellent force should increase as the robot gets closer to the obstacle.

By summing the forces you can obtain a direction of motion for the robot (note that the sum of forces is not shown in the diagram above). You can then use a proportional controller to steer towards this desired angle while still maintaining forward velocity.
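
Here is a minimal sketch of that sum-of-forces idea; the repulsive-force scaling, the cutoff distance, the gains, the topic names, and the degree-indexed scan (with 0.0 marking an invalid reading) are all assumptions to verify and tune.

    #!/usr/bin/env python
    """Minimal potential-fields obstacle-avoidance sketch."""
    import math
    import rospy
    from sensor_msgs.msg import LaserScan
    from geometry_msgs.msg import Twist

    class ObstacleAvoider(object):
        def __init__(self):
            rospy.init_node('obstacle_avoider')
            self.pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
            rospy.Subscriber('/scan', LaserScan, self.on_scan)
            self.k_angular = 0.8
            self.forward_speed = 0.1

        def on_scan(self, scan):
            # Constant attractive force pulling the robot straight ahead.
            force_x, force_y = 1.0, 0.0
            for deg, r in enumerate(scan.ranges):
                if r == 0.0 or r > 1.5:      # ignore invalid or distant readings
                    continue
                theta = math.radians(deg)
                # Repulsive force points away from the obstacle and grows as
                # the obstacle gets closer (the scaling is something to tune).
                magnitude = 1.0 / (r * r)
                force_x -= magnitude * math.cos(theta)
                force_y -= magnitude * math.sin(theta)
            # Steer towards the direction of the net force with a proportional
            # controller while keeping some forward velocity.
            desired_angle = math.atan2(force_y, force_x)
            cmd = Twist()
            cmd.linear.x = self.forward_speed
            cmd.angular.z = self.k_angular * desired_angle
            self.pub.publish(cmd)

        def run(self):
            rospy.spin()

    if __name__ == '__main__':
        ObstacleAvoider().run()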

Going beyond:

    • Instead of always trying to move forward, allow a goal to be specified in the robot's odometry coordinate frame (called odom). In order to best handle this, you will either want to listen to the /odom topic directly or else make use of coordinate transformations.
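
For reference, a minimal sketch of reading the robot's pose from /odom might look like this (the topic name and message type are the usual ROS defaults, but verify them on your robot):

    import rospy
    from nav_msgs.msg import Odometry
    from tf.transformations import euler_from_quaternion

    def on_odom(msg):
        # Position and heading of the robot in the odom frame.
        pos = msg.pose.pose.position
        q = msg.pose.pose.orientation
        _, _, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
        rospy.loginfo('x=%.2f y=%.2f yaw=%.2f', pos.x, pos.y, yaw)

    if __name__ == '__main__':
        rospy.init_node('odom_listener')
        rospy.Subscriber('/odom', Odometry, on_odom)
        rospy.spin()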

Combining Multiple Behaviors Using Finite-State Control:

Combine two or more of your behaviors together using a finite-state controller. You may find that drawing a state transition diagram is helpful. Each state should be a different behavior and each transition should be some condition that you can reliably detect in the environment. For instance, I might combine wall following with person tracking in the following way:
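
In code, a hand-rolled version of that kind of two-state controller might be structured like the sketch below; the step and detection methods are hypothetical placeholders for the behaviors and transition condition you actually implement.

    #!/usr/bin/env python
    """Minimal hand-rolled finite-state controller sketch."""
    import rospy

    class FiniteStateController(object):
        def __init__(self):
            rospy.init_node('finite_state_controller')
            self.state = 'wall_follow'

        def wall_following_step(self):
            # Placeholder: publish one wall-following velocity command here.
            pass

        def person_following_step(self):
            # Placeholder: publish one person-following velocity command here.
            pass

        def person_detected(self):
            # Placeholder: return True when the latest scan shows points in
            # the person-tracking box.
            return False

        def run(self):
            rate = rospy.Rate(10)
            while not rospy.is_shutdown():
                if self.state == 'wall_follow':
                    self.wall_following_step()
                    if self.person_detected():
                        self.state = 'person_follow'
                elif self.state == 'person_follow':
                    self.person_following_step()
                    if not self.person_detected():
                        self.state = 'wall_follow'
                rate.sleep()

    if __name__ == '__main__':
        FiniteStateController().run()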

Going beyond:

    • Use the ROS smach package to code up your finite state controller (disclaimer: I haven't played around with this myself yet)
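
For reference, the basic smach pattern (following the ROS smach tutorials) looks roughly like the sketch below; the state classes, outcome names, and transition condition are hypothetical placeholders for your own behaviors.

    import rospy
    import smach

    class WallFollow(smach.State):
        """Hypothetical wall-following state; runs until a person is detected."""
        def __init__(self):
            smach.State.__init__(self, outcomes=['person_seen'])

        def execute(self, userdata):
            # ... run your wall-following loop until a person is detected ...
            return 'person_seen'

    class PersonFollow(smach.State):
        """Hypothetical person-following state; runs until the person is lost."""
        def __init__(self):
            smach.State.__init__(self, outcomes=['person_lost'])

        def execute(self, userdata):
            # ... run your person-following loop until the person is lost ...
            return 'person_lost'

    if __name__ == '__main__':
        rospy.init_node('smach_controller')
        sm = smach.StateMachine(outcomes=['done'])
        with sm:
            smach.StateMachine.add('WALL_FOLLOW', WallFollow(),
                                   transitions={'person_seen': 'PERSON_FOLLOW'})
            smach.StateMachine.add('PERSON_FOLLOW', PersonFollow(),
                                   transitions={'person_lost': 'WALL_FOLLOW'})
        sm.execute()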

Turning in your Work:

Code: your code should be pushed to your Github repository. All code should be placed within a ROS package called warmup_project.

Writeup: in your ROS package create a file to hold your project writeup. Any format is fine (Markdown, Word, PDF, etc.). Your writeup should answer the following questions:

  • Which behaviors did you implement?
  • For each behavior, what strategy did you use to implement the behavior?
  • For the finite state controller, which behaviors did you combine and how did you detect when to transition between behaviors?
  • How did you structure your code?
  • What, if any, challenges did you face along the way?
  • What would you do to improve your project if you had more time?
  • Did you learn any interesting lessons for future robotic programming projects?