The TurtleBot successfully carried an object from its initial position to the Sawyer robot and aligned itself in the proper orientation for grasping. The Sawyer then grasped the object and placed it in the destination box, completing the delivery and sorting task.
We used Simultaneous Localization and Mapping (SLAM) during the delivery trajectory to ensure that the TurtleBot could navigate to a pose that placed the object within the Sawyer's range of motion while avoiding obstacles along the way.
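For reference, a navigation goal like this can be issued through the standard ROS navigation stack (move_base), which plans over the SLAM-generated map. The sketch below is minimal and illustrative: the node name, map frame, and handoff coordinates are assumptions, not values from our project code.

```python
#!/usr/bin/env python
# Minimal sketch: send a handoff pose to move_base, which plans a
# collision-free path over the SLAM map. Coordinates are illustrative.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def navigate_to(x, y, yaw_z, yaw_w):
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'  # pose in the SLAM map frame
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    # Orientation as a quaternion; only yaw (rotation about z) matters here.
    goal.target_pose.pose.orientation.z = yaw_z
    goal.target_pose.pose.orientation.w = yaw_w

    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

if __name__ == '__main__':
    rospy.init_node('deliver_to_sawyer')
    # Drive to a hand-picked handoff pose within the Sawyer's reach.
    navigate_to(1.5, 0.3, 0.0, 1.0)
```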
We located and tracked the object using a USB camera and AR tags, and calibrated static transforms between the robot arm and its target objects to allow for a more accurate pick and place. The arm also successfully avoided all known obstacles and boundaries encoded into its planning environment.
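The lookup-and-correct step can be sketched as follows using the ROS tf library. The frame names ('base', 'ar_marker_5') and offset values are assumptions for illustration, not our calibrated project values.

```python
#!/usr/bin/env python
# Minimal sketch: look up an AR tag's pose in the arm's base frame via tf
# and apply a calibrated static offset before grasping.
import rospy
import tf

# Per-tag (x, y, z) corrections found by manual calibration (assumed values).
STATIC_OFFSETS = {'ar_marker_5': (0.01, -0.02, 0.09)}

def get_corrected_tag_position(listener, tag_frame):
    listener.waitForTransform('base', tag_frame, rospy.Time(0),
                              rospy.Duration(4.0))
    trans, rot = listener.lookupTransform('base', tag_frame, rospy.Time(0))
    dx, dy, dz = STATIC_OFFSETS.get(tag_frame, (0.0, 0.0, 0.0))
    # Correct the raw camera-derived position with the calibrated offset.
    return (trans[0] + dx, trans[1] + dy, trans[2] + dz), rot

if __name__ == '__main__':
    rospy.init_node('tag_lookup')
    listener = tf.TransformListener()
    position, orientation = get_corrected_tag_position(listener, 'ar_marker_5')
    rospy.loginfo('Corrected grasp target: %s', str(position))
```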
We met the project goals we defined but were unable to complete our planned extensions due to time constraints on the robot. We intended to have the arm sort multiple blocks and to generalize our solution to more difficult problems, but limited Sawyer availability prevented us from fully developing and testing these extensions.
Because we were using an off-robot USB camera, the positions and orientations of the AR tags relative to the robot base were offset by varying, noisy amounts. Thus, we had to calibrate and apply a static transform to each AR tag location. One potential reason for this is that the robot limb did not account for the gripper's length in its position calculations. The x and y offsets were usually very small and required only slight tuning.
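One way to encode a systematic correction like the gripper length once, rather than re-applying it on every lookup, is to publish it as a static transform. The sketch below assumes a 'right_hand' mounting frame, a hypothetical 'gripper_tip' frame, and a 9 cm gripper length; none of these values are taken from our project.

```python
#!/usr/bin/env python
# Minimal sketch: publish a static transform from the gripper mounting
# frame to the fingertip, compensating for gripper length that the limb's
# kinematics did not account for. Frame names and length are assumptions.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

if __name__ == '__main__':
    rospy.init_node('gripper_tip_static_tf')
    broadcaster = tf2_ros.StaticTransformBroadcaster()

    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = 'right_hand'   # assumed gripper mounting frame
    t.child_frame_id = 'gripper_tip'   # hypothetical fingertip frame
    t.transform.translation.z = 0.09   # assumed gripper length (m)
    t.transform.rotation.w = 1.0       # identity rotation

    broadcaster.sendTransform(t)
    rospy.spin()
```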
Terminal and RViz view of the Sawyer model and USB camera feed (TF axes overlaid on the AR tags), demonstrating wall avoidance and AR tag localization.
TurtleBot SLAM-generated map (with cost map built from the mounted Kinect's point cloud).
Example pick-and-place configuration with the Sawyer and TurtleBot.