Logitech C922 and Tripod
Webcam capturing at 1080p and 30 fps, used to determine block colors.
Colored Blocks
The colored blocks (ROYGBV) represent the numbers by which Sawyer sorts.
Sawyer Arm and Gripper
We used a 7-DOF Sawyer arm and gripper from Rethink Robotics to sort the colored blocks.
AR Tags
We printed AR tags for position tracking of sorted, temporarily stored, and unsorted blocks.
The system takes as input the initial block arrangement for color classification and the initial AR tag detections for localization. Computer vision (CV) processing performs block color classification and AR tag localization. A sorting algorithm (Selection Sort, Insertion Sort, or Merge Sort) then determines the order in which the blocks must be rearranged based on their colors. Map (Idx in color order: AR (x, y)): we map each block's index in the color ordering to its corresponding AR tag coordinates by combining the color classification and AR tag localization results. The Linear Path Planner receives high-level movement commands based on this mapping and plans a path from the starting index to the ending index at specified velocities. A PID controller that we wrote follows each planned path, and the system loops through the path-planning and PID-control steps until the blocks are correctly sorted by color. Each completed cycle ends with the blocks sorted in ROYGBV order!
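As a concrete illustration of the mapping step, here is a minimal Python sketch, assuming the CV stage reports the detected color and the AR tag (x, y) at each slot; the function name, color strings, and example coordinates are hypothetical:

# Desired final ordering of the blocks.
COLOR_ORDER = ["red", "orange", "yellow", "green", "blue", "violet"]

def build_color_to_tag_map(detected_colors, tag_positions):
    """detected_colors[i] is the color seen at slot i (top to bottom);
    tag_positions[i] is the (x, y) of the AR tag at slot i.
    Returns {index in ROYGBV order: (x, y) of the tag where that color sits}."""
    return {COLOR_ORDER.index(color): tag_positions[slot]
            for slot, color in enumerate(detected_colors)}

# Example: blocks currently laid out green, red, blue at slots 0-2.
mapping = build_color_to_tag_map(
    ["green", "red", "blue"],
    [(0.50, 0.10), (0.50, 0.00), (0.50, -0.10)],
)
# mapping[0] is the red block's tag position, mapping[3] green's, mapping[4] blue's.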
Selection Sort: sorts an array by repeatedly selecting the smallest element from the unsorted portion and swapping it with the first unsorted element. This process continues until the entire array is sorted.
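For reference, a minimal Python sketch of selection sort over a plain list (in our pipeline the values would be the blocks' indices in the ROYGBV order, and each swap would become a robot move):

def selection_sort(arr):
    arr = list(arr)
    for i in range(len(arr)):
        # Find the index of the smallest element in the unsorted portion arr[i:].
        min_idx = min(range(i, len(arr)), key=lambda j: arr[j])
        # Swap it into the first unsorted position.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr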
Insertion Sort: iteratively inserts each element of an unsorted list into its correct position within the sorted portion of the list.
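A corresponding sketch of insertion sort:

def insertion_sort(arr):
    arr = list(arr)
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift sorted elements greater than key one position right,
        # then drop key into the gap.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr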
Merge Sort: uses a divide & conquer approach, recursively dividing the array into halves, sorting each half, and merging the sorted halves back together to obtain the sorted array.
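And a sketch of merge sort:

def merge_sort(arr):
    if len(arr) <= 1:
        return list(arr)
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    # Merge the two sorted halves by repeatedly taking the smaller head element.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]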
1. Detect AR tags using Sawyer hand camera
2. Logitech camera captures color image (Sawyer camera is grayscale)
3. For each color, mask out all pixels outside of the tolerance range
4. Use contour detection from OpenCV and only accept regions whose area is greater than a threshold
if zero or multiple regions remain, decrease/increase the tolerance from step 3 until only one block is found (see the sketch after this list)
5. After one of each color is detected, sort by y-coordinate to get the initial color ordering → associate it with the detected AR tags
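A minimal OpenCV (4.x) sketch of steps 3-4 for a single color band; the HSV bounds, area threshold, and function name are illustrative assumptions rather than our calibrated values:

import cv2
import numpy as np

RED_LO = np.array([0, 120, 70])     # assumed lower HSV bound for "red"
RED_HI = np.array([10, 255, 255])   # assumed upper HSV bound for "red"
MIN_AREA = 500                      # assumed pixel-area threshold for a valid block

def find_block(bgr_image, lo, hi, min_area=MIN_AREA):
    """Return the centroid of the single block inside the color band, or None."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lo, hi)                            # step 3: tolerance mask
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blocks = [c for c in contours if cv2.contourArea(c) > min_area]   # step 4
    if len(blocks) != 1:
        return None   # caller widens/narrows the tolerance band and retries
    m = cv2.moments(blocks[0])
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])          # (x, y) centroid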
Masks and corresponding thresholded block classifications generated using our CV processing:
1. Given the positions of AR tags and order of blocks from CV.
2. Use the specified sorting algorithm to determine the locations (x, y, z) the robot must move to. Returns a high-level list of movements: Entry(start_idx, end_idx, level).
Each entry is broken into the following six movements:
(i) Move above start_idx (ii) Move down to the block + close gripper (iii) Move back up (iv) Move above end_idx (v) Move down + open gripper (vi) Move back up.
3. Map each idx to an (x, y) position relative to the base of Sawyer: Map(entry_idxs: (ar_pos_x, ar_pos_y)).
Loop:
4. Use Sawyer inverse kinematics solver to get joint-space coordinates.
5. Generate waypoints (in joint-space) for a linear trajectory.
Originally, we used MoveIt (path planner + controller), which led to wild and unpredictable trajectories.
6. We use a tuned PID controller to navigate to each of these waypoints in joint space.
This allowed us to achieve smooth movements.
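A simplified Python sketch of steps 5-6, assuming straight-line interpolation between joint configurations and an independent PID law per joint; the gains, waypoint count, and class/function names are placeholders rather than the Sawyer/intera API:

import numpy as np

def linear_waypoints(q_start, q_goal, n=20):
    """Step 5: evenly spaced joint-space waypoints along a straight line."""
    q_start, q_goal = np.asarray(q_start, float), np.asarray(q_goal, float)
    return [q_start + (q_goal - q_start) * t for t in np.linspace(0.0, 1.0, n)]

class JointPID:
    """Step 6: per-joint PID on the waypoint tracking error (gains are assumptions)."""
    def __init__(self, kp=2.0, ki=0.01, kd=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = None
        self.prev_error = None

    def step(self, q_desired, q_actual, dt):
        error = np.asarray(q_desired) - np.asarray(q_actual)
        if self.integral is None:
            self.integral = np.zeros_like(error)
            self.prev_error = error
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        # Joint-velocity command: P tracks the waypoint, I removes steady-state
        # error, D damps oscillations.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

The control loop steps through the waypoints at a fixed rate, calling step() and sending the returned joint velocities to the arm until the tracking error at the final waypoint falls below a tolerance.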
Recalibration and Color Thresholding: Lighting variations, shadows, camera resolution, and glare originally distorted our color thresholding parameters, and the CV system required recalibration for each robot whenever conditions changed.
Jerky Movements: We implemented linear trajectories, added waypoints to smooth out the motion, decreased overall speed by adjusting the total trajectory time, and implemented PID control, tuning K_p to decrease responsiveness, K_d to dampen oscillations, and K_i to decrease steady-state error.
AR Tag Localization: We implemented a sweeping function, but then simplified the design by bringing the Sawyer camera closer to the AR tags and using a “temp” AR tag. To the right is an example of Sawyer sorting a small number of blocks (before we increased the block count).