Our project accomplished a number of key benchmark tasks, and we determined a model of the vacuum gripper and rigid object system as a torsional spring.
We manipulated the Sawyer arm with the vacuum gripper attachment into the orientation required for the grasped object to reach a goal orientation. For sensing, we used a camera and AR tags placed on a stationary background point and on the object itself. To achieve the goal orientation, we used MoveIt to compute a trajectory plan and move the Sawyer's end effector to a starting position, measured the error from the goal using the AR tags and the sensing camera, and then iteratively updated the position of the end effector with a proportional controller until the object reached the desired orientation.
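The control loop below is a minimal sketch of this procedure. The helpers measure_object_angle and command_gripper_angle are hypothetical stand-ins for our AR-tag sensing and MoveIt motion interfaces, and the gain, tolerance, and iteration limit are illustrative rather than the values we actually tuned.

```python
K_P = 0.5          # proportional gain (illustrative, not our tuned value)
TOLERANCE = 0.02   # acceptable orientation error, radians (illustrative)
MAX_ITERS = 20     # safety cap on control iterations

def straighten_object(goal_object_angle, start_gripper_angle,
                      measure_object_angle, command_gripper_angle):
    """Iteratively adjust the gripper angle until the object reaches the goal angle.

    measure_object_angle(): returns the current object angle from the AR-tag camera.
    command_gripper_angle(angle): plans and executes a MoveIt trajectory to the given
    end-effector orientation. Both callables are placeholders for the real interfaces.
    """
    gripper_angle = start_gripper_angle
    for _ in range(MAX_ITERS):
        error = goal_object_angle - measure_object_angle()
        if abs(error) < TOLERANCE:
            break
        # Proportional update: move the gripper by a fraction of the remaining error.
        gripper_angle += K_P * error
        command_gripper_angle(gripper_angle)
    return gripper_angle
```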
These images depict the relationship, as computed by our model, between the pose of the vacuum gripper and the pose of the grasped rigid object. The gripper pose is given as an angle, θ_gripper, measured from the horizontal plane (the horizontal reference pose) as depicted in Figure 1, and the object pose is given as an angle, θ_object, measured from the horizontal plane (the horizontal reference pose) as depicted in Figure 2.
Demo #1a: Video of Working Closed Loop Controller Straightening the Object by Controlling the Sawyer Arm
The goal of this task was to use the model to determine the pose of the vacuum gripper -- the angle depicted as θ_gripper in Figure 3 -- such that the pose of the grasped object -- the angle depicted as θ_object in Figure 3 -- is 90 degrees or horizontal.
Demo #1b: Working closed-loop feedback controller (RViz view)
Demo #2: Failure case for the feedback controller. After the controller finished, we attempted to move the object onto a surface, but the attempt failed due to a poor path plan.
We modeled the vacuum gripper as a torsional spring: when the gripper picks up an object, the suction cup bends under the object's weight, and we treat that bend as the deflection of a torsional spring. A torsional spring with spring constant k relates the applied torque to the resulting angular deflection, so k can be used to determine the bend angle between the object and the gripper. We collected angle data to learn this spring constant for our vacuum gripper.
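As a concrete illustration of the model, the sketch below assumes the gravitational torque about the suction cup can be written as m·g·r for a lever arm r from the cup to the object's center of mass; that simplified torque expression is an assumption for illustration, not necessarily the exact expression we used.

```python
G = 9.81  # gravitational acceleration, m/s^2

def bend_angle(mass_kg, lever_arm_m, k):
    """Equilibrium bend angle between the gripper and the object.

    Under the torsional-spring model, the spring torque balances the
    gravitational torque on the object at equilibrium:
        k * delta_theta = m * g * r
    The lever arm r and the simple m*g*r torque expression are illustrative
    assumptions, not measurements from our experiments.
    """
    torque = mass_kg * G * lever_arm_m  # gravitational torque about the suction cup
    return torque / k                   # angular deflection, in radians
```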
We used a standard-size object and calculated the wall-object angle and the object-robot angle at 15-degree increments of the gripper angle. The wall-object angle is the angle between our stationary wall and the object; the object-robot angle is the angle between the object and the vacuum gripper.
We began by converting all angles from degrees to radians and then computed the spring constant k using least squares.
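The sketch below shows one way to perform this fit, assuming we regress the measured gravitational torques against the measured bend angles; the exact quantities we regressed may differ, but the degrees-to-radians conversion followed by a one-parameter least-squares solve is the same idea.

```python
import numpy as np

def fit_spring_constant(bend_angles_deg, torques_nm):
    """Least-squares estimate of the torsional spring constant k.

    bend_angles_deg: measured bend angles between gripper and object, in degrees.
    torques_nm: corresponding gravitational torques on the object, in N*m.
    Converts degrees to radians, then solves tau ~= k * delta_theta for k.
    """
    dtheta = np.radians(np.asarray(bend_angles_deg, dtype=float))
    tau = np.asarray(torques_nm, dtype=float)
    # One-column design matrix: lstsq returns the scalar k that minimizes
    # the squared residual ||tau - k * dtheta||^2.
    k, *_ = np.linalg.lstsq(dtheta.reshape(-1, 1), tau, rcond=None)
    return float(k[0])
```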
We plotted our results to check that they were sensible and accurate. The first figure shows the computed k value at different gripper angles. The second figure shows the correlation between the gripper angle and the resulting angle of the object.
Using the torsional spring model of the vacuum gripper, we calculated the vacuum gripper pose required to achieve a goal orientation for the object being manipulated.
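A sketch of that inverse calculation is shown below, using the same illustrative m·g·r torque assumption as above; the sign convention for the sag offset is also an assumption.

```python
G = 9.81  # gravitational acceleration, m/s^2

def required_gripper_angle(goal_object_angle_rad, mass_kg, lever_arm_m, k):
    """Gripper angle to command so the grasped object settles at the goal orientation.

    Under the torsional-spring model, the object sags by delta_theta = m*g*r / k
    relative to the gripper, so the gripper is commanded to the goal angle plus
    that offset.
    """
    delta_theta = mass_kg * G * lever_arm_m / k
    return goal_object_angle_rad + delta_theta
```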
Demo #3: Video showing the gripper picking up the small object. The angle between the object and the gripper is not significant because the object's mass is small (157 g).
Demo #4: Video showing the vacuum gripper picking up the medium-sized object (345 g).
Demo #5: Video showing the vacuum gripper picking up the large object (457 g).
Demo #6: If the object is too large, it will fall off the gripper.
We determined the mass of objects with minimal error by leveraging our torsional-spring model of the vacuum gripper and by measuring, with camera sensing, the angular displacement induced in the suction cup of the vacuum gripper.
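The mass estimate can be sketched by inverting the same torsional-spring relation, again under the illustrative m·g·r torque assumption used above.

```python
G = 9.81  # gravitational acceleration, m/s^2

def estimate_mass(measured_bend_angle_rad, lever_arm_m, k):
    """Estimate an unknown object's mass from the measured suction-cup deflection.

    Inverts k * delta_theta = m * g * r to give m = k * delta_theta / (g * r).
    The lever arm r and the torque expression are illustrative assumptions.
    """
    return k * measured_bend_angle_rad / (G * lever_arm_m)
```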
Results of the mass-weighing experiment: the actual mass was 200 g, but the computed mass was 168 g (an error of 16%).
Demo #7: Video showing the experiment for measuring an object of unknown mass.
This is the mass used in this experiment.