Custom 3D-Printed Parts (and Number of Pieces):
base.STL (1)
bearing_bottom.STL (1)
bearing_holder.STL (2)
bearing_top.STL (1)
shoulder1.STL (1)
shoulder2.STL (1)
shoulder3.STL (1)
sleeve1.STL (6)
sleeve2.STL (2)
elbow1.STL (1)
elbow2.STL (1)
wrist1.STL (1)
wrist2.STL (1)
finger1.STL (1)
finger2.STL (1)
camera_stand.STL (1)
Standard Hardware (and Number of Pieces):
SG90 Servo Motor (2)
DS3218MG Servo Motor (3)
MG996R Servo Motor (1)
6205ZZ Ball Bearing (1)
604ZZ Ball Bearing (2)
Phone Stand Members (4)
Long members from this product
Phone Stand Springs (4)
Springs from this product
DS3218 Servo Horn (1)
Metal servo horn from this product
MG996R Servo Horn #1 (1)
Straight servo horn from this product
MG996R Servo Horn #2 [to be used on DS3218 Motor] (1)
Cross-shaped servo horn from this product
SG90 Servo Horn (2)
Straight servo horn from this product
10 mm M3 bolts (7)
30 mm M3 screws (3)
35 mm M3 screws (6)
40 mm M3 bolts (4)
M3 nuts (11)
M3 washers (2)
10 mm M4 bolts (4)
20 mm M4 bolts (18)
30 mm M4 bolts (6)
M4 nuts (28)
50 mm M6 bolts (4)
70 mm M6 bolts (2)
M6 nuts (6)
SG90 servo screws (4)
Larger screw from this product
M2 nylon screws (4)
Raspberry Pi camera (1)
Base
Fit a DS3218MG servo motor into the base custom 3D-printed piece. Secure it with four 10 mm M4 bolts and four M4 nuts. Clamp the base to a stable surface using any clamp.
Attach MG996R Servo Horn #2 to the DS3218 motor. Line up bearing_bottom to the servo horn.
Place a 6205ZZ ball bearing between the two bearing_holder parts. Secure them to the base using six 30 mm M4 bolts and six M4 nuts.
Line up bearing_top with bearing_bottom above the bearing_holder parts. Secure shoulder2 through bearing_top and bearing_bottom with four 40 mm M3 bolts.
Shoulder
Secure shoulder1 to shoulder2 using three 20mm M4 bolts and three M4 nuts. Secure shoulder3 to shoulder2, again using three 20mm M4 bolts and three M4 nuts.
Fit a DS3218MG servo motor into shoulder1 and secure with four 20mm M4 bolts and four M4 nuts.
Secure the DS3218 servo horn to the motor and fit a sleeve2 into place on it. Fit a phone stand member into the sleeve. Place a 604ZZ ball bearing on the axis of the servo motor in shoulder3, using washers as needed. Screw a 30 mm M3 screw through the bearing and servo horn, into the servo motor.
Fit a second phone stand member into a sleeve1. Using washers as needed, secure the member and sleeve to the axis of shoulder1 and shoulder3 using a 50mm M6 bolt and an M6 nut.
Elbow
Place a sleeve1 on the end of each phone stand member. Attach each to the axes of the elbow1 and elbow2 parts using two 50 mm M6 bolts and two M6 nuts.
Secure another DS3218MG servo motor and horn to elbow1 in the same manner as with the shoulder: place a member into a sleeve2, fit the sleeve2 onto the servo horn, and secure them with a 30 mm M3 screw and an M3 nut.
Fit a second member into a sleeve1 and secure them to the axis of elbow1 and elbow2 using a 50 mm M6 bolt and an M6 nut.
Wrist and Fingers
In the same way as with the elbow joint, place sleeve1 parts on the end of each phone stand member and attach these to the axes of the wrist1 and wrist2 parts using the remaining two 70 mm M6 bolts and two M6 nuts.
Secure an MG996R servo motor to wrist1 using four 20 mm M4 bolts and four M4 nuts. Attach MG996R Servo Horn #1 to the motor, and secure finger1 to the horn. Secure this axis with a 604ZZ ball bearing and a 30 mm M3 screw.
Place an SG90 servo motor in the gap of finger1 and secure it with two SG90 servo screws. Place an SG90 servo horn on the motor.
Fit finger2 onto the servo horn. Secure with an SG90 servo screw through finger2, and then place another SG90 servo motor in finger2. Place another SG90 servo horn on the motor.
Fit the camera_stand onto the servo horn and secure it with an SG90 screw. Fit the Raspberry Pi camera onto the camera stand and secure it with four M2 nylon screws.
The ROBOCAM experience begins with an operator running the robocam.launch file, which opens a pre-configured RViz interface displaying the RobotModel and the current frame of the nonstationary subject being tracked. Enabled within this RViz environment is the custom Keyframe plugin, with which the operator sets Keyframes: goal poses paired with the timestamps at which those poses should be reached. To do so, the operator simply adjusts their current view to a desired pose and presses the "K" key to place a Keyframe at that position and orientation. Once the Keyframe has been created, they may adjust its designated timestamp or its name, or delete it altogether. After repeating this process until they are satisfied with the resulting sequence, they check the Publish? box in the plugin's viewing pane to send that sequence to the robot, which then actuates with respect to the tracked subject while maintaining fidelity to the operator-defined Keyframes.
We used ROS Kinetic and RViz as our baseline tools for the ROBOCAM project. Provided below are code previews and detailed descriptions of various key elements of our custom packages and overall software structure.
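For orientation, the project workspace is laid out roughly as follows. This tree is reconstructed from the paths referenced in the code below; exact file names and locations may differ:

catkin_ws/src/
├── aruco_pkg/            # subject tracking
│   ├── src/objTracker.py
│   ├── src/cameraCalibration.py
│   └── data/calib_mtx_dist.pkl
├── ik_solver/            # custom IK service
│   ├── src/CustomIKSolver.py
│   └── srv/SolveIKSrv.srv
└── rviz_animator/        # RViz plugin, choreographer, models
    ├── src/keyframe_tool.cpp (and keyframe_tool.h)
    ├── src/choreographer.py
    ├── msg/KeyframeMsg.msg, KeyframesMsg.msg
    ├── models/robocam.xml
    ├── models/camera.dae
    ├── rviz/robocam.rviz
    └── launch/robocam.launch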
#!/usr/bin/env python
import tf
import rospy
import pickle
import tf2_ros
import cv2
import numpy as np
from cv2 import aruco
from cv_bridge import CvBridge
from geometry_msgs.msg import (
    TransformStamped
)
from sensor_msgs.msg import (
    Image
)

def track_aruco(message):
    # Load the pickled intrinsics produced by cameraCalibration.py
    with open('./src/aruco_pkg/data/calib_mtx_dist.pkl', 'rb') as handle:
        mtx, dist = pickle.load(handle)
    bridge = CvBridge()
    img = bridge.imgmsg_to_cv2(message, desired_encoding='passthrough')
    h, w = img.shape[:2]
    newcameramtx, roi = cv2.getOptimalNewCameraMatrix(mtx, dist, (w, h), 1, (w, h))
    # Undistort the incoming frame and crop to the valid region of interest
    dst = cv2.undistort(img, mtx, dist, None, newcameramtx)
    x, y, w, h = roi
    frame = dst[y:y+h, x:x+w]
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    aruco_dict = aruco.Dictionary_get(aruco.DICT_6X6_250)
    parameters = aruco.DetectorParameters_create()
    corners, ids, rejectedImgPoints = aruco.detectMarkers(gray, aruco_dict, parameters=parameters)
    frame_markers = aruco.drawDetectedMarkers(frame.copy(), corners, borderColor=(0, 0, 255))
    if ids is not None and len(ids) > 0:
        # Estimate the pose of each detected ArUco marker
        rotation_vectors, translation_vectors, _objPoints = aruco.estimatePoseSingleMarkers(corners, 1, mtx, dist)
        frame_markers = aruco.drawAxis(frame_markers, mtx, dist, rotation_vectors[0], translation_vectors[0], 1)
        # We need a homogeneous matrix, but OpenCV only gives us a 3x3 rotation matrix
        rotation_matrix = np.array([[0, 0, 0, 0],
                                    [0, 0, 0, 0],
                                    [0, 0, 0, 0],
                                    [0, 0, 0, 1]],
                                   dtype=float)
        rotation_matrix[:3, :3], _ = cv2.Rodrigues(rotation_vectors[0])
        # Convert the matrix to a quaternion
        quaternion = tf.transformations.quaternion_from_matrix(rotation_matrix)
        # Broadcast the camera -> tracked_object frame
        tf2Stamp = TransformStamped()
        tf2Stamp.header.stamp = rospy.Time.now()
        tf2Stamp.header.frame_id = 'link_camera'
        tf2Stamp.child_frame_id = 'tracked_object'
        tf2Stamp.transform.translation.x, tf2Stamp.transform.translation.y, tf2Stamp.transform.translation.z = translation_vectors[0][0]
        tf2Stamp.transform.rotation.x, tf2Stamp.transform.rotation.y, tf2Stamp.transform.rotation.z, tf2Stamp.transform.rotation.w = quaternion
        tf2Broadcast.sendTransform(tf2Stamp)

if __name__ == '__main__':
    tf2Broadcast = tf2_ros.TransformBroadcaster()
    rospy.init_node("obj_tracker")
    sub = rospy.Subscriber("subject_image_raw", Image, track_aruco)
    rospy.spin()
This script, run only after a one-time checkerboard calibration for OpenCV, continually (at 30 FPS) broadcasts a TF transform between the camera frame on the robot's end effector and a nonstationary target, which it tracks via an ArUco marker. It begins by converting an input ROS Image to an OpenCV image, then undistorts that image using the pickled calibration matrix generated by cameraCalibration.py. The ArUco marker on the object is then detected, and its pose is estimated as a rotation vector and a translation vector. From there, cv2.Rodrigues (Rodrigues' rotation formula) converts the rotation vector into a rotation matrix, which is converted to a quaternion, packed into a TF transform, and broadcast.
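cameraCalibration.py itself is not reproduced here. A minimal sketch of how the pickled (mtx, dist) pair can be produced with OpenCV's standard checkerboard flow looks like this; the board dimensions and image path are placeholders, and the actual script may differ:

#!/usr/bin/env python
# Illustrative checkerboard calibration sketch. Produces the
# calib_mtx_dist.pkl consumed by track_aruco above.
import glob
import pickle
import cv2
import numpy as np

ROWS, COLS = 6, 9  # interior corner counts of the checkerboard (assumed)
objp = np.zeros((ROWS * COLS, 3), np.float32)
objp[:, :2] = np.mgrid[0:COLS, 0:ROWS].T.reshape(-1, 2)

obj_points, img_points = [], []
for fname in glob.glob('./calibration_images/*.jpg'):  # placeholder path
    gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, (COLS, ROWS), None)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Solve for the intrinsic matrix and distortion coefficients
_, mtx, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
with open('./src/aruco_pkg/data/calib_mtx_dist.pkl', 'wb') as handle:
    pickle.dump((mtx, dist), handle)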
#!/usr/bin/env python
import tf
import math
import PyKDL
import rospy
import rospkg
import numpy as np
import kdl_parser_py.urdf as parser
from urdf_parser_py.urdf import URDF
from tf_conversions import posemath
from tf.transformations import euler_from_quaternion
from geometry_msgs.msg import (
    PoseStamped,
    PointStamped
)
from sensor_msgs.msg import (
    JointState
)
from ik_solver.srv import (
    SolveIKSrv,
    SolveIKSrvResponse
)

class Plotter:
    """Collects debug points and joint states and republishes them for RViz."""
    def __init__(self):
        self.publishers = []
        self.toPublish = []
        self.labels = []
        self.goalState = JointState()

    def addGoal(self, goalState):
        self.publishers.append(rospy.Publisher("joint_states", JointState, queue_size=10))
        self.toPublish.append(goalState)
        self.labels.append("joint_states")

    def addVector(self, vector, label):
        point = PointStamped()
        point.header.stamp = rospy.Time.now()
        point.header.frame_id = "world"
        point.point.x, point.point.y, point.point.z = vector
        self.publishers.append(rospy.Publisher("pub_{}".format(label), PointStamped, queue_size=10))
        self.toPublish.append(point)
        self.labels.append("pub_" + label)

    def publish(self):
        print("Publishing...")
        print("Labels: \n{}".format(self.labels))
        rate = rospy.Rate(10)
        while not rospy.is_shutdown():
            for tp, publisher in zip(self.toPublish, self.publishers):
                publisher.publish(tp)
            rate.sleep()

def callback(req):
    targetFrame = posemath.fromMsg(req.input_pose.pose)  # Target pose as KDL frame
    solved, jointState = solveIK(targetFrame)
    ret = SolveIKSrvResponse()
    ret.solved = solved
    ret.output_joints = jointState
    return ret

def solveIK(targetFrame):
    ok, tree = parser.treeFromFile(rospkg.RosPack().get_path('rviz_animator') + "/models/robocam.xml")
    chain = tree.getChain('base', 'link_camera')
    plotter = Plotter()

    # 1. Solve for J4, J5_initial, J6
    # First, convert quaternion orientation to XZY order Euler angles
    targetQuat = targetFrame.M.GetQuaternion()  # Get quaternion from KDL frame (x, y, z, w)
    pitch, yaw, roll = tf.transformations.euler_from_quaternion(targetQuat, axes='rxzy')
    pitch_deg, yaw_deg, roll_deg = math.degrees(pitch), math.degrees(yaw), math.degrees(roll)
    J4_origin_orientation = posemath.toMsg(chain.getSegment(2).getFrameToTip()).orientation
    J4_offset = euler_from_quaternion([J4_origin_orientation.x, J4_origin_orientation.y, J4_origin_orientation.z, J4_origin_orientation.w])[0]
    J4_raw, J5_initial, J6 = pitch, yaw, roll
    J4 = J4_raw - J4_offset
    # 1. Complete above
    print("J4: {} J5_init: {} J6: {}".format(J4, J5_initial, J6))

    chainAngles = PyKDL.JntArray(8)
    chainAngles[5], chainAngles[6], chainAngles[7] = J4, J5_initial, J6
    chainFK = PyKDL.ChainFkSolverPos_recursive(chain)
    purpleFrame = PyKDL.Frame()
    brownFrame = PyKDL.Frame()
    purpleSuccess = chainFK.JntToCart(chainAngles, purpleFrame)
    print("Target Orientation:\n{}".format(targetFrame.M))
    print("Result Orientation:\n{}".format(purpleFrame.M))
    brownSuccess = chainFK.JntToCart(chainAngles, brownFrame, segmentNr=7)

    # 2. Determine position of orange point
    # First, construct KDL chain of the 3 links involved in J4-J6
    cameraOffsetChain = tree.getChain('link_pitch', 'link_camera')
    cameraJointAngles = PyKDL.JntArray(2)
    cameraJointAngles[0], cameraJointAngles[1] = J5_initial, J6
    cameraOffsetChainFK = PyKDL.ChainFkSolverPos_recursive(cameraOffsetChain)
    cameraFrame = PyKDL.Frame()
    success = cameraOffsetChainFK.JntToCart(cameraJointAngles, cameraFrame)
    orangePoint = targetFrame.p - (purpleFrame.p - brownFrame.p)
    plotter.addVector(targetFrame.p, "pink")
    plotter.addVector(orangePoint, "orange")
    plotter.addVector(purpleFrame.p, "purple")
    plotter.addVector(brownFrame.p, "brown")
    # 2. Complete (above)

    # 3. Convert orange point to cylindrical coordinates
    orange_X, orange_Y, orange_Z = orangePoint
    orange_R = math.sqrt(orange_X ** 2 + orange_Y ** 2)
    orange_Theta = math.atan2(orange_Y, orange_X)  # Theta measured from global positive X axis
    # 3. Complete (above)
    print("Orange R: {} Theta: {}".format(orange_R, math.degrees(orange_Theta)))

    # 4. Solve for J2 and J3 in the idealized R-Z plane
    targetVectorOrig = PyKDL.Vector(0, orange_R, orange_Z)
    plotter.addVector(targetVectorOrig, "targetRZOrig")
    # First, remove the fixed offsets from the wrist, elbow, and shoulder pieces
    wristEndFrame = PyKDL.Frame()
    wristStartFrame = PyKDL.Frame()
    elbowEndFrame = PyKDL.Frame()
    elbowStartFrame = PyKDL.Frame()
    shoulderEndFrame = PyKDL.Frame()
    shoulderStartFrame = PyKDL.Frame()
    chainFK.JntToCart(chainAngles, wristEndFrame, segmentNr=7)
    chainFK.JntToCart(chainAngles, wristStartFrame, segmentNr=5)
    chainFK.JntToCart(chainAngles, elbowEndFrame, segmentNr=4)
    chainFK.JntToCart(chainAngles, elbowStartFrame, segmentNr=3)
    chainFK.JntToCart(chainAngles, shoulderEndFrame, segmentNr=2)
    chainFK.JntToCart(chainAngles, shoulderStartFrame, segmentNr=0)
    plotter.addVector(wristEndFrame.p, "wristEndFrame")
    plotter.addVector(wristStartFrame.p, "wristStartFrame")
    plotter.addVector(elbowEndFrame.p, "elbowEndFrame")
    plotter.addVector(elbowStartFrame.p, "elbowStartFrame")
    plotter.addVector(shoulderEndFrame.p, "shoulderEndFrame")
    plotter.addVector(shoulderStartFrame.p, "shoulderStartFrame")
    wristOffset = wristEndFrame.p - wristStartFrame.p
    elbowOffset = elbowEndFrame.p - elbowStartFrame.p
    shoulderOffset = shoulderEndFrame.p - shoulderStartFrame.p
    targetVector = targetVectorOrig - wristOffset - elbowOffset - shoulderOffset
    plotter.addVector(targetVector, "targetRZ")

    # The following steps use the same labels as the classic 2D planar IK derivation
    a1, a2 = (shoulderEndFrame.p - elbowStartFrame.p).Norm(), (elbowEndFrame.p - wristStartFrame.p).Norm()
    _, ik_x, ik_y = targetVector
    q2_a = math.acos((ik_x ** 2 + ik_y ** 2 - a1 ** 2 - a2 ** 2) / (2 * a1 * a2))
    q1_a = math.atan2(ik_y, ik_x) - math.atan2(a2 * math.sin(-q2_a), a1 + a2 * math.cos(-q2_a))
    q2_b = -1 * math.acos((ik_x ** 2 + ik_y ** 2 - a1 ** 2 - a2 ** 2) / (2 * a1 * a2))
    q1_b = math.atan2(ik_y, ik_x) + math.atan2(a2 * math.sin(q2_b), a1 + a2 * math.cos(q2_b))
    # Choose 'better' set of q1_ab, q2_ab
    q1, q2 = q1_a, q2_a  # TODO(JS): Is this always the better one?
    J2_initial = q1
    J2_offset = math.radians(90)  # J2's zero position is vertical, not horizontal
    J2 = J2_initial - J2_offset
    # Since we have a parallel link, the angle for J3 is not simply q2. Instead, use transversal
    J3_initial = q1 - q2
    J3_offset = elbowStartFrame.M.GetRPY()[0]  # J3's zero position is below horizontal
    J3 = J3_initial - J3_offset
    # 4. Complete (above)

    # 5. Use the Theta from cylindrical coordinates as the J1 angle, and update J5 accordingly
    J1 = orange_Theta - math.radians(90)
    J5 = J5_initial - (orange_Theta - math.radians(90))
    # 5. Complete (above)

    jointAngles = [J1, J2, J3, J4, J5, J6]
    jointNames = ['joint_base_rot', 'joint_rot_1', 'joint_f1_2', 'joint_f2_pitch', 'joint_pitch_yaw', 'joint_yaw_roll']
    print("\n\nFinal joint angles in radians: {}\n\n".format(jointAngles))

    # Reject solutions that violate the joint limits declared in the URDF
    in_bounds = True
    for angle, name in zip(jointAngles, jointNames):
        limit = robot_urdf.joint_map[name].limit
        if angle < limit.lower or angle > limit.upper:
            in_bounds = False
            print("Joint {} out of bounds with angle {} not between {} and {}".format(name, angle, limit.lower, limit.upper))
    print("All in bounds: {}".format(in_bounds))

    # Run FK on the full solution (including the mimic joints) as a sanity check
    solvedJoints = PyKDL.JntArray(8)
    solvedJoints[0], solvedJoints[1], solvedJoints[3], solvedJoints[5], solvedJoints[6], solvedJoints[7] = jointAngles
    solvedJoints[2], solvedJoints[4] = -1 * solvedJoints[1], -1 * solvedJoints[3]
    producedFrame = PyKDL.Frame()
    for i in range(chain.getNrOfSegments()):
        rc = chainFK.JntToCart(solvedJoints, producedFrame, segmentNr=i)
        plotter.addVector(producedFrame.p, "fk_produced_{}".format(i))

    # 6. (optional) Sanity check on solution:
    # sanityTest(BASE_TO_BASE_YAW, BASE_YAW_TO_BOTTOM_4B, targetFrame, cameraFrame, cameraOffsetChain, jointAngles)

    # 7. Create JointState message for return
    ret = JointState()
    ret.name = ['joint_base_rot', 'joint_rot_1', 'joint_f1_2', 'joint_f2_pitch', 'joint_pitch_yaw', 'joint_yaw_roll']
    ret.header.stamp = rospy.Time.now()
    ret.position = jointAngles
    plotter.addGoal(ret)
    # plotter.publish()
    return in_bounds, ret

def testIK():
    # Standalone smoke test: feed the solver the arm's current camera pose
    tf_listener = tf.TransformListener()
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        tf_listener.waitForTransform("base", "link_camera", rospy.Time(), rospy.Duration(10.0))
        targetFrame_tf = tf_listener.lookupTransform('base', 'link_camera', rospy.Time(0))
        targetFrame = posemath.fromTf(targetFrame_tf)
        print("Input quaternion: {}".format(targetFrame.M.GetQuaternion()))
        print("Input vector: {}".format(targetFrame.p))
        jointAngles = solveIK(targetFrame)
        print("sent!")
        rate.sleep()

if __name__ == "__main__":
    rospy.init_node("custom_ik_solver", anonymous=True)
    s = rospy.Service("solve_ik", SolveIKSrv, callback)
    robot_urdf = URDF.from_parameter_server()
    # testIK()
    print("Listening for IK requests...")
    rospy.spin()
This script is a service, listening for requests that specify a PoseStamped target pose for the camera attachment on the ROBOCAM's end effector. Upon receiving such a request, a callback is initiated, which begins by creating a PyKDL chain of the robot's 6 actuated joints (3 controlling the motion of the arm as a whole, and 3 controlling the end effector's wrist position and orientation directly), generated directly from the robot's URDF in robocam.xml. From here, the target pose's orientation quaternion is converted to XZY-order Euler angles, which are offset to comply with range-of-motion limitations (degrees are computed only for logging; the solver works in radians). These angles become J4, J5, and J6: the angles for the 3 aforementioned end-effector joints.
Next, a sub-chain consisting of those now-solved-for joints is created with PyKDL, and the solved joint angles are set on it. Forward kinematics is then run to determine what positional offset must be subtracted from the overall goal in order to set a target for the first 3 joints of the arm (those associated with the shoulder and the elbow), as described earlier. That new target is converted to cylindrical coordinates of the form (θ, R, Z), which separates the rotation required about the shoulder from the planar motion in the R-Z plane as we compute the remaining joint angles. We then adjust this R-Z-plane target by fixed offsets extracted from the URDF, in order to reduce the problem to the same working scenario as the following image:
As shown, the IK solutions for this simple 2-link chain can be derived from basic trigonometry; they correspond to the solutions for J2 and J3 in our full ROBOCAM chain - the joints for the planar motion of the arm itself. Then J1 - the joint angle corresponding to rotation of the full arm about its base - is directly given by the θ computed earlier. The earlier solution for J5 must also be adjusted by θ in the opposite direction to account for this full-arm rotation.
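For reference, the (q1_a, q2_a) branch used in step 4 above reduces to the textbook two-link form. A self-contained version, with illustrative link lengths and target point:

import math

def two_link_ik(x, y, a1, a2):
    # Law of cosines gives the elbow angle; negating q2 selects the
    # other (elbow-flipped) configuration, as in the q2_b branch above.
    c2 = (x ** 2 + y ** 2 - a1 ** 2 - a2 ** 2) / (2 * a1 * a2)
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(a2 * math.sin(-q2),
                                       a1 + a2 * math.cos(-q2))
    return q1, q2

# e.g. two 0.34 m links reaching the point (0.3, 0.3) in the R-Z plane:
print(two_link_ik(0.3, 0.3, 0.34, 0.34))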
From here, the solved angles [J1, J2, J3, J4, J5, J6] are translated into a JointState message and sent as a response back to the client.
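The SolveIKSrv definition itself does not appear above; reconstructed from the fields the callback reads and writes (the exact field types are assumptions), it amounts to:

# ik_solver/srv/SolveIKSrv.srv (reconstructed from usage)
geometry_msgs/PoseStamped input_pose
---
bool solved
sensor_msgs/JointState output_joints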
#!/usr/bin/env python
import math
import rospy
import rospkg
import tf2_ros
import numpy as np
from geometry_msgs.msg import (
    Pose,
    Point,
    Twist,
    TransformStamped
)
from tf.transformations import quaternion_from_euler, quaternion_multiply

def register_frame(pose):
    # Broadcast the simulated subject as the world -> tracked_object frame
    tf2Stamp = TransformStamped()
    tf2Stamp.header.stamp = rospy.Time.now()
    tf2Stamp.header.frame_id = 'world'
    tf2Stamp.child_frame_id = 'tracked_object'
    tf2Stamp.transform.translation.x, tf2Stamp.transform.translation.y, tf2Stamp.transform.translation.z = pose.position.x, pose.position.y, pose.position.z
    tf2Stamp.transform.rotation.x, tf2Stamp.transform.rotation.y, tf2Stamp.transform.rotation.z, tf2Stamp.transform.rotation.w = pose.orientation.x, pose.orientation.y, pose.orientation.z, pose.orientation.w
    tf2Broadcast.sendTransform(tf2Stamp)

def simulate_movement(reference_frame="world", max_speed=1, refresh_rate=30, update_rate=1, x_min=-0.2, y_min=-0.2, x_max=0.2, y_max=0.2):
    rate = rospy.Rate(refresh_rate)
    try:
        curr_pose, speed, theta = Pose(), 0, 0
        curr_pose.orientation.w = 1
        i = 0
        while not rospy.is_shutdown():
            # Pick a fresh random heading and speed every update_rate seconds
            if i % (update_rate * refresh_rate) == 0:
                speed, theta = np.random.uniform(0, max_speed), np.random.uniform(0, 2 * math.pi)
                curr_pose.orientation.x, curr_pose.orientation.y, curr_pose.orientation.z, curr_pose.orientation.w = quaternion_from_euler(0, 0, 0)
            # Step the position, clamped to the configured bounding box
            curr_pose.position.x = max(x_min, min(x_max, curr_pose.position.x + np.cos(theta) * speed / refresh_rate))
            curr_pose.position.y = max(y_min, min(y_max, curr_pose.position.y + np.sin(theta) * speed / refresh_rate))
            register_frame(curr_pose)
            i += 1
            rate.sleep()
    except rospy.ROSInterruptException as e:
        print("ROS interrupt: {0}".format(e))

def main():
    rospy.init_node("movement_simulator")
    print("Running simulation")
    simulate_movement()

if __name__ == '__main__':
    tf2Broadcast = tf2_ros.TransformBroadcaster()
    main()
This script is a simple simulator of spontaneous movement for an arbitrary subject to be tracked; we used it to test our sensing and planning capabilities in simulation while the ROBOCAM arm was being built in parallel. At a fixed interval it picks a random heading and speed within the specified limits, steps the subject's pose accordingly, and broadcasts the corresponding frame until the script is stopped.
#include <iostream>
#include <algorithm>
#include <OGRE/OgreSceneNode.h>
#include <OGRE/OgreSceneManager.h>
#include <OGRE/OgreEntity.h>
#include <ros/console.h>
#include <ros/ros.h>
#include <rviz/viewport_mouse_event.h>
#include <rviz/visualization_manager.h>
#include <rviz/view_manager.h>
#include <rviz/mesh_loader.h>
#include <rviz/geometry.h>
#include <rviz/properties/property.h>
#include <rviz/properties/vector_property.h>
#include <rviz/properties/quaternion_property.h>
#include <rviz/properties/string_property.h>
#include <rviz/properties/bool_property.h>
#include <rviz/properties/float_property.h>
#include <rviz/properties/int_property.h>
#include "keyframe_tool.h"
#include "rviz_animator/KeyframeMsg.h"
#include "rviz_animator/KeyframesMsg.h"
namespace rviz_animator
{
int KeyframeTool::current_keyframe_id = 0;
KeyframeTool::KeyframeTool()
{
ros::NodeHandle nh;
shortcut_key_ = 'k';
pub_ = nh.advertise<KeyframesMsg>("/operator_keyframes", 1);
}
KeyframeTool::~KeyframeTool()
{
for (auto keyframe : keyframes_)
{
scene_manager_->destroySceneNode(keyframe.node_);
}
}
void KeyframeTool::onInitialize()
{
keyframe_resource_ = "package://rviz_animator/models/camera.dae";
if( rviz::loadMeshFromResource( keyframe_resource_ ).isNull() )
{
ROS_ERROR( "KeyframeTool: failed to load model resource '%s'.", keyframe_resource_.c_str() );
return;
}
keyframes_property_ = new rviz::Property("Keyframes");
topic_property_ = new rviz::StringProperty("Topic", "/operator_keyframes");
publish_property_ = new rviz::BoolProperty("Publish?", false);
preview_property_ = new rviz::BoolProperty("Preview?", false);
getPropertyContainer()->addChild( topic_property_ );
// getPropertyContainer()->addChild( preview_property_ ); // TODO(JS): Fix
getPropertyContainer()->addChild( publish_property_ );
getPropertyContainer()->addChild( keyframes_property_ );
connect(preview_property_, &rviz::BoolProperty::changed, this, &KeyframeTool::previewKeyframes);
connect(publish_property_, &rviz::BoolProperty::changed, this, &KeyframeTool::publishKeyframes);
}
void KeyframeTool::activate() {}
void KeyframeTool::deactivate() {}
int KeyframeTool::processMouseEvent( rviz::ViewportMouseEvent& event )
{
Ogre::Camera* camera = scene_manager_->getCurrentViewport()->getCamera();
auto node = createNode(camera->getPosition(), camera->getOrientation());
auto keyframe = KeyframeStruct{node, camera->getPosition(), camera->getOrientation(), 0, current_keyframe_id, "keyframe_id_" + std::to_string(current_keyframe_id)};
keyframes_.push_back(keyframe);
current_keyframe_id++;
rviz::Property* keyframe_property = new rviz::Property("Keyframe " + QString::number(keyframes_property_->numChildren()));
rviz::BoolProperty* keyframe_active_property = new rviz::BoolProperty("Active?", true, "", keyframe_property);
rviz::IntProperty* keyframe_id_property = new rviz::IntProperty("ID", keyframe.id_, "", keyframe_property);
rviz::StringProperty* keyframe_label_property = new rviz::StringProperty("Label", QString::fromStdString(keyframe.label_), "", keyframe_property);
rviz::VectorProperty* keyframe_position_property = new rviz::VectorProperty("Position", keyframe.position_, "", keyframe_property);
rviz::QuaternionProperty* keyframe_orientation_property = new rviz::QuaternionProperty("Orientation", keyframe.orientation_, "", keyframe_property);
rviz::FloatProperty* keyframe_timestamp_property = new rviz::FloatProperty("Timestamp", keyframe.timestamp_, "", keyframe_property);
int property_index = keyframes_property_->numChildren();
connect(keyframe_position_property, &rviz::VectorProperty::changed, this, [=](){ updateKeyframePosition(property_index); });
connect(keyframe_orientation_property, &rviz::QuaternionProperty::changed, this, [=](){ updateKeyframeOrientation(property_index); });
connect(keyframe_timestamp_property, &rviz::FloatProperty::changed, this, [=](){ updateKeyframeTimestamp(property_index); });
connect(keyframe_label_property, &rviz::StringProperty::changed, this, [=](){ updateKeyframeLabel(property_index); });
connect(keyframe_active_property, &rviz::BoolProperty::changed, this, [=](){ deleteKeyframe(property_index); });
keyframe_id_property->setReadOnly(true);
keyframes_property_->addChild( keyframe_property );
return Render | Finished;
}
void KeyframeTool::save( rviz::Config config ) const
{
config.mapSetValue( "Class", getClassId() );
rviz::Config keyframe_tool_config = config.mapMakeChild( "Keyframe Tool" );
rviz::Config keyframe_topic_config = keyframe_tool_config.mapMakeChild("Topic");
topic_property_->save( keyframe_topic_config );
keyframe_tool_config.mapSetValue("current_keyframe_id", current_keyframe_id);
rviz::Config keyframes_config = keyframe_tool_config.mapMakeChild( "Keyframes" );
int num_children = keyframes_property_->numChildren();
for( int i = 0; i < num_children; i++ )
{
rviz::Property* keyframe_prop = keyframes_property_->childAt( i );
rviz::Config keyframe_config = keyframes_config.listAppendNew();
keyframe_config.mapSetValue( "Name", keyframe_prop->getName() );
for (int j = 0; j < keyframe_prop->numChildren(); j++) {
rviz::Property* keyframe_aspect_prop = keyframe_prop->childAt(j);
rviz::Config keyframe_aspect_config = keyframe_config.mapMakeChild(keyframe_aspect_prop->getName());
keyframe_aspect_prop->save( keyframe_aspect_config );
}
}
}
void KeyframeTool::load( const rviz::Config& config )
{
rviz::Config keyframe_tool_config = config.mapGetChild( "Keyframe Tool" );
rviz::Config keyframe_topic_config = keyframe_tool_config.mapGetChild("Topic");
keyframe_tool_config.mapGetInt("current_keyframe_id", &current_keyframe_id);
topic_property_->load(keyframe_topic_config);
rviz::Config keyframes_config = keyframe_tool_config.mapGetChild( "Keyframes" );
int num_keyframes = keyframes_config.listLength();
for( int i = 0; i < num_keyframes; i++ )
{
rviz::Config keyframe_config = keyframes_config.listChildAt( i );
QString name = "Keyframe " + QString::number( i );
keyframe_config.mapGetString( "Name", &name );
rviz::Property* keyframe_property = new rviz::Property(name);
rviz::VectorProperty* keyframe_position_property = new rviz::VectorProperty("Position");
rviz::QuaternionProperty* keyframe_orientation_property = new rviz::QuaternionProperty("Orientation");
rviz::FloatProperty* keyframe_timestamp_property = new rviz::FloatProperty("Timestamp");
rviz::StringProperty* keyframe_label_property = new rviz::StringProperty("Label");
rviz::IntProperty* keyframe_id_property = new rviz::IntProperty("ID");
rviz::BoolProperty* keyframe_active_property = new rviz::BoolProperty("Active?", true);
keyframe_position_property->load( keyframe_config.mapGetChild("Position") );
keyframe_orientation_property->load( keyframe_config.mapGetChild("Orientation") );
keyframe_timestamp_property->load( keyframe_config.mapGetChild("Timestamp") );
keyframe_label_property->load( keyframe_config.mapGetChild("Label") );
keyframe_id_property->load( keyframe_config.mapGetChild("ID") );
int property_index = keyframes_property_->numChildren();
connect(keyframe_position_property, &rviz::VectorProperty::changed, this, [=](){ updateKeyframePosition(property_index); });
connect(keyframe_orientation_property, &rviz::QuaternionProperty::changed, this, [=](){ updateKeyframeOrientation(property_index); });
connect(keyframe_timestamp_property, &rviz::FloatProperty::changed, this, [=](){ updateKeyframeTimestamp(property_index); });
connect(keyframe_label_property, &rviz::StringProperty::changed, this, [=](){ updateKeyframeLabel(property_index); });
connect(keyframe_active_property, &rviz::BoolProperty::changed, this, [=](){ deleteKeyframe(property_index); });
keyframe_property->addChild( keyframe_active_property );
keyframe_property->addChild( keyframe_id_property );
keyframe_property->addChild( keyframe_label_property );
keyframe_property->addChild( keyframe_position_property );
keyframe_property->addChild( keyframe_orientation_property );
keyframe_property->addChild( keyframe_timestamp_property );
keyframes_property_->addChild( keyframe_property );
auto node = createNode( keyframe_position_property->getVector(), keyframe_orientation_property->getQuaternion() );
auto keyframe = KeyframeStruct{node, keyframe_position_property->getVector(), keyframe_orientation_property->getQuaternion(),
keyframe_timestamp_property->getFloat(), keyframe_id_property->getInt(), keyframe_label_property->getStdString()};
keyframes_.push_back(keyframe);
}
}
void KeyframeTool::sortKeyframes()
{
std::sort(keyframes_.begin(), keyframes_.end());
renderKeyframeProperties();
}
void KeyframeTool::renderKeyframeProperties() {
for (int i = 0; i < keyframes_.size(); ++i) {
auto keyframe = keyframes_[i];
auto keyframe_property = keyframes_property_->childAt(i);
dynamic_cast<rviz::VectorProperty*>(keyframe_property->subProp("Position"))->setVector(keyframe.position_);
dynamic_cast<rviz::QuaternionProperty*>(keyframe_property->subProp("Orientation"))->setQuaternion(keyframe.orientation_);
dynamic_cast<rviz::FloatProperty*>(keyframe_property->subProp("Timestamp"))->setFloat(keyframe.timestamp_);
dynamic_cast<rviz::StringProperty*>(keyframe_property->subProp("Label"))->setStdString(keyframe.label_);
dynamic_cast<rviz::IntProperty*>(keyframe_property->subProp("ID"))->setInt(keyframe.id_);
dynamic_cast<rviz::BoolProperty*>(keyframe_property->subProp("Active?"))->setBool(true);
}
}
void KeyframeTool::previewKeyframes() {
if (!preview_property_->getBool() || keyframes_.empty()) return;
context_->getViewManager()->setCurrentViewControllerType("rviz/FPS");
auto view_controller = context_->getViewManager()->getCurrent();
auto keyframe = keyframes_[0];
view_controller->subProp("Position")->subProp("X")->setValue(keyframe.position_[0]);
view_controller->subProp("Position")->subProp("Y")->setValue(keyframe.position_[1]);
view_controller->subProp("Position")->subProp("Z")->setValue(keyframe.position_[2]);
}
void KeyframeTool::publishKeyframes() {
if (!publish_property_->getBool()) return;
KeyframesMsg keyframes_msg;
keyframes_msg.keyframes.clear();
keyframes_msg.num_keyframes = keyframes_.size();
for (int i = 0; i < keyframes_.size(); ++i) {
KeyframeStruct keyframe = keyframes_[i];
KeyframeMsg keyframe_msg;
keyframe_msg.frame.position.x = keyframe.position_[0];
keyframe_msg.frame.position.y = keyframe.position_[1];
keyframe_msg.frame.position.z = keyframe.position_[2];
keyframe_msg.frame.orientation.x = keyframe.orientation_[0];
keyframe_msg.frame.orientation.y = keyframe.orientation_[1];
keyframe_msg.frame.orientation.z = keyframe.orientation_[2];
keyframe_msg.frame.orientation.w = keyframe.orientation_[3];
keyframe_msg.timestamp = keyframe.timestamp_;
keyframes_msg.keyframes.push_back(keyframe_msg);
}
pub_.publish(keyframes_msg);
}
void KeyframeTool::updateKeyframePosition(int property_index)
{
auto& keyframe = keyframes_[property_index];
keyframe.position_ = dynamic_cast<rviz::VectorProperty*>(keyframes_property_->childAt(property_index)->subProp("Position"))->getVector();
keyframe.node_->setPosition(keyframe.position_);
}
void KeyframeTool::updateKeyframeOrientation(int property_index)
{
auto& keyframe = keyframes_[property_index];
keyframe.orientation_ = dynamic_cast<rviz::QuaternionProperty*>(keyframes_property_->childAt(property_index)->subProp("Orientation"))->getQuaternion();
keyframe.node_->setOrientation(keyframe.orientation_);
}
void KeyframeTool::updateKeyframeTimestamp(int property_index)
{
auto& keyframe = keyframes_[property_index];
keyframe.timestamp_ = dynamic_cast<rviz::FloatProperty*>(keyframes_property_->childAt(property_index)->subProp("Timestamp"))->getFloat();
sortKeyframes();
}
void KeyframeTool::updateKeyframeLabel(int property_index)
{
auto& keyframe = keyframes_[property_index];
keyframe.label_ = dynamic_cast<rviz::StringProperty*>(keyframes_property_->childAt(property_index)->subProp("Label"))->getStdString();
}
void KeyframeTool::deleteKeyframe(int property_index) {
if (keyframes_property_->childAt(property_index)->subProp("Active?")->getValue().toBool()) return;
auto& keyframe = keyframes_[property_index];
scene_manager_->destroySceneNode(keyframe.node_);
keyframes_.erase(keyframes_.begin() + property_index);
keyframes_property_->removeChildren(keyframes_property_->numChildren() - 1, 1);
renderKeyframeProperties();
}
Ogre::SceneNode* KeyframeTool::createNode( const Ogre::Vector3& position, const Ogre::Quaternion& orientation )
{
Ogre::SceneNode* node = scene_manager_->getRootSceneNode()->createChildSceneNode();
Ogre::Entity* entity = scene_manager_->createEntity( keyframe_resource_ );
Ogre::Camera* camera = scene_manager_->getCurrentViewport()->getCamera();
node->attachObject( entity );
node->setPosition(position);
node->setOrientation(orientation);
node->setVisible( true );
return node;
}
} // end namespace rviz_animator
#include <pluginlib/class_list_macros.h>
PLUGINLIB_EXPORT_CLASS(rviz_animator::KeyframeTool,rviz::Tool )
This source file, written in C++, implements the Keyframe plugin in RViz used by the ROBOCAM operator. By building custom extensions on the Tool and Property classes built into RViz, it loads a new plugin into RViz that lets the operator specify Keyframes: goal poses for the robot arm and the timestamps at which each of those poses should be reached. The operator creates a Keyframe by pressing the "K" key, and can modify or delete it (adjusting its position, orientation, label, or timestamp) in the plugin viewing pane. They may publish their created sequence of Keyframes with the Publish? checkbox in the plugin viewing pane, and change the topic to which those Keyframes are published to suit their use case.
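The two custom message definitions do not appear above; from the fields populated here and consumed in choreographer.py, they are roughly as follows (field types are assumptions):

# rviz_animator/msg/KeyframeMsg.msg (reconstructed from usage)
geometry_msgs/Pose frame
float32 timestamp

# rviz_animator/msg/KeyframesMsg.msg (reconstructed from usage)
int32 num_keyframes
KeyframeMsg[] keyframes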
The field of view and details pane associated with the Keyframe plugin appear as shown here:
#!/usr/bin/env python
import sys
import time
import math
import PyKDL
import rospy
import rospkg
import tf2_ros
import threading
import warnings
import traceback
import numpy as np
import kdl_parser_py.urdf as parser
from urdf_parser_py.urdf import URDF
warnings.filterwarnings('ignore', category=UserWarning)
import quaternion
from tf_conversions import posemath
from tf.transformations import quaternion_from_euler, quaternion_multiply
from scipy.interpolate import interp1d
from rviz_animator.msg import KeyframesMsg
from tf2_geometry_msgs import do_transform_pose
from geometry_msgs.msg import (
    Point,
    Quaternion,
    Pose,
    PoseArray,
    PoseStamped,
    Transform
)
from sensor_msgs.msg import (
    JointState
)
from std_msgs.msg import (
    Float32MultiArray
)
from ik_solver.srv import (
    SolveIKSrv
)

def createSequence(keyframes, dt=0.1):
    # Interpolate between keyframes: slerp for orientations, lerp for positions
    timestamps = [frame.timestamp for frame in keyframes]
    rotations = quaternion.as_quat_array([[frame.frame.orientation.x, frame.frame.orientation.y, frame.frame.orientation.z, frame.frame.orientation.w] for frame in keyframes])
    # Coordinate frame conversion
    multiplier_quat = quaternion.from_float_array([np.sqrt(2) / 2, -np.sqrt(2) / 2, 0, 0])
    rotations = [rot * multiplier_quat for rot in rotations]
    positions = np.array([[frame.frame.position.x, frame.frame.position.y, frame.frame.position.z] for frame in keyframes])
    time_interps = list(np.arange(timestamps[0], timestamps[-1], dt)) + [timestamps[-1]]
    lerp = interp1d(timestamps, positions, axis=0)
    lin_interps = lerp(time_interps)
    rot_interps = []
    for i in range(len(keyframes) - 1):
        rot_interps.extend(quaternion.as_float_array(quaternion.slerp(rotations[i], rotations[i + 1], timestamps[i], timestamps[i + 1], np.arange(timestamps[i], timestamps[i + 1], dt))))
    rot_interps.append(quaternion.as_float_array(rotations[-1]))
    sequence = []
    for pos, ori in zip(lin_interps, rot_interps):
        pose = Pose()
        pose.position.x, pose.position.y, pose.position.z = pos
        pose.orientation.w, pose.orientation.x, pose.orientation.y, pose.orientation.z = ori
        pose_stamped = PoseStamped()
        pose_stamped.pose = pose
        sequence.append(pose_stamped)
    g_poses[0] = sequence[-1].pose
    g_poses[1] = sequence[-1].pose
    return time_interps, sequence

def callback(message):
    global g_joint_angles
    time_interps, sequence = createSequence(message.keyframes)
    print("Getting reference frames...")
    tf_buffer = tf2_ros.Buffer(rospy.Duration(1200))
    tf_listener = tf2_ros.TransformListener(tf_buffer)
    transform_object_to_base = None
    while not transform_object_to_base:
        try:
            transform_object_to_base = tf_buffer.lookup_transform('base', 'tracked_object', rospy.Time(0))
        except (tf2_ros.LookupException, tf2_ros.ConnectivityException, tf2_ros.ExtrapolationException):
            continue
    print("Beginning playback...")
    start = time.time()
    for t, pose in zip(time_interps, sequence):
        # Busy-wait until this waypoint's timestamp is reached
        while time.time() - start < t:
            continue
        try:
            transform_object_to_base = tf_buffer.lookup_transform('base', 'tracked_object', rospy.Time(0))
        except (tf2_ros.LookupException, tf2_ros.ConnectivityException, tf2_ros.ExtrapolationException):
            print("Failed to get object to base transform at timestep {} | Real Time: {}".format(t, time.time() - start))
            continue
        # Re-express the waypoint relative to the tracked subject's current pose
        target_pose = do_transform_pose(pose, transform_object_to_base)
        try:
            solve_ik = rospy.ServiceProxy('solve_ik', SolveIKSrv)
            response = solve_ik(target_pose)
            if response.solved:
                g_joint_angles = response.output_joints.position
                print("Setting joint state: {}".format(g_joint_angles))
        except rospy.ServiceException as e:
            print("Service call failed: %s" % e)
    print("Finished playback.")

def joint_publisher():
    # Continuously republish the most recent IK solution and target poses
    rate = rospy.Rate(5)
    while not rospy.is_shutdown():
        publish_joints = JointState()
        publish_joints.name = ['joint_base_rot', 'joint_rot_1', 'joint_f1_2', 'joint_f2_pitch', 'joint_pitch_yaw', 'joint_yaw_roll']
        publish_joints.position = g_joint_angles
        publish_joints.header.stamp = rospy.Time.now()
        pub_joints.publish(publish_joints)
        poseArray = PoseArray()
        poseArray.header.stamp = rospy.Time.now()
        poseArray.header.frame_id = "world"
        poseArray.poses = g_poses
        pub_rviz.publish(poseArray)
        rate.sleep()

if __name__ == '__main__':
    rospy.init_node('choreographer', anonymous=True)
    pub_joints = rospy.Publisher('joint_states', JointState, queue_size=10)
    g_joint_angles = [0] * 6
    pub_rviz = rospy.Publisher('target_poses', PoseArray, queue_size=10)
    g_poses = [Pose(), Pose()]
    joint_pub_thread = threading.Thread(target=joint_publisher)
    joint_pub_thread.start()
    print("Waiting for IK service...")
    rospy.wait_for_service('solve_ik')
    print("Ready for keyframes!")
    rospy.Subscriber("operator_keyframes", KeyframesMsg, callback)
    rospy.spin()
This script is both a Subscriber and a Publisher: it listens for operator-defined Keyframes from the custom RViz Keyframe plugin and publishes JointState messages to be received at the RPi/Arduino level to initiate actuation of the ROBOCAM. Upon receiving Keyframes on its subscribed topic, its callback begins by creating time-specific waypoints between Keyframes. Specifically, between each pair of Keyframes in the received list, the script interpolates poses - spherical linear interpolation (slerp) at the orientation-quaternion level and simple linear interpolation at the translation-vector level - at every dt timestep between the pose of the first Keyframe in the pair and that of the second, over the course of their timestamp differential (e.g. if K1 is reached at 10 s and K2 at 20 s, waypoints are generated every dt = 0.1 s between K1's pose and K2's pose from t = 10 s to t = 20 s).
Once this waypoint sequence has been created, the script iterates over each waypoint. Each iteration begins with a TF lookup of the current transform between the frame of the tracked object, as broadcast by objTracker.py, and the base frame of the robot arm. The current waypoint pose is then transformed with respect to this tracked-subject transform, at which point the transformed pose is sent as a request to the custom IK service defined in CustomIKSolver.py. The script receives a JointState message in response, which it then publishes to a specified topic; that topic is read at the RPi level and relayed to the Arduino for actuation.
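The RPi-side listener itself is outside the scope of this section. Purely as an illustration of that handoff, a subscriber that forwards each JointState to an Arduino over USB serial might look like the sketch below; the port name, baud rate, and wire format are all assumptions, not the project's actual code:

#!/usr/bin/env python
# Illustrative sketch only: forwards JointState positions to an Arduino
# as one comma-separated line per pose over serial.
import serial
import rospy
from sensor_msgs.msg import JointState

ser = serial.Serial('/dev/ttyACM0', 115200)  # assumed port and baud rate

def forward(msg):
    # Assumed framing: "a1,a2,a3,a4,a5,a6\n", angles in radians
    line = ','.join('{:.4f}'.format(p) for p in msg.position)
    ser.write(line + '\n')

if __name__ == '__main__':
    rospy.init_node('joint_forwarder')
    rospy.Subscriber('joint_states', JointState, forward)
    rospy.spin()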
<?xml version="1.0"?>
<robot name="robocam">
<!-- Robot Links -->
<link name="base">
<visual>
<geometry>
<cylinder length=".07" radius="0.03"/>
</geometry>
<origin xyz="0 0 .0325" rpy="0 0 0" />
</visual>
<collision>
<geometry>
<cylinder length=".07" radius="0.03"/>
</geometry>
<origin xyz="0 0 .0325" rpy="0 0 0" />
</collision>
</link>
<link name="link_rot">
<visual>
<geometry>
<cylinder length=".07" radius="0.03"/>
</geometry>
<origin xyz="0 0 .03" rpy="0 0 0" />
<material name='blue'>
<color rgba="0 0 0.8 1"/>
</material>
</visual>
<inertial>
<origin xyz="0 0 .03" rpy="0 0 0"/> <mass value=".6" />
<inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<cylinder length=".07" radius="0.03"/>
</geometry>
<origin xyz="0 0 .03" rpy="0 0 0" />
</collision>
</link>
<link name="link_1">
<visual>
<geometry>
<cylinder length=".336" radius="0.005"/>
</geometry>
<origin xyz="0 0 .1685" rpy="0 0 0" />
</visual>
<inertial>
<origin xyz="0 0 .1685" rpy="0 0 0"/>
<mass value="1.5" />
<inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<cylinder length=".336" radius="0.005"/>
</geometry>
<origin xyz="0 0 .1685" rpy="0 0 0" />
</collision>
</link>
<link name="link_1_mimic">
<visual>
<geometry>
<cylinder length=".336" radius="0.005"/>
</geometry>
<origin xyz="0 0 .1685" rpy="0 0 0" />
<material name='yellow'>
<color rgba="0.9 0.9 0 1"/>
</material>
</visual>
<inertial>
<origin xyz="0 0 .1685" rpy="0 0 0"/>
<mass value="1.5" />
<inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<cylinder length=".336" radius="0.005"/>
</geometry>
<origin xyz="0 0 .171" rpy="0 0 0" />
</collision>
</link>
<link name="link_f1">
<visual>
<geometry>
<cylinder length=".06" radius="0.039"/>
</geometry>
<origin xyz="0 0 .03" rpy="0 0 0" />
<material name='black'>
<color rgba="0 0 0 1"/>
</material>
</visual>
<inertial>
<origin xyz="0 0 .03" rpy="0 0 0"/>
<mass value=".8" /> <inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<cylinder length=".06" radius="0.039"/>
</geometry>
<origin xyz="0 0 .029" rpy="0 0 0" />
</collision>
</link>
<link name="link_2">
<visual>
<geometry>
<cylinder length=".34" radius="0.005"/> </geometry>
<origin xyz="0 0.17 0" rpy="1.57 0 0" /> <material name='blue'>
<color rgba="0 0 0.8 1"/>
</material>
</visual>
<inertial>
<origin xyz="0 .17 0" rpy="0 0 0"/>
<mass value="1.5" />
<inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<cylinder length=".34" radius="0.005"/>
</geometry>
<origin xyz="0 .17 0" rpy="0 0 0" />
</collision>
</link>
<link name="link_2_mimic">
<visual>
<geometry>
<cylinder length=".34" radius="0.005"/>
</geometry>
<origin xyz="0 .17 0" rpy="1.57 0 0" />
<material name='yellow'>
<color rgba="0.9 .9 0 1"/>
</material>
</visual>
<inertial>
<origin xyz="0 .17 0" rpy="0 0 0"/>
<mass value="1.5" />
<inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<cylinder length=".34" radius="0.005"/>
</geometry>
<origin xyz="0 0.17 0" rpy="0 0 0" />
</collision>
</link>
<link name="link_f2">
<visual>
<geometry>
<cylinder length=".07" radius="0.03"/>
</geometry>
<origin xyz="0 .035 0" rpy="1.57 0 0" />
<material name='black'>
<color rgba="0 0 0 1"/>
</material>
</visual>
<inertial>
<origin xyz="0 .035 0" rpy="0 0 0"/>
<mass value=".8" />
<inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<cylinder length=".07" radius="0.03"/>
</geometry>
<origin xyz="0 0.035 0" rpy="0 0 0" />
</collision>
</link>
<link name="link_pitch">
<visual>
<geometry>
<cylinder length=".051" radius="0.0196"/>
</geometry>
<origin xyz="0 .0255 0" rpy="1.57 0 0" />
</visual>
<inertial>
<origin xyz="0 0 .0255" rpy="0 0 0"/>
<mass value=".2" />
<inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<cylinder length=".051" radius="0.0196"/>
</geometry>
<origin xyz="0 0 .0255" rpy="0 0 0" />
</collision>
</link>
<link name="link_yaw">
<visual>
<geometry>
<cylinder length=".02475" radius="0.01996"/>
</geometry>
<origin xyz="0 0 0" rpy="0 0 0" />
<material name='blue'>
<color rgba="0 0 0.8 1"/>
</material>
</visual>
<inertial>
<origin xyz="0 0 .0124" rpy="0 0 0"/>
<mass value=".2" />
<inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<cylinder length=".02475" radius="0.01996"/>
</geometry>
<origin xyz="0 0 .0124" rpy="0 0 0" />
</collision>
</link>
<link name="link_roll">
<visual>
<geometry>
<box size="0.04588 0.04588 0.007"/>
</geometry>
<origin xyz="0 0 .0015" rpy="1.57 0 0" />
</visual>
<inertial>
<origin xyz="0 0 .0015" rpy="0 0 0"/>
<mass value=".2" />
<inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<box size="0.04588 0.04588 0.007"/>
</geometry>
<origin xyz="0 0 .0015" rpy="0 0 0" />
</collision>
</link>
<link name="link_camera">
<visual>
<geometry>
<cylinder length=".005" radius="0.02"/>
</geometry>
<origin xyz="0 0 .001" rpy="1.57 0 0" />
<material name='green'>
<color rgba="0 0.8 0 1"/>
</material>
</visual>
<inertial>
<origin xyz="0 0 .001" rpy="0 0 0"/>
<mass value=".2" />
<inertia ixx=".001" ixy="0" ixz="0" iyy=".001" iyz="0" izz=".001" />
</inertial>
<collision>
<geometry>
<cylinder length=".02" radius="0.02"/>
</geometry>
<origin xyz="0 0 .001" rpy="0 0 0" />
</collision>
</link>
<!-- Robot Joints -->
<joint name="joint_base_rot" type="revolute">
<parent link="base"/>
<child link="link_rot"/>
<origin xyz="0 0 .07" rpy="0 0 0" />
<axis xyz="0 0 1" />
<limit effort="1000.0" lower="-1.57" upper="1.57" velocity="1000"/>
</joint>
<joint name="joint_rot_1" type="revolute">
<parent link="link_rot"/>
<child link="link_1"/>
<origin xyz="0 0.012 .041" rpy="0 0 0" />
<axis xyz="1 0 0" />
<limit effort="1000.0" lower="-.3927" upper="0.3927" velocity="1000"/>
</joint>
<joint name="joint_rot_1_mimic" type="revolute">
<parent link="link_rot"/>
<child link="link_1_mimic"/>
<origin xyz="0 -0.012 .07" rpy="0 0 0" />
<axis xyz="1 0 0" />
<limit effort="1000.0" lower="-.3927" upper="0.3927" velocity="1000"/>
<mimic joint="joint_rot_1" multiplier="1" offset="0"/>
</joint>
<joint name="joint_1_f1" type="continuous">
<parent link="link_1"/>
<child link="link_f1"/>
<origin xyz="0 0 .336" rpy="-0.5470 0 0" />
<axis xyz="1 0 0" />
<mimic joint="joint_rot_1" multiplier="-1" offset="0"/>
</joint>
<joint name="joint_f1_2" type="revolute">
<parent link="link_f1"/>
<child link="link_2"/>
<origin xyz="0 0 0.0374" rpy="0 0 0" />
<axis xyz="1 0 0" />
<limit effort="1000.0" lower="-.3927" upper=".3927" velocity="1000"/>
</joint>
<joint name="joint_f1_2_mimic" type="continuous">
<parent link="link_f1"/>
<child link="link_2_mimic"/>
<origin xyz="0 0.02 0.0174" rpy="0 0 0" />
<axis xyz="1 0 0" />
<mimic joint="joint_f1_2" multiplier="1" offset="0"/>
</joint>
<joint name="joint_2_f2" type="continuous">
<parent link="link_2"/>
<child link="link_f2"/>
<origin xyz="0 0.34 0" rpy="0 0 0" />
<axis xyz="1 0 0" />
<mimic joint="joint_f1_2" multiplier="-1" offset="0"/>
</joint>
<joint name="joint_f2_pitch" type="revolute">
<parent link="link_f2"/>
<child link="link_pitch"/>
<origin xyz="0 0.0464 0" rpy="0 0 0" />
<axis xyz="1 0 0" />
<limit effort="1000.0" lower="-1.57" upper="1.57" velocity="1000"/>
</joint>
<joint name="joint_pitch_yaw" type="revolute">
<parent link="link_pitch"/>
<child link="link_yaw"/>
<origin xyz="0 0.039 0.013" rpy="0 0 0" />
<axis xyz="0 0 1" />
<limit effort="1000.0" lower="-1.57" upper="1.57" velocity="1000"/>
</joint>
<joint name="joint_yaw_roll" type="revolute">
<parent link="link_yaw"/>
<child link="link_roll"/>
<origin xyz="0.005 0.023 0.016" rpy="0 0 0" />
<axis xyz="0 1 0" />
<limit effort="1000.0" lower="-1.57" upper="1.57" velocity="1000"/>
</joint>
<joint name="joint_roll_camera" type="fixed">
<parent link="link_roll"/>
<child link="link_camera"/>
<origin xyz="0 0.0177 0.0063" rpy="0 0 0" />
</joint>
</robot>
This is the custom URDF for the ROBOCAM. It consists of 6 revolute joints:
Rotation about shoulder
Planar motion of shoulder-affixed four-bar linkage
Planar motion of elbow-affixed four-bar linkage
Pitch motion for end-effector wrist
Yaw motion for end-effector wrist
Roll motion for end-effector wrist
This URDF also utilizes 4 mimic joints (3 continuous, 1 revolute) in order to preserve fidelity to the actual parallel-linkage structure of the ROBOCAM robot arm.
<launch>
<param name="robot_description" command="cat $(find rviz_animator)/models/robocam.xml" />
<node pkg="robot_state_publisher" type="robot_state_publisher" name="robot_state_publisher" />
<node pkg="tf2_ros" type="static_transform_publisher" name="base_to_world" args="0 -0.3 0 0 0 0 1 world base" />
<!-- <node pkg="joint_state_publisher" type="joint_state_publisher" name="joint_state_publisher">
<param name="use_gui" value="true"/>
</node> -->
<node pkg="aruco_pkg" type="objTracker.py" name="objTracker" output="screen"/>
<node name="rviz" pkg="rviz" type="rviz" args="-d $(find rviz_animator)/rviz/robocam.rviz" />
<node pkg="ik_solver" type="CustomIKSolver.py" name="ik_service" output="screen"/>
<node pkg="rviz_animator" type="choreographer.py" name="choreographer" output="screen"/>
</launch>
This ROBOCAM launch file sets the global robot_description parameter and initializes the key nodes. Namely, it initializes:
A robot_state_publisher node to publish TF frames for the robot based on incoming JointState messages
A static_transform_publisher node to establish a relationship between the world frame and the base frame of the ROBOCAM
An objTracker node to begin tracking the nonstationary subject as specified in objTracker.py
An rviz node to open a pre-configured RViz viewing window (config specified in robocam.rviz) for use by the operator
An ik_service node to start up the custom IK service as specified in CustomIKSolver.py, actively listening for target poses
A choreographer node to start up the waypoint interpolator as specified in choreographer.py, listening for keyframes
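With everything in place, the full stack comes up with a single command (assuming the launch file lives in the rviz_animator package, as its $(find rviz_animator) references suggest):

roslaunch rviz_animator robocam.launch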