Lab 8 - Transformations

Why do we need transformations?

Robots employ many different coordinate systems (aka frames) to define their location, the location of their sensors and components, or the location of external entities. Transformations among those frames are necessary for the robot or a robot component to understand how to interpret data that is provided in a different frame. You may not have realized it, but we have already used many frames and implicit transformations in our previous labs. For example, we used the world frame when sensing the drone's position, attitude, and velocity to plan a path and implement the control commands to realize that path. We also considered the body frame when taking accelerometer data from the IMU, which we had to transform into the world frame and integrate to obtain a velocity. Frames and transformations are everywhere, so let's learn how to programmatically structure them in ROS.

Learning Objectives

At the end of this lab, you should understand:

  • The ROS support for transforms
  • How to publish a transform in ROS
  • How to use a ROS transform listener
  • How to use a ROS transform to perform a coordinate transformation

Before Starting

We will be using an updated version of the labsim simulator. To get started, update the simulator:

$ cd ~/labsim
$ git pull
$ catkin build

Add the following line to the launch file:

<node name="tower" pkg="flightcontroller" type="tower.py" />

Challenge

In the previous lab, we defined the goal position for the drone using the command:

$ rostopic pub /uav/input/goal geometry_msgs/Vector3 '{x: -2, y: -1, z: 2.5}'

This command sends the drone a position in the world frame. The drone understood this and flew to that position.

For this lab, we will assume you have been hired by an online shopping company called **azon that has recently shown interest in delivering packages using drones. You are the lead engineer of the drone delivery system.

World Frame. In your first tests, you place the drone at the origin of the world frame and try to deliver packages to different houses. You start by delivering a package to the house on the bottom left of the figure. You find that all you need to do is send a Vector3 message to /uav/input/goal containing the coordinates [-4,-2] and set yaw = 0 to keep the drone facing the same direction.
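For reference, such a goal could be sent from the command line in the same way as before (here we assume the same altitude of 2.5 used earlier; the exact z value is up to you):

$ rostopic pub /uav/input/goal geometry_msgs/Vector3 '{x: -4, y: -2, z: 2.5}'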

Tower Frame. As you expand your operation, you realize that it would be easier to have a central tower that controls all the drones. The tower is placed at the top of a mountain, with a frame that has +Y aligned with magnetic North. The tower then directs the drone to deliver to the location corresponding to the same house. You send the drone to that location when, all of a sudden, the drone goes flying downwards and crashes into the house just below it. What happened?

You then remember what you learned about frames in this class. The drone crashed because your tower sent the position [5, -32] with a yaw = +pi/4, and this position and yaw use the tower's reference frame. As you apologize to the house owners and pick up the remains of the drone, you realize two things. Firstly, you should have transformed the coordinates sent by the tower from the tower's frame into the world frame. Secondly, you recall why the state and safety nodes we developed in earlier labs are so vital in the real world.

In this lab, a tower node is given to us. Once a drone's trajectory is complete, the tower will publish goal positions on the /tower/goal topic using the tower frame. Your job will be to use ROS transforms to convert the tower's commands into the world frame so that the drone can successfully navigate to the goal positions it receives.

The architecture of the overall system is shown below. The blue nodes are the same as in Lab 7. The green node is the new Tower node given to you, and the red nodes are what we will need to implement.

ROS Support for Frames and Transformations

ROS supports transformation between frames through the tf2 package, which provides very rich functionality, well beyond what is covered in this lab.

ROS also provides several standardized frames. For example, the ENU (East, North, Up) frame, which is commonly used in ground robots, has its positive x axis pointing east, its positive y axis pointing north, and its positive z axis pointing up. The frame we will be using in this lab is NED (North, East, Down), which is commonly used in aerial vehicles and has its positive x axis pointing north, its positive y axis pointing east, and its positive z axis pointing down.

In ROS, each frame has a unique string identifier.

In this lab we will be working with a particular kind of transform that does not change over time. These are known as static transforms because the relationship between the coordinate frames does not change over time. Our transform will convert from the tower frame to the world frame. In contrast, a dynamic transform is one which changes as time goes by. For example, we would need a dynamic transform if our tower were attached to a delivery van; each time the van turned a corner, the transform from the van's frame to the drone's frame would also need to change.

Install tf2

In order to install the tf2 package, run the following command:

$ sudo apt-get install ros-kinetic-tf2-tools ros-kinetic-tf

Exploring the Coordinate Frames in our Simulator

Let's start by exploring what frames are already available in our simulator. A quick and easy way to do that is using the view_frames tool, which depicts the coordinate frames used by a system. Start two terminals and run the following commands:

Terminal 1:

$ source ~/labsim/devel/setup.bash
$ roslaunch flightcontroller fly_path.launch

Terminal 2:

$ source ~/labsim/devel/setup.bash
$ rosrun tf2_tools view_frames.py

In the labsim directory you should have a file frames.pdf with content similar to the image below.

This shows us that we have a small tree of transforms, rooted in the frame world with two child frames. The first child frame is the world/ned frame, which is used by the drone dynamics. This coordinate system is always fixed. The second child frame, uav/imu, is the body frame of the drone. The body frame has its origin at the center of the robot, so it moves with the drone and is valid only for a small time interval.

If we had a sensor on the drone, its measurements would likely be in this uav/imu frame. For example, imagine we had a range finder mounted on the drone. The range finder may tell us that there is an object 5m ahead relative to the drone's position. Such a reading depends on the drone's orientation: if the drone were facing down or facing to the left, it would mean two completely different things.

The important part to notice is that ROS provides a way to define frames, to access them, and to relate them. (The labels associated with each edge will be explained as we introduce more functionality of the tf2 package.)

Defining a transformation in ROS

To support transformations ROS provides the message geometry_msgs/TransformStamped:
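For reference, the fields of this message (which you can inspect with rosmsg show geometry_msgs/TransformStamped) are:

std_msgs/Header header
  uint32 seq
  time stamp
  string frame_id
string child_frame_id
geometry_msgs/Transform transform
  geometry_msgs/Vector3 translation
    float64 x
    float64 y
    float64 z
  geometry_msgs/Quaternion rotation
    float64 x
    float64 y
    float64 z
    float64 w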

Using this message, a transformation is a function that converts coordinates from the frame specified in header.frame_id to the frame specified in child_frame_id.

Going back to our tree of frames in the view_frames output, the left edge has header.frame_id set to 'world' and child_frame_id set to 'world/ned'.

The transform attribute is defined as a message of type geometry_msgs/Transform, which consists of the mathematical transformation between two 3D Cartesian frames: a translation followed by a rotation.

The translation component is similar to a translation in geometry and adds translation.x, translation.y, and translation.z to the parent frame's x, y, and z coordinates.


Recall from class that a rotation can be specified either using Euler angles (roll, pitch, and yaw) or a quaternion (essentially a four-tuple with funny multiplication). Side note: a problem with Euler angles is that the order in which rotations are applied can lead to different transformations, so the use of quaternions is generally preferred. Quaternions also provide other benefits, for example avoiding gimbal lock, which was a real concern during the Apollo missions. (Check out this video on how a gimbal lock was narrowly avoided on the Apollo 13 mission.)
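If you ever need to go from Euler angles to a quaternion in code, the tf package provides a helper. Here is a minimal sketch (the yaw value is just an arbitrary example):

from tf.transformations import quaternion_from_euler

# Convert roll, pitch, yaw (in radians) into a quaternion [x, y, z, w]
quat = quaternion_from_euler(0.0, 0.0, 1.57)  # roughly a 90 degree yaw
print(quat)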

Defining our own Transformation

In this lab, we are aiming to transform the tower reported positions into the corresponding positions in the world frame. The diagram below gives specifics about the location of the tower relative to the origin of the world.

Now, using the geometry and math covered in class (Transformation lecture -- just for 2D in this case because we are ignoring altitude), develop the transformation function to do just that.

Remember that the transform, similar to a geometry_msgs/Transform message, must include a translation and a rotation. Use Euler angles for the rotation. Since we are using the NED (x points north, y points east, and z points down) convention, the rotation matrix must be adjusted to:

A positive yaw angle is one that moves from the +X axis to the +Y axis. In the case of NED frames this means that positive is clockwise.
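To get you started, below is a minimal Python sketch of the general 2D form of this transformation. The TOWER_YAW, TOWER_X, and TOWER_Y values are placeholders that you must fill in from the diagram above, and you should verify the sign conventions against the self-checkpoint that follows.

import math

# Placeholder values -- read the actual yaw and tower position off the diagram above.
TOWER_YAW = 0.0              # rotation from the tower frame to the world frame (radians)
TOWER_X, TOWER_Y = 0.0, 0.0  # tower origin expressed in world coordinates

def tower_to_world(x_t, y_t):
  # Rotate the tower coordinates, then translate by the tower's position.
  # With NED axes a positive yaw rotates +X toward +Y (clockwise from above);
  # double-check the signs against the self-checkpoint values.
  x_w = math.cos(TOWER_YAW) * x_t - math.sin(TOWER_YAW) * y_t + TOWER_X
  y_w = math.sin(TOWER_YAW) * x_t + math.cos(TOWER_YAW) * y_t + TOWER_Y
  return (x_w, y_w)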

Self-Checkpoint

To test that you have derived the correct transformation, manually transform these inputs using your formulas and check that you get the following output. Do not move on with the lab until your transformation is correct.

  • Tower [0, 10] -> World [77.7, -122.5]
  • Tower [10, 10] -> World [86.4, -127.5]
  • Tower [-10, 10] -> World [69.1, -117.5]
  • Tower [10, -10] -> World [76.4, -144.9]
  • Tower [-100, 100] -> World [36.1, 5.4]
  • Tower [-120, 80] -> World [8.8, -1.9]
  • Tower [-200, 100] -> World [-50.5, 55.4]

Note: These answers have been rounded to the first decimal point.

Broadcasting a Transformation

Let's now use that transformation as part of our system. The tf2 package provides a broadcasting mechanism through which a node sends out time-stamped transformations to anyone that is listening. The idea is that all components of the system can access and use the transforms provided by other components, and continually broadcasting them also makes sense when they are likely to change over time. A node that needs to retrieve transformations from the system creates a listener. When the node needs to apply a transformation from frame A to frame B, it uses the listener to obtain that transform from the system and then applies it. Observe that a listener can be used to retrieve multiple transforms, while a subscriber may only receive messages on one topic.

Broadcasting setup. As mentioned earlier, in this lab we will use a static broadcaster since the transformation from tower frame to world frame is time-independent. A static broadcaster sends the transformation, and the same transformation is given to any node asking for it.

Broadcasting static transforms is so common that the tf2 package already provides a built-in node that just needs to be configured to broadcast them. To start, let's add the static_transform_publisher node to our fly_path.launch launch file:

<node pkg="tf2_ros" type="static_transform_publisher" name="tower_broadcaster" args="TODO"/>

static_transform_publisher arguments must include the three components of the translation, the three angles of rotation, the parent frame id, and the child frame id. Replace the TODO with the arguments for your computed transform in this format (an illustrative example with placeholder values is shown after the list below):

x y z yaw pitch roll frame_id child_frame_id

A few things to keep in mind:

  • In our context: z, pitch, and roll should be set to zero since we are operating in just 2D.
  • static_transform_publisher only requires the rotation angle between the frames in the x-y plane (yaw); it computes the trig for us. The rotation angle must be specified in radians (not degrees) and, again, since our frames are in NED, a clockwise rotation is considered positive.
  • For frame_id and child_frame_id we are converting from the tower frame to the world frame.
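For illustration only, a filled-in entry with made-up placeholder numbers (not the answer for this lab) and the frame ids still to be chosen per the last bullet would look like this:

<node pkg="tf2_ros" type="static_transform_publisher" name="tower_broadcaster" args="10 20 0 0.785 0 0 FRAME_ID CHILD_FRAME_ID"/>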

Once you are done, ROS offers capabilities to inspect the values sent through the system. To view the transformation you just defined, use the following command after you start the simulator and confirm it is what you expected:

$ rosrun tf tf_echo tower world
>>> Translation: [..., ..., ...]
>>> Rotation: in Quaternion [..., ..., ..., ...]
            in RPY (radian) [..., ..., ...]
            in RPY (degree) [..., ..., ...]

Listening and Transforming

Let's now look at how we can apply the transformation in code.

Listening setup.

To listen to the broadcasted transforms we use a TransformListener, which can be created as follows:

self.tfBuffer = tf2_ros.Buffer()
self.listener = tf2_ros.TransformListener(self.tfBuffer)

Since nodes in ROS run in their own processes and their execution order is not enforced, it is possible that the transformation you need is not available when requested, so it is good practice to enclose the lookup call for the transform in a try block as follows.

try:
  # lookup_transform arguments are target_frame, source_frame, and time;
  # rospy.Time() (i.e., time zero) requests the latest available transform
  transform = self.tfBuffer.lookup_transform('world', 'tower', rospy.Time())
except (tf2_ros.LookupException, tf2_ros.ConnectivityException, tf2_ros.ExtrapolationException):
  continue

Applying the Transform

The above code gave us the variable transform. One of the functions provided by the tf2_geometry_msgs package allows us to apply transforms directly to geometry_msgs/PointStamped messages as shown below:

from tf2_geometry_msgs import do_transform_point

new_point = do_transform_point(point, transform)

Where point is the PointStamped whose coordinates you want to transform, and transform is the transform you just looked up.
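Putting the two pieces together, a small sketch (assuming the goal arrived as a geometry_msgs/Vector3 named goal and that transform is the result of the lookup above) might look like this:

from geometry_msgs.msg import PointStamped
from tf2_geometry_msgs import do_transform_point

# Wrap the goal coordinates in a PointStamped so it can be transformed
point = PointStamped()
point.point.x = goal.x
point.point.y = goal.y
point.point.z = goal.z

# Apply the looked-up transform to obtain the point in the target frame
new_point = do_transform_point(point, transform)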

Now let's link the Tower Goals to the Planner

Let's now create a node that subscribes to /tower/goal (in the tower frame) and publishes a goal to our drone on the topic /uav/input/goal (in the world frame). The node will use the transform declared by the static_transform_publisher to transform the goal across the frames. To do this, complete the following steps:

Step 1: Create the file tower_to_map.py in the simple_control package with the skeleton code given below.

Step 2: Add the node to the launch file (an example entry is sketched after the skeleton code below).

Step 3: Implement the TODOs in the following code.

#!/usr/bin/env python
import rospy
import tf2_ros
import time
from geometry_msgs.msg import Vector3
from geometry_msgs.msg import PointStamped
from tf2_geometry_msgs import do_transform_point

class TowerToMap:

  def __init__(self):
    time.sleep(10)
    # Used by the callback for the topic /tower/goal
    self.goal = None
    # TODO: Instantiate the Buffer and TransformListener

    # TODO: Goal publisher on topic /uav/input/goal
    
    # TODO: Tower goal subscriber to topic /tower/goal
    

    # start main loop
    self.mainloop()

  #TODO: Callback for the tower goal subscriber 


  def mainloop(self):
    # Set the rate of this loop
    rate = rospy.Rate(2)
    while not rospy.is_shutdown():
      if self.goal:
        
        try: 
          #TODO: Lookup the tower to world transform

          #TODO: Convert the goal to a PointStamped

          #TODO: Use the do_transform_point function to convert the point using the transform

          #TODO: Convert the point back into a vector message containing integers

          #TODO: Publish the vector
          rospy.loginfo(str(rospy.get_name()) + ": Publishing Transformed Goal {}".format([msg.x, msg.y]))

          # The tower will automatically send you a new goal once the drone reaches the requested position.
          #TODO: Reset the goal

        except (tf2_ros.LookupException, tf2_ros.ConnectivityException, tf2_ros.ExtrapolationException):
          continue
        
      rate.sleep()


if __name__ == '__main__':
  rospy.init_node('tower_to_map')
  try:
    tom = TowerToMap()
  except rospy.ROSInterruptException:
    pass
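For Step 2, a launch file entry similar to the earlier ones should work; a sketch (the exact attributes are up to you) might be:

<node name="tower_to_map" pkg="simple_control" type="tower_to_map.py" output="screen"/>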

After you have implemented this node, you are done. The tower node will publish a new goal after your drone has reached its current goal. Thus your drone should now successfully navigate the map, getting a new goal each time it reaches its current one.

Final Checkpoint

1. Explain how you calculated the values of args for the transform

2. Run the simulator and show that your drone is capable of reaching 3 different goals sent by the tower.

Your final run should look similar to the example shown below. Remember that the tower publishes random positions, so your positions will not be the same as in the example.

Congratulations, you are done with lab 8!