Teleoperations with Baxter

Collaborators: Joyce Moon, Joyce Wang, Ritika Srivastava, Corey Zhou, Alex Hong

General Topic Outline: Our group has decided to work on teleoperation with the Baxter robot, for which we will develop a controller to move Baxter. We can use a Leap Motion to capture the movement of the hand, and the robot can then move its arms to mirror that movement. Baxter can automatically adjust its joint angles to place its hand at a given position, so given only the target hand position, it can perform the movement without being told how to move (joint angles, etc.). The Leap Motion gives us an accurate location in 3D space, and if we send that information to the robot, it can process it.


    • Interest in VR/AR
    • Teleoperation is a fun topic
    • Allows us to interact with the robot from a distance (in the future this can be expanded to various fields)


    • We use a Leap Motion to capture the 3D movement of the hand, and the robot moves based on that hand movement. Baxter can automatically adjust its joint angles to place its hand at a given position.


    • To control the Baxter robot more intuitively and from a distance

Shopping List:

  • 1 Leap Motion controller (to capture hand movements)

  • 1 Google Cardboard (serves as a VR headset)

  • 1 dual-lens camera (to see what the robot sees so we can control it from a distance)

  • 1 Kinect (Baxter has 7 degrees of freedom; since the Leap Motion only accounts for 6 of them, the Kinect can be used to explore the 7th)

Project Plan


  1. Everybody learns about Leap Motion and Baxter programming
  2. Hello Baxter tutorial (link)
  3. Program in Python
  4. Leap Motion API overview (link)
  5. General intro video from CS50 (link)
  6. Split into the human team (Leap Motion hand-motion capture; converting the info for Baxter) and the Baxter team (processing Baxter instructions, dual-lens camera work)
  7. Get the Leap Motion
  8. Find machines for Baxter programming
  9. (updates weekly on the website)
  10. Translate coordinate systems from Leap Motion to Baxter
  11. Design a Leap Motion visualization system
  12. Design a set of instructions mapping hand motions to Baxter movements
    1. What do we want Baxter to do?
    2. How does the user control the camera view?
  13. Hook the Leap Motion program up with Baxter
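As a hypothetical example of the hand-motion-to-Baxter mapping in step 12, pinch strength could gate the gripper. This sketch is an assumption for illustration; the threshold and command names are not from our actual implementation:

```python
# Hypothetical mapping: Leap pinch strength (0.0-1.0) -> gripper command.
# The 0.7 threshold is an illustrative choice, not a value from the project.

def gripper_command(pinch_strength, threshold=0.7):
    """Return 'close' when the pinch is strong enough, else 'open'."""
    return "close" if pinch_strength >= threshold else "open"
```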


We started off by getting the Leap Motion to work properly. It took a lot of trial, error, and experimentation before we got it to work.

Virtual Reality Aspect: Camera on Ceiling


Early Attempts


Later Attempts: Baxter is able to recognize motion much better than before.


Final Results: Multiple perspectives


  • The main idea behind executing this project was to translate Leap Motion data into Baxter instructions
  • We used this conversion guide to help us with that process

conversion graph.pdf

Preprocessing raw Leap Motion data

    • Grab data
      • hand location (x, y, z)
      • hand orientation (roll, pitch, yaw)
      • pinch strength
    • Convert the coordinate system (coordinates & orientation)
      • leap (x+, y+, z+) → ROS (y-, z, x-)
    • Communicate with ROS via Tornado websockets
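The axis remapping above (leap x → ROS -y, leap y → ROS z, leap z → ROS -x) can be sketched as follows. The millimeter-to-meter scaling is an assumption based on the Leap SDK reporting positions in millimeters while ROS conventionally uses meters:

```python
# Sketch of the Leap -> ROS coordinate conversion described above.
# Leap: x right, y up, z toward the user (millimeters).
# ROS:  x forward, y left, z up (meters).

def leap_to_ros(x, y, z, scale=0.001):
    """Convert a Leap Motion point to ROS coordinates.

    Applies the mapping leap (x+, y+, z+) -> ROS (y-, z, x-),
    i.e. ros_x = -leap_z, ros_y = -leap_x, ros_z = leap_y,
    plus a mm-to-m scale factor.
    """
    return (-z * scale, -x * scale, y * scale)
```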

Displacement calculation

    • Compare most recent position and orientation data with reference points
    • Calculate and scale the difference (displacement)
    • Determine the next position for Baxter to move to
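A minimal sketch of the displacement step above: compare the latest hand position with a reference point, scale the difference, and apply it to Baxter's reference pose. The scale factor here is illustrative; the actual value we used is not recorded in this writeup:

```python
# Displacement step: next Baxter target = Baxter reference pose plus the
# scaled difference between the current hand position and its reference.

def next_target(current_hand, reference_hand, baxter_reference, scale=0.5):
    """Return the next (x, y, z) position for Baxter to move to."""
    return tuple(
        b + scale * (h - r)
        for h, r, b in zip(current_hand, reference_hand, baxter_reference)
    )
```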

RPI Livestreaming -> Local Computer -> Phone

    • RPi streams 2 live videos (left & right cameras)
    • Computer receives them and serves the HTML through a local server
      • CSS to place the two live videos side by side
    • Phone placed into a Google Cardboard
    • Tweaked the camera resolution to reduce live-streaming lag
    • Placed the “eyes” up on the ceiling for a third-person view
      • Narrow visual angle
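The "serve the two streams side by side" step above could be sketched with only the standard library, roughly as below. The stream URLs are placeholders for wherever the RPi publishes its left/right camera feeds; the real hostnames and ports are not in this writeup:

```python
# Minimal local server that lays the two camera feeds side by side with
# flexbox CSS, for viewing on a phone in Google Cardboard.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder stream URLs -- substitute the RPi's actual feed addresses.
PAGE = b"""<!doctype html>
<html><body style="margin:0;display:flex">
  <img src="http://raspberrypi.local:8081/stream" style="width:50%">
  <img src="http://raspberrypi.local:8082/stream" style="width:50%">
</body></html>"""

class SideBySideHandler(BaseHTTPRequestHandler):
    """Serve the side-by-side page on every GET request."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

def run(port=8000):
    # Blocks forever; point the phone's browser at this machine's address.
    HTTPServer(("", port), SideBySideHandler).serve_forever()
```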

Final Thoughts on Project:

We were all exposed to various technologies we had not dealt with before.

Ideas for future work:

  • More accurate motion mirroring in terms of both location and speed
  • Smoother Baxter movement
  • Two 360° cameras to enable a wider view and a more immersive VR experience
  • Leap Motion hand visualization in VR, so that the user is more aware of their hands in the space
  • Integration with the Kinect to account for the 7th degree of freedom

Final Presentation Deck