Low-Cost Fiducial-Based 6-axis Force-Torque Sensor

Authors: Rui Ouyang, Robert Howe

ICRA 2020 Submission (Accepted, Published)

Abstract

Commercial 6-axis force-torque sensors tend to be some combination of expensive, fragile, and hard to use. We propose a new fiducial-based design that addresses all three of these shortcomings. The sensor uses an inexpensive webcam and can be fabricated on a hobbyist-grade 3D printer. Designed to fit in a robotic finger, the sensor is very light and can be dropped or thrown with little concern. Furthermore, the sensor is easy to use, and can even be run in the browser without any software installation.

> Paper Link <

Supplementary Video can be found below.

What is it?

Fig 1. Consumer webcams and printed fiducials can be used to create a six-axis force-torque sensor. We used four springs to build a platform free to move in all angular directions. We affixed two printed fiducial markers to the platform, then aimed a consumer camera up at the markers. To the right, a different view of the sensor reveals the tag location. The tags are glued to the light shield, which is removable; this allows for easy design changes. (Green bottle cap for size comparison only.)

Link to full-size video.

Fig 2. Video clip demonstrating the Python-based sensor interface. On the left side of the screen, scrolling columns show displacement data from all six axes. On the right side, the top shows the webcam image, and the bottom shows a graph of a single axis (y-axis displacement).
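
For a sense of how such an interface works, below is a minimal sketch of a webcam-to-displacement loop. This is not the released code: the tag dictionary, tag size, camera intrinsics, and the classic (pre-4.7) cv2.aruco API from opencv-contrib-python are all assumptions for illustration.

```python
# Minimal sketch of a fiducial-displacement reader.
# Assumptions: opencv-contrib-python with the classic (pre-4.7) cv2.aruco API,
# a roughly calibrated camera, and a single 4x4 ArUco tag on the platform.
import cv2
import numpy as np

TAG_SIZE_M = 0.01                             # hypothetical tag side length (m)
camera_matrix = np.array([[600., 0., 320.],   # hypothetical intrinsics
                          [0., 600., 240.],
                          [0., 0., 1.]])
dist_coeffs = np.zeros(5)                     # assume negligible distortion

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()

cap = cv2.VideoCapture(0)
rest_pose = None  # (rvec, tvec) captured with no load applied

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=params)
    if ids is None:
        continue
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, TAG_SIZE_M, camera_matrix, dist_coeffs)
    rvec, tvec = rvecs[0].ravel(), tvecs[0].ravel()
    if rest_pose is None:
        rest_pose = (rvec, tvec)  # first detection defines the unloaded pose
        continue
    # 6-axis displacement: translation plus rotation relative to the rest pose
    d_trans = tvec - rest_pose[1]
    d_rot = rvec - rest_pose[0]   # small-angle approximation
    print("dx dy dz:", d_trans, "| rx ry rz:", d_rot)
```

Multiplying such a displacement vector by a calibration matrix (fit against a reference sensor; see the Notes section at the bottom of this page) then yields force/torque estimates.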

Conference Presentation (10 mins)

Video embedded below, or at https://www.youtube.com/watch?v=6dAHd69pE70

A bit rough due to some microphone issues -- sorry!

Slides - to the right below, or at this link



Website change log

  • 26 May 2020: Uploaded presentation video for the virtual ICRA 2020 conference (thanks to the COVID-19 pandemic)

  • 6 March 2020: Uploaded revised PDF

  • 21 Jan 2020: 🎉🎉🎉 Accepted to ICRA (pending attendance). Added paper link to this website; revisions, as per reviews, to be done by March 3rd.

  • 7 Jan 2020: Further improved supplementary video (mostly added more captions and fixed some transition errors). Also improved the website (added a nice new gfycat).

  • 3 Dec 2019: Uploaded the SolidWorks and exported STL files for the "bigger spring" version created for the video. Uploaded the exact data file and the (now commented) code used to collect the data and create the plots for the paper.

  • 15 Oct 2019: Uploaded an improved supplementary video (better than the one submitted with the paper)

Supplementary Video (2 mins, covered in full in the conference video)

This video complements the paper's results by demonstrating the design process (the paper focuses on sensor fidelity). For full screen, see the YouTube link.

7 Jan 2020. This video demonstrates how easy it is to adapt this sensor design to new applications.

  • Specifically: I designed a new sensor, fabricated it from scratch, and programmed a new application for it in less than 9 hours (about a full workday), all while simultaneously recording video for the paper. Midway through, I also showcase the "just open a website to get sensor readings" idea.

  • Note that because, for the video, I used an uncalibrated low-cost Monoprice printer at draft layer height and 1.5x speed settings, some of the features didn't print correctly and the sensor is missing bolts -- but it still works (it can still produce relative force/torque readings).

  • If I have time, I will re-upload the video to include the sensor reading for the high-five at the end, demonstrating that the sensor can be used for more than just binary "limit switch" behavior -- I just ran out of time.

Files

See this dropbox folder for all files.

Notes: Hardware design files

  • The Dec 2019 zip contains the edits I made to create the sensor shown in the video. It does not contain an assembly file, since I edited the individual part files directly.

    • My main revisions: removed the mounting tabs on the base (used to mount the sensor to the Optoforce) so the UR5 can grip the sensor, changed the holes to match the new (wider) springs, and added a 3.3 V battery holder to eliminate the need for external power (hurray)

  • Previous SolidWorks and STL zips are from March 2019 and contain a SolidWorks assembly file



Notes: Code

  • Zip includes:

    • data files used for the figures in the paper,

    • the Python files used to collect data from the ArUco tags and the Optoforce,

    • the Python files used to create the plots,

    • and a rough sketch of a JavaScript interface.


Fig 4. Screenshot from the SolidWorks assembly

Coming soon: Documentation and bill of materials for building your own!

See the video in the meantime for an overview of the build.

(In my opinion, it's not truly open source hardware without these details to make it easy to replicate and modify!)

This is the bill of materials from the paper -- the version in the video also has a coin cell holder (with switch) and a coin cell.

About

For more information, contact nouyang a t g dot harvard dot edu.

Notes

(addressed in revision)

Revisions I know I need to do:

  • A few hours before submission, I realized that my z-axis analysis is wrong and needs a more refined Snell's law analysis (as it turns out, objects get smaller, not larger, when they move further away...). I will need to redo this analysis.

  • Also, I've been told to avoid using "we" and to make the writing more formal. I will need to rewrite accordingly.

  • Additionally, I should not do training and evaluation on the same dataset. I need to investigate how well the calibration matrix generalizes (see the sketch just below), and may need to double-check the calibration of the original Optoforce sensor.
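
For reference, here is a minimal sketch of the held-out-split evaluation described above. The arrays X (tag displacements) and F (reference force/torque readings, e.g. from the Optoforce) are synthetic placeholders, not the released data.

```python
# Minimal sketch: fit a 6x6 calibration matrix C mapping displacements to
# forces/torques, evaluating on a held-out split (avoiding train == test).
# X and F below are synthetic placeholders standing in for real recordings.
import numpy as np

rng = np.random.default_rng(0)
N = 500
X = rng.normal(size=(N, 6))                      # placeholder displacements
C_true = rng.normal(size=(6, 6))                 # placeholder "true" mapping
F = X @ C_true + 0.01 * rng.normal(size=(N, 6))  # placeholder F/T readings

split = int(0.8 * N)
X_train, F_train = X[:split], F[:split]
X_test, F_test = X[split:], F[split:]

# Least-squares fit on the training split only
C, *_ = np.linalg.lstsq(X_train, F_train, rcond=None)

# Evaluate on the held-out split
rmse = np.sqrt(np.mean((X_test @ C - F_test) ** 2))
print("held-out RMSE:", rmse)
```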


Additional notes:

Analyses omitted from the paper due to length could go here -- specifically, measuring the framerate of cheap webcams (a sketch of such a measurement is below).
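
A minimal sketch of that kind of framerate measurement, assuming OpenCV and a webcam at device index 0 (an illustration, not the omitted analysis itself):

```python
# Minimal sketch: estimate a webcam's delivered framerate by timing reads.
# Assumes OpenCV (cv2) and a webcam at device index 0.
import time
import cv2

cap = cv2.VideoCapture(0)
n_frames = 120
grabbed = 0
start = time.perf_counter()
for _ in range(n_frames):
    ok, _ = cap.read()
    if ok:
        grabbed += 1
elapsed = time.perf_counter() - start
cap.release()
print(f"measured framerate: {grabbed / elapsed:.1f} fps")
```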

Also, ideally the tradeoff between sensitivity and maximum force range would be a bit better.


Thanks

I received proofreading and co-working help from many people not listed on the paper! In no particular order:

  • M. Rodriguez, G. Izatt, I. Tolkova, E. Lu, A. Houghstow (stayed up with me until 3 AM!), N. Kirkby, O. Biza, all the csail-related@ folks who answered my questions, and the Helping Hands Lab (for the robot arm in the video).