Abstract
Tactile sensing is critical for enhancing manipulation precision and versatility, but its adoption in robotic hands remains limited due to high sensor costs, manufacturing and integration challenges, and difficulties in extracting expressive and reliable contact information.
In this work, we present a low-cost, easy-to-make, adaptable, and compact fingertip design for robotic hands that integrates multi-modal tactile sensors. We use strain gauge sensors to capture static forces and a contact microphone sensor to measure high-frequency vibrations during contact. These tactile sensors are integrated into a compact design with a minimal sensor footprint. From sensor characterization, we show that the strain gauge sensors provide repeatable 3D force measurements in the 0–5 N range and that the contact microphone sensor captures distinct vibrotactile signatures associated with different material types. We evaluate our design on three types of manipulation tasks: slip-aware closed-loop grasping to handle a wide range of objects, cup counting and unstacking with precise contact localization, and box selection and opening with hidden material identification. Given the expressiveness and reliability of the tactile sensor readings, we show that the different tactile sensing modalities can be used flexibly in manipulation, individually or in combination, to improve task performance.
Fingertip Design
The design CAD files can be found here.
The fingertip structure includes a rigid bone and a fingernail, and is covered by a soft skin, mimicking the mechanical structure of the human fingertip. We leverage strain gauge sensors to emulate the role of slow-adapting (SA) mechanoreceptors, which are responsible for detecting pressure, and a contact microphone sensor to replicate the function of fast-adapting (FA) mechanoreceptors, picking up contact vibrations to infer dynamic contact events such as slip.
Fingertip Fabrication
The fabrication procedure and final prototype of the fingertip. (a) Fingertip fabrication: 1. Soldering the connector onto the PCB; 2. attaching the contact microphone sensor to the top of the 3D-printed cap with super glue; 3. gluing strain gauge sensors on the four faces of the square prism, which is 3D-printed together with the fingertip base; 4. mounting the PCB with M2 screws beneath the fingertip base; 5. gluing the 3D-printed cap on top of the base; 6. applying a layer of foam on the cap; 7. attaching a layer of grip tape as the finger skin. (b) A fully assembled fingertip prototype.
Readout Circuit
Readout circuit of the fingertip: the contact microphone sensor signals and the strain gauge sensor signals are collected through the fingertip PCB and carried by an FFC cable to another custom PCB. The strain gauge sensor measurements are amplified and digitized with an HX711 module connected to an Arduino Uno, while the vibrotactile signals are pre-amplified and digitized by a modified Maono USB sound card. Both are transmitted to the PC via USB.
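On the PC side, the strain gauge path reduces to parsing a serial stream of HX711 counts and mapping them to a 3D force. The sketch below is a minimal, hypothetical version of that step: the line format (four comma-separated counts), the zero offsets, and the calibration matrix are all illustrative assumptions, not the values used in the paper.

```python
# Hypothetical PC-side readout: the Arduino streams four HX711 counts per
# line ("c0,c1,c2,c3"); a calibration matrix maps the four gauge channels
# to a 3D force estimate. Offsets and calibration values are illustrative.

ZERO_OFFSET = [8_400_000, 8_390_000, 8_410_000, 8_405_000]  # raw counts at rest
# 3x4 calibration matrix (N per count), e.g. fit by least squares against
# a ground-truth force/torque sensor
CALIB = [
    [1.0e-6, -1.0e-6, 0.0,     0.0],      # Fx from opposing x-face gauges
    [0.0,     0.0,    1.0e-6, -1.0e-6],   # Fy from opposing y-face gauges
    [2.5e-7,  2.5e-7, 2.5e-7,  2.5e-7],   # Fz from the common-mode strain
]

def parse_line(line: str) -> list[int]:
    """Parse one serial line of four comma-separated HX711 counts."""
    return [int(tok) for tok in line.strip().split(",")]

def counts_to_force(counts: list[int]) -> list[float]:
    """Map zero-offset-corrected counts to [Fx, Fy, Fz] in newtons."""
    delta = [c - z for c, z in zip(counts, ZERO_OFFSET)]
    return [sum(g * d for g, d in zip(row, delta)) for row in CALIB]
```

In practice the calibration matrix would be fit against the force/torque sensor data collected during characterization; the structure of the pipeline stays the same.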
Bill of Materials (per fingertip)
For each fingertip, we list the bill of materials below; the total cost is 90.34 USD.
Per-item costs (USD): 14.99, 24.29, 46.00, 3.44, 0.61, 0.61, 0.08, 0.32, plus several near-zero items.
Sensor Characterization
Tactile sensor characterization setup and results: (a) The strain gauge sensor characterization setup. A human randomly interacts with the fingertip while we record the strain gauge sensor measurements and ground-truth force readings from a force/torque sensor. (b) The contact microphone sensor characterization setup. We use a custom end-effector with samples of seven materials attached to different faces. The UR5e arm is controlled to initiate sliding contacts between the fingernail and a sample to generate vibrotactile signals. (c) Comparison between predicted and ground-truth 3D forces. (d) Material classification confusion matrix based on vibrotactile measurements. (e) The contact microphone's sensitivity to noise from the Delta finger's linear actuator.
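The material classification in (d) can be illustrated with a small feature-plus-classifier sketch: extract a spectral feature vector from each sliding-contact recording and assign it to the nearest class centroid. The feature choices, sample rate, and classifier are assumptions for illustration, not the paper's exact pipeline.

```python
# Illustrative vibrotactile material classification: a small spectral
# feature vector per recording, classified by nearest centroid.
import numpy as np

FS = 44_100  # sound-card sample rate (Hz), assumed

def vibro_features(signal: np.ndarray) -> np.ndarray:
    """RMS energy plus spectral centroid of a vibrotactile snippet."""
    x = signal - signal.mean()
    rms = np.sqrt(np.mean(x ** 2))
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)
    centroid = float((freqs * spec).sum() / (spec.sum() + 1e-12))
    return np.array([rms, centroid])

class NearestCentroid:
    """Minimal nearest-centroid classifier over per-material features."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {c: np.mean([x for x, l in zip(X, y) if l == c], axis=0)
                           for c in self.labels_}
        return self

    def predict(self, x):
        return min(self.labels_, key=lambda c: np.linalg.norm(x - self.centroids_[c]))
```

A richer feature set (e.g. band energies or spectrogram statistics) and a learned classifier would replace these two features for seven-way material recognition, but the train/predict structure is the same.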
Experiments
Overview
We evaluated the proposed tactile fingertips across three manipulation tasks: i) pinching fragile objects with fingertip force control, ii) counting and unstacking paper cups, and iii) detecting the material of hidden objects through shaking to guide subsequent manipulation. The first task examines whether fingertip tactile signals are reliable enough to serve directly as feedback for closing the control loop. The second investigates whether tactile sensing can serve as an alternative to vision under occlusion to improve manipulation performance. The third evaluates the sensitivity and robustness of tactile sensing for material recognition and its effectiveness in informing downstream manipulation.
Adaptive grasping with slip detection
(a) Object set used for the adaptive grasping task. We select objects with properties that make grasping challenging, such as deformable, fragile, and irregularly shaped objects. (b) We demonstrate successful grasps of these objects by using force control to initiate the grasp and vibrotactile signals to detect slip and adjust the grasp strength. The minimal normal forces labeled in the figure are applied to lift the objects without damage.
Evaluated across a test set of 10 objects with different shapes, weights, and stiffness, the overall success rate is 96%.
Cup counting and unstacking
Cup localization and unstacking. (a) The hand starts by aligning with the cup stack; (b) A finger extends vertically; (c) It steps in horizontally to initiate contact until a force threshold is reached; (d) The finger resets above the cup stack; (e) It slides vertically along the stack, passing over the cup edges; (f) Detect cup edges from vibrotactile features, move a pair of fingers to the height of the first cup edge, and close until a 0.5 N force threshold is reached; (g) Separate and lift the top cup. (h) The step-in distance and the force readings are used to localize the cup edges. (i) Raw vibrotactile signals are binarized and synchronized with the finger positions to detect contacts with the cup edges. (j) Visualization of transparent cups and paper cups. (k) Evaluation results for the tactile-based and vision-based methods.
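The edge-detection step in (i) reduces to binarizing the vibro signal and grouping consecutive above-threshold samples into contact segments, each tagged with its mean finger height. The sketch below is an illustrative version of that logic; the threshold value and data format are assumptions.

```python
# Sketch of cup-edge detection: binarize the vibro signal against a
# threshold, group contiguous contact samples into segments, and report
# each segment's mean finger height as one cup edge. Threshold is assumed.

def detect_cup_edges(heights, vibro, thresh=0.1):
    """Return one height per contiguous above-threshold vibro segment."""
    edges, segment = [], []
    for h, v in zip(heights, vibro):
        if abs(v) > thresh:
            segment.append(h)       # finger is rattling over an edge
        elif segment:
            edges.append(sum(segment) / len(segment))
            segment = []
    if segment:                     # close a segment that runs to the end
        edges.append(sum(segment) / len(segment))
    return edges
```

The number of returned edges gives the cup count, and the first edge height is where the finger pair closes to grasp the top cup.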
Unstacking a transparent cup
Unstacking a paper cup
Hidden object detection through shaking
Box selection and opening with occluded in-box object material identification. Setup: (a) Four visually identical boxes contain many screws, few screws, rubber bands, and a mixture of screws and rubber bands, representing contents with different stiffness and weight. Pipeline: (b) The robot grasps each box and records the finger z-axis force Fz_pinch. (c) The robot then lifts the box, holds it for 2 s, and records Fz_lifted. (d) The robot shakes the box while recording vibrotactile signals. (e) Based on the predicted material class, the robot selects the target box and opens it. Results: (f) Confusion matrices for force-only, vibrotactile-only, and fused-input classification. (g) The classifier uses the shaking vibrotactile signals and the force difference before and after lifting to predict the hidden material class.
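The intuition behind the fusion in (g) is that the lift-induced force change approximates content weight while shaking vibrations reflect content stiffness, and only together do they separate all four classes. A toy prototype-based version of this fused classifier is sketched below; the 2D prototypes and thresholds are invented for illustration, not values measured in the paper.

```python
# Illustrative fused classifier: each box class gets a 2D prototype of
# (lift force change in N, shaking vibro energy); a measurement is
# assigned to the nearest prototype. Prototype values are invented.

PROTOTYPES = {
    "many screws":  (2.0, 0.8),   # heavy, strong rattle
    "few screws":   (0.3, 0.6),   # light, strong rattle
    "rubber bands": (0.4, 0.05),  # light, damped vibrations
    "mixture":      (1.2, 0.4),   # intermediate weight and rattle
}

def classify_contents(delta_fz: float, vibro_energy: float) -> str:
    """Assign (delta_fz, vibro_energy) to the nearest class prototype."""
    def dist(proto):
        dfz, ve = proto
        return (delta_fz - dfz) ** 2 + (vibro_energy - ve) ** 2
    return min(PROTOTYPES, key=lambda c: dist(PROTOTYPES[c]))
```

A force-only classifier collapses classes of similar weight and a vibro-only classifier collapses classes of similar stiffness, which is why the fused confusion matrix in (f) is expected to outperform either single modality.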