Tac-VGNN: A Voronoi Graph Neural Network
for Pose-Based Tactile Servoing
Wen Fan, Max Yang, Yifan Xing, Nathan Lepora, Dandan Zhang
Tactile pose estimation and tactile servoing are fundamental capabilities of robot touch. Reliable and precise pose estimation can be achieved by applying deep learning models to high-resolution optical tactile sensors. Given the recent successes of Graph Neural Networks (GNNs) and the effectiveness of Voronoi features, we developed a Tactile Voronoi Graph Neural Network (Tac-VGNN) to achieve reliable pose-based tactile servoing with a biomimetic optical tactile sensor (TacTip). The GNN is well suited to modelling the distribution of shear motions of the tactile markers, while the Voronoi diagram supplements this with area-based tactile features related to contact depth. The experimental results show that the Tac-VGNN model enhances data interpretability during graph generation and improves training efficiency significantly over CNN-based methods. The proposed Tac-VGNN improved pose estimation accuracy along the vertical depth by 28.57% over a vanilla GNN without Voronoi features, and also achieved better performance on real surface-following tasks with smoother robot control trajectories.
A brief introduction video for the Tac-VGNN project, including the experiment demo and results.
Tactile perception is needed for robots to understand the objects they manipulate and the environment they interact with. Similar to visual servoing, where a robot controls the pose of a camera relative to features of the object image, tactile servoing changes the pose of a tactile sensor in physical contact with an object based on touch information. We used a tactile robotic system comprising a low-cost desktop robot arm (Dobot MG400) and a tactile sensor (TacTip) for surface following, which can be considered a tactile servoing task. During tactile servoing, the contact between the tactile sensor and the object changes continuously, allowing the surface of an unknown target to be explored.
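To make the servoing idea concrete, the sketch below shows one possible shape of such a control loop. The functions read_tactile_frame, predict_pose and move_sensor_by are hypothetical placeholders (not the TacTip or Dobot APIs), and the correction logic is an assumption for illustration rather than the exact controller used in the project.

```python
# A hedged sketch of a pose-based tactile servoing step for surface following.
# All three helper functions are hypothetical placeholders for the sensor
# interface, the trained pose model, and the robot motion command.
def surface_following_step(target_depth_mm: float = 1.0):
    frame = read_tactile_frame()              # current tactile image / marker positions
    y_mm, roll_deg = predict_pose(frame)      # estimated contact depth and surface angle
    # Servo: correct the depth towards the target and align the sensor with the
    # surface, while a small tangential step advances along the unknown surface.
    move_sensor_by(dy=target_depth_mm - y_mm,
                   droll=-roll_deg,
                   tangential_step=1.0)
```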
If the TacTip pins are regarded as vertices, the location of each pin and the relative distance between any two pins vary with the skin deformation. This principle closely matches the definition of nodes and edges in a graph, a non-linear data structure. Delaunay triangulation was used to construct graphs from the tactile images, and Voronoi diagram features were then added to enrich the tactile graph data, since the cells generated by the Voronoi tessellation provide a surrogate for depth information on TacTip sensors.
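As an illustration of this construction, the sketch below builds a tactile graph from detected pin centres, using SciPy's Delaunay triangulation for the edges and the Voronoi tessellation areas as node features. The function name, feature layout, and handling of open border cells are assumptions, not the released implementation.

```python
# Minimal sketch: tactile graph construction from (N, 2) pin-centre coordinates.
import numpy as np
from scipy.spatial import Delaunay, Voronoi, ConvexHull


def build_voronoi_graph(pin_xy: np.ndarray):
    """Return (node_features, edge_index) for an (N, 2) array of pin centres."""
    # Edges: every edge of every Delaunay triangle connects two neighbouring pins.
    tri = Delaunay(pin_xy)
    edges = set()
    for a, b, c in tri.simplices:
        edges.update({(a, b), (b, a), (b, c), (c, b), (a, c), (c, a)})
    edge_index = np.array(sorted(edges)).T  # shape (2, num_edges)

    # Node features: pin position plus the area of its Voronoi cell,
    # which serves as a surrogate for local contact depth.
    vor = Voronoi(pin_xy)
    areas = np.zeros(len(pin_xy))
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if len(region) == 0 or -1 in region:
            continue  # open cell at the sensor border; leave its area as 0 here
        areas[i] = ConvexHull(vor.vertices[region]).volume  # 2D hull "volume" is the area

    node_features = np.column_stack([pin_xy, areas])  # (N, 3): x, y, cell area
    return node_features, edge_index
```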
The 3D heat map of the Voronoi graph data represents the tessellation area distribution and the contact information. In the non-contact case, there is a natural reduction in pin density from the centre to the edge of the sensor, seen as a gradual change in colour. When deformation occurs, the pin density at the contact centre decreases. The difference between the two Voronoi graphs shows exactly where the contact centre lies and how deep the deformation is, which provides valuable input for servoing pose predictions.
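The snippet below shows one way such a heat map could be produced, assuming the build_voronoi_graph helper from the previous sketch and matplotlib's 3D triangulated surface plot. It is an illustration of the area-difference idea, not the figure-generation code from the paper.

```python
# Visualise the change in Voronoi cell areas between a non-contact reference
# frame and a contact frame, as a surrogate for contact location and depth.
import matplotlib.pyplot as plt


def plot_area_difference(pin_xy_ref, pin_xy_contact):
    feats_ref, _ = build_voronoi_graph(pin_xy_ref)
    feats_con, _ = build_voronoi_graph(pin_xy_contact)
    area_diff = feats_con[:, 2] - feats_ref[:, 2]  # per-pin change in cell area

    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    # Height and colour encode the change in tessellation area at each pin.
    ax.plot_trisurf(pin_xy_contact[:, 0], pin_xy_contact[:, 1], area_diff, cmap="viridis")
    ax.set_xlabel("x (pixels)")
    ax.set_ylabel("y (pixels)")
    ax.set_zlabel("Voronoi area change")
    plt.show()
```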
A Graph Convolutional Network (GCN) is used for feature extraction, with the Voronoi graph G(X, E) as its input. The final output consists of the predicted vertical translation along the Y axis and the Roll rotation angle about the Z axis. It outperforms traditional CNNs and GNNs, with better training efficiency, higher pose estimation accuracy and smoother servoing performance.
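A minimal sketch of such a GCN regressor is given below, assuming PyTorch Geometric and the three-dimensional node features (x, y, Voronoi cell area) from the earlier sketch. The layer sizes, pooling choice, and class name TacVGNNSketch are illustrative, not the paper's exact architecture.

```python
# Sketch of a two-output GCN regressor (translation Y, rotation Roll).
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool


class TacVGNNSketch(torch.nn.Module):
    def __init__(self, in_dim: int = 3, hidden: int = 64):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.head = torch.nn.Linear(hidden, 2)  # outputs: Y translation, Roll angle

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        x = global_mean_pool(x, batch)  # one feature vector per tactile graph
        return self.head(x)             # (batch_size, 2) pose prediction
```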
Five experiments were conducted using five types of objects for the surface-following task. To quantify the tactile servoing performance, we defined an evaluation metric called smoothness: the smaller its value, the smoother the trajectory. In our experiments, the Tac-VGNN performed very well, with fewer fluctuations and more accurate pose estimates than the other models.
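The exact definition of the smoothness metric is not given here; the snippet below shows one plausible formulation (mean squared second difference of the sampled end-effector trajectory), purely to illustrate that smaller values correspond to smoother motion.

```python
# One possible smoothness measure for a sampled trajectory (an assumption,
# not necessarily the metric used in the paper).
import numpy as np


def trajectory_smoothness(points: np.ndarray) -> float:
    """points: (T, D) array of end-effector positions sampled over time."""
    second_diff = np.diff(points, n=2, axis=0)           # discrete acceleration
    return float(np.mean(np.sum(second_diff ** 2, axis=1)))
```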
Contact fanwen2021@gmail.com for more information about the project.