Learning to estimate the proximity of slip using high-resolution tactile sensing
Abstract
Tactile sensing provides crucial information about the stability of an object grasped by a robotic gripper. Tactile feedback can be used to predict slip, allowing a timely response to perturbations and preventing dropped objects. Tactile sensors integrated into robotic grippers measure the vibrations, strain, or shear forces produced by movement of the grasped object. With sufficient spatial resolution, tactile sensors can even classify slip or estimate the 3D force-displacement field. However, current tactile sensors only detect slippage once it has begun, demanding fast reaction times in real-time control applications. Here we show a perception framework that can predict slippage before it occurs by estimating the frictional safety margin. The safety margin indicates how far a grasp is from its frictional strength; it decreases when friction is reduced or the load force increases. An accurate safety margin estimate allows for more efficient robot grip force control while providing robustness against object uncertainty and varying frictional conditions. We developed a high-resolution tactile sensor, on which we trained a convolutional neural network to learn the relationship between tactile images and the safety margin. The network's performance is evaluated on unseen test data, showing robustness to variations in environmental conditions. The results demonstrate that the tactile images contain the information needed to produce accurate safety margin estimates. These estimates can be used for grip force control to within 20% of the minimum required grip force, mimicking human grasping behavior. This approach can drive new grasp control methods and enable robotic grasping of fragile objects in highly dynamic environments. Applications can be found in harvesting, parcel sorting, or improving human-robot interaction.
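The notion of a frictional safety margin can be made concrete with a simple Coulomb friction model. The sketch below is a minimal, illustrative formalization under that assumption; the symbols (friction coefficient mu, normal grip force, tangential load force) and the normalization to [0, 1] are assumptions for illustration, not necessarily the paper's exact definition.

```python
def safety_margin(f_normal: float, f_tangential: float, mu: float) -> float:
    """Illustrative frictional safety margin under Coulomb friction.

    A grasp holds while the tangential load stays inside the friction
    cone, i.e. f_tangential < mu * f_normal. The margin here is the
    remaining fraction of frictional strength: 1.0 means no load,
    0.0 means incipient slip. This is an assumed formalization, not
    necessarily the definition used in the paper.
    """
    frictional_strength = mu * f_normal
    return 1.0 - f_tangential / frictional_strength

# The margin shrinks when friction drops or the load force grows:
print(safety_margin(f_normal=10.0, f_tangential=2.0, mu=0.5))   # 0.6
print(safety_margin(f_normal=10.0, f_tangential=2.0, mu=0.25))  # 0.2
print(safety_margin(f_normal=10.0, f_tangential=4.0, mu=0.5))   # 0.2
```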
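To illustrate the learning step, the following is a minimal sketch of a convolutional network that regresses a scalar safety margin from a tactile image. The architecture, input resolution, output normalization, and loss are assumptions chosen for brevity, not the model described in the paper.

```python
import torch
import torch.nn as nn

class SafetyMarginNet(nn.Module):
    """Minimal CNN mapping a single-channel tactile image to a scalar
    safety margin estimate. Layer sizes are illustrative assumptions."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # pool to one value per channel
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x).flatten(1)
        # Sigmoid keeps the estimate in [0, 1]: 1 = far from slip,
        # 0 = incipient slip (assumed normalization).
        return torch.sigmoid(self.head(z)).squeeze(1)

model = SafetyMarginNet()
tactile_batch = torch.rand(8, 1, 64, 64)            # batch of tactile images
margins = model(tactile_batch)                      # shape: (8,)
loss = nn.functional.mse_loss(margins, torch.rand(8))  # regression loss
```

A regression loss such as MSE fits the task because the safety margin is a continuous quantity rather than a discrete slip/no-slip label, which is what allows preemptive control rather than reactive slip detection.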