UVtac: Switchable UV Marker-Based Tactile Sensing Finger for Effective Force Estimation and Object Localization
Vision-based tactile sensors provide diverse information about external tactile stimuli on the sensor skin using both marker and reflective-membrane images. However, when markers and reflective membranes are used concurrently, conventional opaque markers inevitably obstruct the camera's view of the reflective membrane, so simultaneously improving the quality of the tactile information extracted from each visual feature has remained a challenge. In this study, we present a tactile sensing finger, UVtac, that uses switchable ultraviolet (UV) markers to decouple the marker and reflective-membrane images, offering three-axis force estimation and object localization whose performances do not degrade each other. Quantitative evaluation showed that the UVtac improves force estimation by using larger UV markers: with 1.2 mm diameter markers, it achieved a root mean square error of 0.264 N when estimating normal forces up to 10 N and 0.219 N when estimating shear forces up to 5 N under indentation with an 8 × 8 mm² square tooltip. In the object localization experiment, the UVtac achieved a 31% lower root mean square error than a sensor using opaque black markers. Finally, we demonstrated object alignment and contact-force-tracking tasks with the UVtac to emphasize its multifunctionality (a minimal sketch of such a marker-to-force pipeline is given below the citation).
Related paper [2022 RA-L]
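As a rough illustration of how tracked marker displacements can feed force estimation, here is a minimal Python sketch that summarizes a UV-marker displacement field as a small feature vector and maps it to a three-axis force through a least-squares-calibrated linear model. The feature choice, the linear mapping, and all function names are assumptions for illustration; the paper's actual tracker, estimator, and calibration procedure may differ.

```python
# Minimal sketch: UV-marker displacements -> three-axis force via a
# calibrated linear model. Features and mapping are illustrative
# assumptions, not the estimator used in the paper.
import numpy as np

def displacement_features(ref_pts, cur_pts):
    """Summarize an (N, 2) marker displacement field as 3 features:
    mean x/y flow (shear cue) and mean radial spread (normal-load cue,
    since markers spread outward as the elastomer is pressed)."""
    d = cur_pts - ref_pts
    center = ref_pts.mean(axis=0)
    r_ref = np.linalg.norm(ref_pts - center, axis=1)
    r_cur = np.linalg.norm(cur_pts - center, axis=1)
    return np.array([d[:, 0].mean(), d[:, 1].mean(),
                     (r_cur - r_ref).mean()])

def calibrate(feature_rows, measured_forces):
    """Least-squares fit of W so that features @ W ~ (Fx, Fy, Fz),
    using force/torque-sensor readings collected during indentation."""
    X = np.asarray(feature_rows)        # (M, 3) feature vectors
    F = np.asarray(measured_forces)     # (M, 3) ground-truth forces
    W, *_ = np.linalg.lstsq(X, F, rcond=None)
    return W

def estimate_force(W, ref_pts, cur_pts):
    """Predict (Fx, Fy, Fz) from the current marker positions."""
    return displacement_features(ref_pts, cur_pts) @ W
```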
Tactile Event Based Grasping Algorithm using Memorized Triggers and Mechanoreceptive Sensors
Humans perform grasping by breaking the task into a series of action phases, where transitions between phases are based on comparing predicted tactile events with actual tactile events. This reliance on tactile sensation lets humans grasp objects without locating them precisely, a feature desirable in robot grasping for succeeding when there is uncertainty in localizing the target object. In this paper, we propose a tactile-event-based grasping algorithm that uses memorized predicted tactile events as state-transition triggers, inspired by human grasping. First, a simulated robotic manipulator with pressure and vibration sensors mounted on each finger, analogous to the different mechanoreceptors in humans, performed ideal grasping tasks, from which the tactile signals between consecutive states were extracted; these signals were processed and stored as predicted tactile events. Second, a grasping algorithm composed of eight discrete states, Reach, Re-Reach, Load, Lift, Hold, Avoid, Place, and Unload, was built. A transition between consecutive states is triggered when the actual tactile events match the predicted tactile events; otherwise, corrective actions are triggered (a minimal sketch of this state machine is given below the citation). Our algorithm was implemented on a real robot equipped with capacitive and piezoelectric transducers on the fingertips. Lastly, grasping experiments were conducted in which the target objects were deliberately misplaced from their expected positions, to investigate the robustness of the tactile-event-based grasping algorithm to object localization errors.
Related paper [IROS 2020]
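To make the state-transition logic concrete, below is a minimal Python sketch of a grasping state machine whose transitions fire only when the sensed tactile event matches a memorized predicted event. The nominal phase ordering, the treatment of Re-Reach and Avoid as corrective branches, the event representation, and the threshold-based match test are all illustrative assumptions; the paper derives its predicted events from simulated ideal grasps.

```python
# Minimal sketch of tactile-event-triggered grasping. The phase sequence,
# corrective mapping, event format, and match test are assumptions for
# illustration, not the paper's exact implementation.
from dataclasses import dataclass
import numpy as np

NOMINAL = ["Reach", "Load", "Lift", "Hold", "Place", "Unload"]
CORRECTIVE = {"Reach": "Re-Reach", "Lift": "Avoid"}  # assumed mapping

@dataclass
class TactileEvent:
    pressure: np.ndarray    # per-finger pressure snapshot (SA-like cue)
    vibration: np.ndarray   # per-finger vibration snapshot (FA-like cue)

def matches(predicted: TactileEvent, actual: TactileEvent, tol=0.1) -> bool:
    """Trigger test: the actual event is close to the memorized one."""
    return (np.allclose(predicted.pressure, actual.pressure, atol=tol) and
            np.allclose(predicted.vibration, actual.vibration, atol=tol))

def run_grasp(predicted, read_event, execute, correct):
    """Advance through the action phases; a transition fires only when
    the sensed event matches the memorized trigger, otherwise the
    corrective action for the current phase runs and the phase retries."""
    for state in NOMINAL:
        execute(state)                           # command the phase motion
        while not matches(predicted[state], read_event()):
            correct(CORRECTIVE.get(state, state))  # e.g. Re-Reach
            execute(state)
    return "Done"
```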
Aided-Grasping in Teleoperation with Multiple Unknown Objects
The interactive potential field proposed in this study adjusts its potential strength in real time in response to the user's intention. When the user's guided path passes near an object, the robot detours around it to prevent an unintended collision. However, when the user places the target point at an object's position, the system infers that the user intends to make contact with that object and weakens the repulsive field, resolving the goal-not-reachable problem. In the figure, when the robot first moves toward the target object, an object lying along the path is treated as an obstacle to avoid; during the subsequent replacing task, the robot hand contacts the object and pushes it away in accordance with the user's intention.
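A minimal Python sketch of this behavior follows, assuming a standard quadratic (Khatib-style) repulsive potential whose contribution is switched off when the user's commanded goal lies on the object. The potential form, the hard on/off switching, and all gains and radii are illustrative stand-ins for the study's real-time field modulation.

```python
# Minimal sketch of an intention-dependent repulsive field. Suppressing
# repulsion when the goal coincides with an object is an illustrative
# stand-in for the study's real-time potential-strength changes.
import numpy as np

def repulsive_force(x, obstacle, goal, eta=1.0, rho0=0.3, goal_tol=0.05):
    """Repulsion from one obstacle. If the user's goal sits on the
    obstacle, repulsion is dropped so intentional contact is allowed
    and the goal-not-reachable problem does not arise."""
    if np.linalg.norm(goal - obstacle) < goal_tol:
        return np.zeros_like(x)          # user intends contact: no detour
    diff = x - obstacle
    rho = np.linalg.norm(diff)
    if rho >= rho0 or rho == 0.0:
        return np.zeros_like(x)          # outside the influence radius
    # negative gradient of U = 0.5 * eta * (1/rho - 1/rho0)**2
    return eta * (1.0 / rho - 1.0 / rho0) * diff / rho**3

def step(x, goal, obstacles, k_att=1.0, dt=0.05):
    """One gradient-descent step on the combined potential field."""
    force = -k_att * (x - goal)          # attractive pull toward the goal
    for obs in obstacles:
        force += repulsive_force(x, obs, goal)
    return x + dt * force
```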