A team at Tsinghua University has now demonstrated a sensory-control synergy approach that allows robots to learn human-like grasping skills from a small set of human examples. The work, reported in National Science Review, introduces a framework that captures human grasping experience and transfers it directly into robotic hands equipped with tactile sensors.
The researchers first developed a tactile glove fitted with custom sensors at the fingertips to record how humans grasp. Worn on a human hand, the glove collects multimodal tactile data describing contact, slip, and pressure while the wearer grasps and manipulates objects.
Inspired by human neurocognition, the team then devised a strategy to convert these raw measurements into higher-level semantic grasping states such as "stable," "slightly unstable," or "highly unstable." This encoding step filters out uninformative variations caused by differences in grasp position or hand posture between individual trials, making the representation more universal across different objects and users.
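As a concrete illustration, the sketch below shows one way raw tactile readings could be collapsed into such coarse semantic states. The feature names (normal_force, slip_rate), thresholds, and labels are hypothetical assumptions chosen for illustration, not values reported in the paper.

```python
# Hypothetical sketch: mapping raw tactile readings to semantic grasp states.
# Feature names and thresholds are illustrative assumptions, not paper values.

from dataclasses import dataclass

@dataclass
class TactileFrame:
    normal_force: float   # fingertip pressure, arbitrary units
    slip_rate: float      # magnitude of detected micro-slip per frame

def encode_grasp_state(frame: TactileFrame) -> str:
    """Collapse raw tactile measurements into a coarse semantic state.

    The coarse labels discard trial-specific details such as exact contact
    position, which is what makes the representation transferable.
    """
    if frame.slip_rate > 0.5 or frame.normal_force < 0.1:
        return "highly_unstable"
    if frame.slip_rate > 0.1:
        return "slightly_unstable"
    return "stable"

# Example: a frame with moderate slip is labelled slightly unstable.
print(encode_grasp_state(TactileFrame(normal_force=1.2, slip_rate=0.2)))
```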
Instead of training robots on large datasets of precise tactile readings for each object, the system teaches robots to recognize and respond to these generalized interaction states. According to corresponding author Prof Rong Zhu, this focus on state-level understanding makes the method data-efficient and highly transferable between tasks and objects.
To turn state recognition into action, the researchers implemented a fuzzy-logic controller that mimics human decision-making experience. When the tactile system detects a highly unstable state, the controller increases grip force, while a stable state leads the robot to maintain its current grasp without over-tightening.
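The article does not include the controller itself, but a fuzzy rule base of this kind can be sketched in a few lines. In the hypothetical example below, the triangular membership functions, rule consequents, and grip-force increments are illustrative assumptions only.

```python
# Hypothetical sketch of a fuzzy-logic grip controller in the spirit described
# above: rules map an instability score to a grip-force adjustment. Membership
# shapes, rule outputs, and force increments are illustrative assumptions.

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def grip_force_delta(instability: float) -> float:
    """Defuzzify three rules into a single grip-force change (newtons).

    instability is a normalized score in [0, 1] derived from tactile sensing.
    """
    # Rule firing strengths for each linguistic state.
    mu_stable = tri(instability, -0.5, 0.0, 0.4)
    mu_slight = tri(instability, 0.2, 0.5, 0.8)
    mu_high = tri(instability, 0.6, 1.0, 1.5)

    # Rule consequents: hold force, increase a little, increase a lot.
    deltas = (0.0, 0.5, 2.0)
    weights = (mu_stable, mu_slight, mu_high)

    total = sum(weights)
    if total == 0.0:
        return 0.0
    # Weighted-average (centroid-like) defuzzification.
    return sum(w * d for w, d in zip(weights, deltas)) / total

# A highly unstable reading yields a large force increase; a stable one, none.
print(grip_force_delta(0.9))   # ~2.0 N
print(grip_force_delta(0.05))  # 0.0 N
```

A weighted-average defuzzification like this produces smooth force adjustments as the instability score varies, rather than abrupt jumps between discrete responses.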
Once the human sensory-control synergy was built, the team transferred it into a robotic hand outfitted with its own tactile sensors. With this transferred framework, the robot demonstrated adaptive grasping of diverse objects, including a slippery umbrella, a fragile raw egg, and a heavy bottle, and generalized well to unfamiliar items not included in the initial training.
The human-inspired framework can be established efficiently using small datasets collected from a single person, avoiding the need for extensive multi-user or multi-object training campaigns. Experimental tests showed that a robot equipped with this transferred synergy achieved an average grasping success rate of 95.2 percent when handling slippery, fragile, soft, and heavy objects.
In dynamic experiments, the robot sensed external disturbances such as pulling forces and responded by autonomously increasing its grip to prevent slips. These results underscore the potential of tactile-based control to improve robot robustness when interacting with uncertain or changing environments.
The team also demonstrated a real-world application in which a robot completed the multi-step task of hand-brewing coffee. From locating items and scooping coffee powder to stirring and serving, the robot relied on its tactile feedback and sensory-control logic to manage uncertainties at each step.
Prof Zhu explained that the robots in this study learn universal grasping by understanding the sensory and control logic underlying tactile data rather than by copying detailed human motion trajectories. This emphasis on underlying principles effectively gives robots the ability to generalize from a single example to new situations.
According to the researchers, this human-like sensory-control synergy offers a promising route to robots that can handle a wide range of objects with minimal retraining. By making it possible for robots to absorb and reuse human experience, the approach could accelerate the deployment of intelligent robotic systems in real-world service, industrial, and domestic scenarios.
The work was carried out by the research group led by Prof Rong Zhu at the State Key Laboratory of Precision Measurement Technology and Instrument, Department of Precision Instrument, Tsinghua University. The study highlights how combining bio-inspired sensing strategies with experience-based control can narrow the gap between human and robotic dexterity.
Research Report: Human-Taught Sensory-Control Synergy for Universal Robotic Grasping
Related Links
Tsinghua University