Robotic hand rotates objects using touch, not vision

A team of engineers from the University of California San Diego has developed an innovative approach that enables a robotic hand to rotate objects using only touch, without relying on vision. Inspired by how humans effortlessly handle objects without seeing them, the researchers built a robotic hand with 16 touch sensors attached to its palm and fingers. These low-cost touch sensors provide simple binary signals (touch or no touch) that allow the robotic hand to smoothly rotate various objects, including small toys, cans, fruits, and vegetables, without causing damage.

The new technique contrasts with traditional methods that use a few high-cost, high-resolution touch sensors on a small area of the robotic hand, primarily at the fingertips. Such approaches have limitations, including reduced sensing ability due to the small number of sensors, difficulties in simulating and using high-resolution touch sensors in real-world experiments, and reliance on vision.

The team’s approach demonstrates that detailed texture information about an object is not necessary for the task. Instead, simple binary signals indicating contact with the sensors are sufficient for the robotic hand to perform in-hand rotation effectively. The large coverage of binary touch sensors provides the necessary 3D structure and orientation information about the object, enabling successful rotation without the need for vision.

To train the system, the researchers conducted simulations of a virtual robotic hand rotating diverse objects with irregular shapes. The system assessed which sensors on the hand were being touched by the object during rotation, as well as the current positions and actions of the hand’s joints. Based on this information, the system instructed the robotic hand on the appropriate joint movements.
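The control loop described above can be sketched as a policy that maps binary touch readings and joint state to joint-movement commands. This is a minimal illustration only: the sensor and joint counts beyond the article's 16 touch sensors, the network shape, and the random weights standing in for a trained policy are all assumptions, not the authors' actual architecture.

```python
import numpy as np

NUM_SENSORS = 16   # binary touch sensors on palm and fingers (from the article)
NUM_JOINTS = 16    # assumed joint count, for illustration only

rng = np.random.default_rng(0)
# Random weights stand in for a policy that would be trained in simulation.
W1 = rng.standard_normal((64, NUM_SENSORS + 2 * NUM_JOINTS)) * 0.1
W2 = rng.standard_normal((NUM_JOINTS, 64)) * 0.1

def policy(touch, joint_pos, prev_action):
    """Map (binary touch signals, joint positions, previous action)
    to bounded joint-movement commands."""
    obs = np.concatenate([touch, joint_pos, prev_action])
    hidden = np.tanh(W1 @ obs)
    return np.tanh(W2 @ hidden)

# One control step: which sensors report contact, plus current joint state.
touch = (rng.random(NUM_SENSORS) < 0.3).astype(float)  # 1 = contact, 0 = none
joint_pos = rng.uniform(-1.0, 1.0, NUM_JOINTS)
prev_action = np.zeros(NUM_JOINTS)

action = policy(touch, joint_pos, prev_action)
```

The key point the sketch mirrors is that the observation contains no camera input and no texture detail, only on/off contact flags and proprioceptive joint information.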

Upon testing the real-life robotic hand with previously unseen objects, such as a tomato, pepper, can of peanut butter, and a toy rubber duck, the robotic hand successfully rotated the objects without stalling or losing its grip. While objects with complex shapes took longer to rotate, the robotic hand demonstrated its ability to rotate objects around different axes.

This groundbreaking approach opens up possibilities for robots to manipulate objects in the absence of visual cues, making it a valuable advancement in the field of robotics. The team presented their work at the 2023 Robotics: Science and Systems Conference.
