Researchers develop low-cost sensor to enhance robots’ sense of touch

Researchers from Queen Mary University of London, in collaboration with teams from China and the USA, have developed an innovative L3 F-TOUCH sensor to enhance the tactile capabilities of robots. This sensor enables robots to “feel” objects and adjust their grip accordingly, a crucial step towards achieving human-level dexterity in manipulation and grasping.

The study, published in IEEE Robotics and Automation Letters, highlights the importance of reliable tactile information and force sensing in robot hands and graspers. The human hand possesses the ability to sense pressure, temperature, texture, and pain, as well as distinguish objects based on their shape, size, and weight. In contrast, many existing robot hands lack integrated haptic capabilities, making it challenging for them to handle objects effectively. Without a sense of touch and knowledge about the interaction forces, robots may struggle to maintain a secure grip on objects or handle delicate items without causing damage.

Led by Professor Kaspar Althoefer, the team presents the L3 F-TOUCH sensor, where L3 stands for Lightweight, Low-cost, and wireLess communication. This high-resolution fingertip sensor measures both an object's geometry and the forces involved in interacting with it. Unlike sensors that merely estimate interaction forces from camera images of soft elastomer deformation, the L3 F-TOUCH measures them directly through an integrated mechanical suspension structure with a mirror system, achieving higher accuracy and a wider measurement range.

The compact suspension structure lets the elastomer deform on contact with a surface, capturing high-resolution contact geometry under external forces. The elastomer's displacement is tracked via a dedicated marker, and a calibration process maps that displacement to contact forces along the three principal axes (x, y, and z).
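The calibration step described above can be sketched as a simple linear fit from tracked marker displacement to 3-axis force. Everything below is an illustrative assumption, not the paper's actual procedure: the stiffness-like matrix, the synthetic displacement data, and the noise model are all hypothetical, and real calibration would use ground-truth readings from a reference force sensor.

```python
import numpy as np

# Hypothetical "true" stiffness-like mapping from marker displacement (mm)
# to contact force (N); chosen only to generate synthetic calibration data.
K_true = np.array([[2.0, 0.1, 0.0],
                   [0.1, 2.1, 0.0],
                   [0.0, 0.0, 5.0]])

rng = np.random.default_rng(0)
displacements = rng.uniform(-1.0, 1.0, size=(50, 3))        # x, y, z shifts
forces = displacements @ K_true.T + rng.normal(0, 0.01, (50, 3))  # noisy refs

# Least-squares fit of a linear calibration matrix K so that force ≈ K @ disp.
X, *_ = np.linalg.lstsq(displacements, forces, rcond=None)
K = X.T

def estimate_force(disp):
    """Map a tracked marker displacement to an estimated 3-axis contact force."""
    return K @ np.asarray(disp, dtype=float)
```

A linear model is the simplest choice; a real soft elastomer is likely nonlinear, in which case a polynomial or learned mapping would replace the single matrix.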

Professor Althoefer emphasizes that future work will focus on extending the sensor’s capabilities to measure rotational forces, such as twist, while remaining accurate and compact. This advancement will enable robots to have a more comprehensive sense of touch and improve their performance in manipulation tasks, including human-robot interaction scenarios like patient rehabilitation or assisting the elderly.

The breakthrough achieved with the L3 F-TOUCH sensor holds significant potential for the future of robotics. By equipping robots with a sense of touch, the technology enables them to handle objects more effectively and to perform complex manipulation tasks with greater precision and reliability, opening the way to more capable and agile robots.
