Robotics

TEGA: A Tactile-Enhanced Grasping Assistant for Assistive Robotics via Sensor Fusion and Closed-Loop Haptic Feedback

The system fuses EMG signals and visuotactile data to provide real-time vibrotactile feedback via a wearable vest.

Deep Dive

A research team led by Hengxu You, Tianyu Zhou, and Jing Du from the University of Texas at Austin has developed TEGA (Tactile-Enhanced Grasping Assistant), a novel framework for assistive robotics. The system directly addresses a critical gap in robotic teleoperation: the lack of intuitive force modulation. While most systems focus on precise finger positioning, TEGA introduces a closed control loop that fuses electromyography (EMG) signals, from which the user's intended grasp force is inferred, with visuotactile data from the robot's sensors. This multi-modal data is then translated into proportional, real-time feedback.

This feedback is delivered through a wearable haptic vest that provides vibrotactile cues to the user, creating an intuitive sense of touch. This allows operators, especially individuals with upper limb disabilities who lack natural tactile feedback, to dynamically refine the amount of force a robotic hand applies. User studies confirmed that the system substantially improves grasp stability and task success when manipulating objects of diverse hardness, texture, and shape. The research, accepted for presentation at the 2026 IEEE International Conference on Robotics and Automation (ICRA), represents a significant step toward more dexterous and responsive assistive devices.
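The closed loop described above can be illustrated with a minimal sketch. Note that the paper's actual inference model and feedback mapping are not public detail here: the RMS-envelope force estimate, the linear force scale, and the function names (`emg_intent_force`, `vibrotactile_intensity`) are all illustrative assumptions, not TEGA's implementation.

```python
import numpy as np

def emg_intent_force(emg_window, max_force=10.0):
    """Infer intended grasp force from a window of EMG samples.

    Hypothetical mapping: RMS envelope of the signal, scaled linearly
    to a force range (the paper's actual model is not specified here).
    """
    rms = np.sqrt(np.mean(np.square(emg_window)))
    activation = min(rms, 1.0)      # assume EMG normalized to [0, 1]
    return activation * max_force   # newtons, illustrative scale

def vibrotactile_intensity(intended_force, measured_force, max_force=10.0):
    """Map the intent/measurement mismatch to a vibration duty cycle in [0, 1]."""
    error = abs(intended_force - measured_force)
    return min(error / max_force, 1.0)

# One control-loop tick: the EMG burst implies more force than the
# visuotactile sensor currently measures, so the vest vibrates
# proportionally, prompting the user to adjust their grip command.
emg = np.array([0.2, -0.5, 0.4, -0.3, 0.6])  # normalized EMG samples
intended = emg_intent_force(emg)             # ~4.24 N
measured = 1.5                               # N, from the tactile sensor
duty = vibrotactile_intensity(intended, measured)
```

The proportional mapping is the key idea: the larger the gap between what the user intends and what the robot's tactile sensors actually measure, the stronger the vibrotactile cue, closing the loop through the user's own perception.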

Key Points
  • Fuses EMG-based intent inference with robot visuotactile sensing for closed-loop control.
  • Delivers real-time, proportional force feedback to the user via a wearable vibrotactile haptic vest.
  • User studies show it substantially improves grasp stability and task success for assistive applications.

Why It Matters

Enables intuitive, fine-grained control of robotic prosthetics, helping users perform delicate daily tasks they could not perform before.