New AI wearable patch that can recognize gestures in motion to control a robotic arm
Recently, Xu Sheng's team developed a deep-learning-enhanced wearable sensor that serves as a human-machine interaction interface with strong resistance to motion artifacts under a variety of dynamic conditions. The device attaches to the user's forearm as a flexible electronic patch, wirelessly captures motion signals in real time, and drives an external robot to perform precise operations.
Put simply, it works like a general-purpose "gesture interpreter": even in highly dynamic environments such as running, high-frequency vibration, swimming, or a moving car, it can capture and wirelessly transmit gesture signals, converting them into real-time, continuous basic movements of the robotic arm. For more complex actions, these basic movements can be assembled step by step, like Lego bricks.
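The "Lego brick" idea can be sketched in code: each recognized gesture maps to one basic motion primitive, and a sequence of gestures composes a more complex arm action. The gesture names and motion primitives below are illustrative assumptions, not details from the paper.

```python
# Hypothetical mapping from recognized gesture labels to basic arm motions.
# All names and magnitudes here are made up for illustration.
BASIC_MOTIONS = {
    "swipe_left":  ("rotate_base", -15),   # degrees
    "swipe_right": ("rotate_base", 15),
    "fist":        ("close_gripper", 1.0),
    "open_palm":   ("open_gripper", 1.0),
    "pinch":       ("move_z", -5),         # cm, lower the arm
}

def gestures_to_commands(gesture_sequence):
    """Translate a stream of recognized gesture labels into arm commands.

    A complex action (e.g. lower, grab, turn) is built by chaining the
    basic motions, like assembling Lego bricks.
    """
    return [BASIC_MOTIONS[g] for g in gesture_sequence if g in BASIC_MOTIONS]

# A "lower, grab, turn" action composed from three basic gestures:
plan = gestures_to_commands(["pinch", "fist", "swipe_right"])
print(plan)  # [('move_z', -5), ('close_gripper', 1.0), ('rotate_base', 15)]
```

Because each gesture drives one primitive, the robot can act continuously as gestures arrive rather than waiting for a full action description.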
Experimental results show that the system achieved a series of strong performance figures: a gesture recognition accuracy of 94%; new users need to provide only two samples to fine-tune the model, greatly reducing data-collection time; a latency of about 1.3 seconds from gesture-signal capture to robotic-arm response, sufficient for real-time control; and, on the battery side, the stretchable battery retains a capacity of about 25 mAh after 60 cycles with a Coulombic efficiency of nearly 100%, supporting more than 4 hours of continuous device operation.
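Fine-tuning with only two samples typically works by freezing a pretrained feature extractor and updating only a small classifier head on the new user's data. The sketch below illustrates that pattern under stated assumptions: the "backbone" is a stand-in random projection, the two calibration gestures and all dimensions are invented, and the update rule is plain logistic-regression gradient descent, not the paper's actual training procedure.

```python
import math
import random

random.seed(42)

# Frozen pretrained backbone: a fixed random projection standing in for the
# deep feature extractor trained on many users; its weights are never updated.
BACKBONE = [[random.gauss(0, 1) for _ in range(16)] for _ in range(4)]

def extract_features(window):
    """Map a raw 16-sample sensor window to a 4-dim feature vector."""
    return [math.tanh(sum(w * x for w, x in zip(row, window))) for row in BACKBONE]

# Two calibration samples from a new user (hypothetical: one "fist" window
# labeled 1 and one "open palm" window labeled 0).
fist = [random.gauss(0, 1) for _ in range(16)]
palm = [random.gauss(0, 1) for _ in range(16)]
samples = [(extract_features(fist), 1), (extract_features(palm), 0)]

# Fine-tuning touches only this tiny logistic head, so two samples suffice.
w, b = [0.0] * 4, 0.0
for _ in range(300):
    for feats, label in samples:
        p = 1.0 / (1.0 + math.exp(-(sum(wi * f for wi, f in zip(w, feats)) + b)))
        for i, f in enumerate(feats):
            w[i] += 0.5 * (label - p) * f  # gradient step on the head only
        b += 0.5 * (label - p)

# After adaptation, the head separates the user's two calibration gestures.
for feats, label in samples:
    p = 1.0 / (1.0 + math.exp(-(sum(wi * f for wi, f in zip(w, feats)) + b)))
    print(label, round(p, 3))
```

Keeping the backbone frozen is what makes the two-sample requirement plausible: only a handful of head parameters are estimated per user, while the hard work of motion-robust feature extraction was done during pretraining.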