TY - JOUR
T1 - Capturing complex hand movements and object interactions using machine learning-powered stretchable smart textile gloves
AU - Tashakori, Arvin
AU - Jiang, Zenan
AU - Servati, Amir
AU - Soltanian, Saeid
AU - Narayana, Harishkumar
AU - Le, Katherine
AU - Nakayama, Caroline
AU - Yang, Chieh-Ling
AU - Wang, Z. Jane
AU - Eng, Janice J.
AU - Servati, Peyman
N1 - Publisher Copyright:
© 2024, The Author(s), under exclusive licence to Springer Nature Limited.
PY - 2024/1
Y1 - 2024/1
AB - Accurate real-time tracking of dexterous hand movements has numerous applications in human–computer interaction, the metaverse, robotics and tele-health. Capturing realistic hand movements is challenging because of the large number of articulations and degrees of freedom. Here we report accurate and dynamic tracking of articulated hand and finger movements using stretchable, washable smart gloves with embedded helical sensor yarns and inertial measurement units. The sensor yarns have a high dynamic range, responding to strains as low as 0.005% and as high as 155%, and show stability during extensive use and washing cycles. We use multi-stage machine learning to report average joint-angle estimation root mean square errors of 1.21° and 1.45° for intra- and inter-participant cross-validation, respectively, matching the accuracy of costly motion-capture cameras without occlusion or field-of-view limitations. We report a data augmentation technique that enhances robustness to noise and variations of sensors. We demonstrate accurate tracking of dexterous hand movements during object interactions, opening new avenues of applications, including accurate typing on a mock paper keyboard, recognition of complex dynamic and static gestures adapted from American Sign Language, and object identification.
UR - http://www.scopus.com/inward/record.url?scp=85182228738&partnerID=8YFLogxK
DO - 10.1038/s42256-023-00780-9
M3 - Article
AN - SCOPUS:85182228738
SN - 2522-5839
VL - 6
SP - 106
EP - 118
JO - Nature Machine Intelligence
JF - Nature Machine Intelligence
IS - 1
ER -