Learning a Hand Model from Dynamic Movements Using High-Density EMG and Convolutional Neural Networks

IEEE Transactions on Biomedical Engineering (TBME)
Author(s): Raul C. Sîmpetru, Andreas Arkudas, Dominik I. Braun, Marius Osswald, Daniela Souza de Oliveira, Bjoern Eskofier, Thomas M. Kinfe, Alessandro Del Vecchio

Our study presents a novel deep learning model that decodes the electrical activity of the forearm muscles into precise hand movements, offering significant potential to enhance assistive-device control for individuals with motor impairments. Using 320 surface electromyography (sEMG) electrodes on the forearm, we recorded muscle activity from 13 healthy participants executing a variety of single-digit and compound hand movements covering 22 degrees of freedom, at both slow (0.5 Hz) and comfortable (1.5 Hz) speeds. The model accurately translates these muscle signals into hand kinematics and kinetics, outperforming existing decoding methods.
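To make the decoding setup concrete, the sketch below frames the task as a small convolutional network that maps a short window of 320-channel sEMG to 22 joint values. The class name, layer sizes, and window length are illustrative assumptions and do not reproduce the architecture published in the paper.

```python
# Minimal sketch (assumed architecture, not the paper's exact model):
# a CNN that maps a short window of 320-channel sEMG to 22 joint values.
import torch
import torch.nn as nn

class EMGToHandModel(nn.Module):
    def __init__(self, n_channels: int = 320, n_dof: int = 22):
        super().__init__()
        # Treat EMG channels as the input depth and convolve over time.
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 128, kernel_size=11, padding=5),
            nn.BatchNorm1d(128),
            nn.ReLU(),
            nn.Conv1d(128, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over the time dimension
        )
        self.regressor = nn.Linear(64, n_dof)  # one output per degree of freedom

    def forward(self, emg: torch.Tensor) -> torch.Tensor:
        # emg: (batch, 320 channels, window samples) -> (batch, 22 joint values)
        return self.regressor(self.features(emg).squeeze(-1))

model = EMGToHandModel()
dummy_window = torch.randn(8, 320, 192)  # 8 placeholder windows of raw sEMG
print(model(dummy_window).shape)         # torch.Size([8, 22])
```

In practice such a network would be trained with a regression loss (e.g. mean squared error) against recorded hand kinematics, and kinetic quantities could be added as further regression targets.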

Our results demonstrate that the model reliably predicts both movement and force output (kinematics and kinetics) for individual finger movements and compound actions. By mapping muscle activity to specific hand functions, the model produces a distinct neural pattern, or “neural embedding,” for each movement: it differentiates, for instance, between individual-finger and multi-digit movements, even when they are performed at different speeds. These neural embeddings consistently align with the anatomy of the hand, indicating that the model has learned a representation of the hand’s biomechanical structure and adapts it to complex movement patterns.
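As a rough illustration of what a “neural embedding” means here, the sketch below uses a stand-in feature extractor (not the authors’ network) and treats its pooled intermediate activations as the embedding of a movement window, comparing two movements by their distance in that space. The shapes and data are placeholders.

```python
# Illustrative sketch: intermediate activations as per-movement embeddings.
import torch
import torch.nn as nn

# Stand-in feature extractor: 320-channel window -> 64-dimensional vector.
features = nn.Sequential(
    nn.Conv1d(320, 64, kernel_size=11, padding=5),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
)

def embed(emg_window: torch.Tensor) -> torch.Tensor:
    # Interpret the pooled activations as the movement's "neural embedding".
    with torch.no_grad():
        return features(emg_window)

emb_a = embed(torch.randn(1, 320, 192))  # e.g. an index-flexion window (placeholder)
emb_b = embed(torch.randn(1, 320, 192))  # e.g. a fist-closing window (placeholder)

# Distinct movements should map to well-separated points in this space.
print(torch.dist(emb_a, emb_b).item())
```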

Unlike conventional methods, which low-pass filter the sEMG signals, our approach leverages the full-bandwidth EMG data, capturing a broader spectrum of signal information. This allows a more detailed and precise interpretation of the muscle activity underlying hand movements. These findings hold promise for the development of more responsive, intuitive assistive devices, potentially improving users’ quality of life by providing a robust, user-adaptive control interface.
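The contrast between the two preprocessing philosophies can be sketched in a few lines. The sampling rate, cutoff frequency, and window length below are assumed placeholder values, not parameters taken from the paper.

```python
# Sketch contrasting conventional envelope extraction with full-bandwidth input.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 2048                        # assumed sEMG sampling rate in Hz
emg = np.random.randn(320, fs)   # one second of 320-channel sEMG (placeholder data)

# Conventional pipeline: rectify and low-pass filter to obtain a smooth envelope,
# discarding most of the signal bandwidth before decoding.
b, a = butter(4, 5 / (fs / 2), btype="low")   # 5 Hz low-pass (typical choice)
envelope = filtfilt(b, a, np.abs(emg), axis=1)

# Full-bandwidth approach: hand the raw, unfiltered window directly to the
# network, preserving high-frequency content of the interference EMG signal.
window = emg[:, :192]            # e.g. a ~94 ms window at 2048 Hz
print(envelope.shape, window.shape)
```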

Access the Full Paper on IEEE Xplore®
