Detecting human movement intentions is fundamental to the neural control of robotic exoskeletons, as it is essential for achieving seamless transitions between locomotion modes. In this study, we enhanced a muscle synergy-inspired method of locomotion mode identification by fusing electromyography (EMG) data with two types of data from wearable inertial measurement units (IMUs): linear acceleration and angular velocity. From a finite-state-machine perspective, the enhanced method was used to systematically identify 2 static modes, 7 dynamic modes, and the 27 transitions among them. In addition to the five broadly studied modes (level-ground walking, ramp ascent/descent, and stair ascent/descent), we identified transitions between different walking speeds and modes of ramp walking at different inclination angles. Seven sensor-fusion combinations were evaluated on experimental data from 8 able-bodied adult subjects, and their classification accuracy and prediction time were compared. Prediction based on fused EMG and gyroscope (angular velocity) data identified transitions earlier and with higher accuracy. With fused sensor data, all transitions and modes were identified with a total average classification accuracy of 94.5%. For nearly all transitions, the next locomotion mode was predicted 300-500 ms before the step into that mode.
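To make the sensor-fusion idea concrete, the following is a minimal illustrative sketch, not the paper's actual pipeline: it concatenates simple windowed features from EMG (per-channel RMS) and gyroscope (per-axis mean angular velocity) into one feature vector and classifies locomotion modes with a nearest-centroid classifier. The feature choices, window handling, and classifier are all assumptions made for illustration; the study's synergy-based method and classifier details differ.

```python
import numpy as np

def extract_features(emg_window, gyro_window):
    """Fuse one analysis window into a single feature vector.

    emg_window:  (n_samples, n_emg_channels) raw EMG samples
    gyro_window: (n_samples, n_gyro_axes) angular-velocity samples

    Illustrative features: per-channel EMG RMS concatenated with
    per-axis mean angular velocity.
    """
    emg_rms = np.sqrt(np.mean(emg_window ** 2, axis=0))
    gyro_mean = np.mean(gyro_window, axis=0)
    return np.concatenate([emg_rms, gyro_mean])

class NearestCentroid:
    """Minimal stand-in classifier: assign each sample to the
    closest class mean in fused-feature space."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array(
            [X[y == c].mean(axis=0) for c in self.classes_]
        )
        return self

    def predict(self, X):
        # Euclidean distance from every sample to every centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[np.argmin(d, axis=1)]
```

In a real system, one feature vector would be computed per sliding window before each gait event, so a correct prediction on the window ending 300-500 ms before foot contact corresponds to the early transition detection reported above.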