This bachelor thesis focuses on using surface electromyography (sEMG) signals from the forearm to classify human hand gestures and finger postures. sEMG provides a non-invasive measurement of muscle activation and is widely used in prosthetic control and human–machine interfaces. In this project, multiple wet electrodes will be placed around the forearm to capture muscle activity associated with different hand and finger configurations.
The goal is to design a machine-learning pipeline that processes sEMG signals and classifies them into a set of predefined hand gestures or finger postures (e.g., fist, pinch, point, open hand, individual finger flexion/extension). Tasks include signal preprocessing, feature extraction, classifier training, and performance evaluation.
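To illustrate the intended processing steps, the following is a minimal sketch of such a pipeline. The sampling rate, window length, feature set (time-domain features per channel), and SVM classifier are illustrative assumptions for this sketch, not specifications fixed by the thesis; the actual pipeline will be designed and evaluated as part of the project.

```python
# Minimal sketch of an sEMG gesture-classification pipeline.
# All constants and data shapes below are assumed placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 1000    # assumed sampling rate in Hz
WIN = 200    # 200 ms analysis window
STEP = 100   # 50 % window overlap

def time_domain_features(window):
    """Common time-domain sEMG features per channel: mean absolute value,
    root mean square, waveform length, and zero-crossing count."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, rms, wl, zc])

def segment(emg, labels):
    """Slide a window over the multi-channel recording and build a feature
    matrix; each window is labelled with the gesture active at its end."""
    X, y = [], []
    for start in range(0, emg.shape[0] - WIN, STEP):
        w = emg[start:start + WIN]          # shape: (WIN, n_channels)
        X.append(time_domain_features(w))
        y.append(labels[start + WIN - 1])
    return np.array(X), np.array(y)

# Placeholder data: a band-pass filtered 8-channel recording and
# per-sample gesture IDs would come from the actual experiments.
emg = np.random.randn(10 * FS, 8)
gesture_labels = np.repeat(np.arange(5), 2 * FS)

X, y = segment(emg, gesture_labels)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("Cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

In practice, the cross-validation scheme would need to respect the temporal structure of the recordings (e.g., splitting by trial or session) so that overlapping windows from the same movement do not appear in both training and test folds.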
Reliable gesture classification forms the foundation for intuitive prosthesis control and interactive robotic systems. This thesis aims to provide insight into the relationship between forearm muscle activation patterns and hand configurations, supporting more natural human–robot communication.