This thesis investigates a deep-learning framework for predicting the 3D position of the human hand (wrist) from surface electromyography (sEMG) signals collected exclusively from the upper-arm and shoulder muscles. Unlike traditional approaches that rely heavily on forearm muscle activity for hand-motion decoding, this project explores whether the proximal muscles, such as the deltoid and rotator cuff, contain sufficient neuromuscular information to infer distal hand movements. This is a critical problem in neuromuscular modeling, particularly for prosthetic control in transhumeral amputees, whose forearm muscles are no longer available.
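To make the regression task concrete, the sketch below shows one minimal way such a framework could be structured: a small temporal convolutional network in PyTorch that maps a fixed window of multi-channel proximal sEMG to a 3D wrist coordinate. The channel count, window length, and layer sizes are illustrative assumptions, not the architecture developed in this thesis.

```python
import torch
import torch.nn as nn

class ProximalEMGRegressor(nn.Module):
    """Maps a window of multi-channel sEMG to a 3D wrist position.

    The channel count, window length, and layer sizes are
    illustrative assumptions, not values fixed by the thesis.
    """

    def __init__(self, n_channels: int = 8, window_len: int = 200):
        super().__init__()
        # Temporal convolutions extract activation-pattern features
        # from the raw sEMG window (shape: batch x channels x time).
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        # Regression head outputs (x, y, z) wrist coordinates.
        self.head = nn.Linear(64, 3)

    def forward(self, emg: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(emg).squeeze(-1))


# Example: one batch of 16 windows, 8 electrodes, 200 samples each.
model = ProximalEMGRegressor()
positions = model(torch.randn(16, 8, 200))  # -> shape (16, 3)
```

Training such a model as a regressor (e.g., with a mean-squared-error loss against motion-capture wrist coordinates) is one plausible formulation of the hand-position decoding problem posed here.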
Upper-arm and shoulder muscle activity plays a crucial role in stabilizing and positioning the limb during reaching, lifting, manipulation, and coordinated whole-arm actions. These proximal muscles encode motion intention and limb trajectory through activation patterns that reflect:
- Shoulder joint rotations (flexion/extension, abduction/adduction, internal/external rotation)
- Elbow movement synergy and co-contraction
- Predictive motor planning signals for hand positioning
Understanding how these signals relate to hand position is fundamentally important for prosthetic design, rehabilitation robotics, and human–robot interaction, especially in cases where distal musculature cannot be accessed.
This research has strong relevance to:
- Upper-limb prosthetics, enabling intuitive control using remaining shoulder muscles
- Rehabilitation robotics, supporting patients with forearm or hand impairments
- Human–robot collaboration, enabling smooth arm–hand trajectory decoding
- Muscle-driven interfaces, bridging biological signals and robotic actuation
By focusing on proximal muscles only, this project addresses one of the most challenging and clinically important questions in neuromuscular robotics:
Can wrist or hand motion be tracked accurately using only the shoulder and upper-arm muscles?