The demand for limb motion capture is growing rapidly across entertainment, rehabilitation, and human–computer interaction domains. Portable, occlusion‐free inertial measurement units (IMUs) hold promise for fine‐grained tracking, yet traditional physics‐based approaches remain vulnerable to noise accumulation and integration drift.
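The drift problem can be seen in a minimal simulation (an illustrative sketch, not from the study): double-integrating a stationary accelerometer signal that carries white noise and a small constant bias, with assumed values for the sample rate, noise level, and bias, makes the position estimate diverge even though the sensor never moves.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01                          # assumed 100 Hz IMU sample period
n = 10_000                         # 100 s of data
true_acc = np.zeros(n)             # sensor is actually at rest
noise = rng.normal(0.0, 0.02, n)   # assumed white accelerometer noise (m/s^2)
bias = 0.005                       # assumed small constant bias (m/s^2)

meas = true_acc + noise + bias
vel = np.cumsum(meas) * dt         # first integration  -> velocity
pos = np.cumsum(vel) * dt          # second integration -> position

# Under a constant bias, position error grows roughly quadratically in time,
# which is why uncorrected inertial dead reckoning degrades so quickly.
print(f"drift after 10 s:  {pos[1000]:.3f} m")
print(f"drift after 100 s: {pos[-1]:.3f} m")
```

Even this tiny bias produces meters of position error within a couple of minutes, which motivates correcting inertial estimates with an external reference such as vision.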
This study investigates limb tracking that fuses IMU measurements with computer vision to improve tracking accuracy and robustness.
The outcome of this project could inform the fusion of inertial and visual data for motion tracking, and may also offer insights into mimicking human motor control mechanisms.