Vision-Based Real-Time Fingertip Force and Tissue-Stiffness Estimation for Next-Generation Human–Robot Interaction

MSc assignment

Introduction

Accurately sensing how a human fingertip slides and presses across soft tissue in real time is essential for applications such as physical-therapy assessment, palpation training, and intuitive tele-touch interfaces. Traditional solutions rely on bulky load cells or instrumented gloves that constrain natural motion, are expensive, and are hard to sterilise. With the advent of compact RGB-D cameras and marker-less hand-tracking frameworks (e.g., MediaPipe Hands), we can now capture both surface-deformation maps and fingertip joint kinematics without any wearable hardware.

This thesis proposes an entirely vision-based pipeline that delivers two outputs at video rate while a participant simply slides a bare finger over a material surface:

  • the full 3-D contact-force vector exerted by the fingertip, and
  • the apparent Young’s modulus of the contacted region (silicone phantom, forearm, thigh, etc.).

By eliminating dedicated force sensors and focusing purely on natural fingertip interaction, the system lowers cost, preserves tactile realism, and opens the door to rapid tissue-property mapping during rehabilitation exercises or clinical palpation practice.
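To make the two feature families concrete, the sketch below shows one plausible way to compute fingertip joint-link vectors from tracked 3-D landmarks and simple depth-displacement statistics from a pair of aligned depth maps. The function names and data layouts are illustrative assumptions, not part of the proposed system; a hand tracker such as MediaPipe Hands would supply the landmarks.

```python
import math

def joint_link_vectors(landmarks):
    """Unit vectors along consecutive finger joints (e.g., MCP->PIP->DIP->TIP).
    `landmarks` is an ordered list of (x, y, z) joint positions in metres,
    e.g. taken from a marker-less hand tracker such as MediaPipe Hands."""
    vecs = []
    for (ax, ay, az), (bx, by, bz) in zip(landmarks, landmarks[1:]):
        dx, dy, dz = bx - ax, by - ay, bz - az
        norm = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
        vecs.append((dx / norm, dy / norm, dz / norm))
    return vecs

def depth_displacement_stats(depth_before, depth_after):
    """Mean and peak surface depression (metres) inside the contact patch,
    given two aligned per-pixel depth maps (flat lists). Assumes the camera
    looks at the surface, so pressing increases the measured depth."""
    disp = [a - b for b, a in zip(depth_before, depth_after) if a - b > 0]
    if not disp:
        return 0.0, 0.0
    return sum(disp) / len(disp), max(disp)
```

A learned model would then consume the concatenated link vectors and displacement statistics as its input features; the hand-crafted statistics here merely stand in for whatever representation the thesis settles on.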

Objectives

  • to investigate the core feasibility of simultaneous, camera-only force-and-stiffness estimation during sliding fingertip contacts;
  • to set up a synchronised capture platform combining an RGB-D camera, fingertip pose tracking (MediaPipe, ArUco, or similar) and a ground-truth 6-DoF load cell for training/validation;
  • to design and manufacture representative soft-tissue phantoms, and to collect datasets on human thigh and forearm under varied sliding angles, speeds, and loads;
  • to develop a lightweight deep-learning model that fuses fingertip joint-link vectors with depth-displacement statistics to output both force direction/magnitude and local Young’s modulus at ≥30 Hz;
  • to evaluate the system's angular error, force RMSE, and modulus accuracy across diverse contact paths, forces, and tissue types.
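As one analytical baseline for the modulus output, a Hertzian spherical-contact model relates force and indentation depth to an apparent Young's modulus. This is an assumption for illustration only (the thesis may instead learn the mapping end-to-end); the fingertip is idealised as a rigid sphere of radius R pressed a depth d into an elastic half-space, so F = (4/3) E* sqrt(R) d^(3/2) with E = E* (1 − ν²).

```python
import math

def apparent_youngs_modulus(force_n, indent_m, radius_m, poisson=0.5):
    """Invert the Hertz spherical-contact model F = (4/3) E* sqrt(R) d^(3/2)
    to recover the tissue's apparent Young's modulus E = E* (1 - nu^2).
    Soft tissue is commonly treated as nearly incompressible (nu ~ 0.5)."""
    if indent_m <= 0 or radius_m <= 0:
        raise ValueError("indentation depth and tip radius must be positive")
    e_star = 3.0 * force_n / (4.0 * math.sqrt(radius_m) * indent_m ** 1.5)
    return e_star * (1.0 - poisson ** 2)
```

For example, F = 0.5 N, d = 3 mm, and a fingertip radius of 8 mm give roughly 19 kPa, in the range of soft silicone phantoms; the vision pipeline would supply d from the depth-displacement map and F from the force estimator.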

Requirements for students

  • solid programming skills in C++ or Python and familiarity with PyTorch;
  • interest in computer vision, hand-pose estimation, and soft-tissue modelling;
  • experience in experimental design, calibration, and phantom fabrication is highly desirable.