This study presented a fully automated, deep learning-based markerless motion capture workflow and evaluated its performance against marker-based motion capture during overground running, walking and countermovement jumping. Multi-view high-speed (200 Hz) image data were collected concurrently with marker-based motion capture (criterion data), permitting a direct comparison between methods. Lower limb kinematic data for 15 participants were computed using 2D pose estimation, our 3D fusion process and OpenSim-based inverse kinematics modelling. Results demonstrated high levels of agreement for lower limb joint angles, with mean differences ranging from 0.1° to 10.5° for hip (3 DoF) joint rotations, and from 0.7° to 3.9° for knee (1 DoF) and ankle (2 DoF) rotations. These differences generally fall within the documented uncertainties of marker-based motion capture, suggesting that our markerless approach could be used for appropriate biomechanics applications. We used an open-source, modular and customisable workflow, allowing for integration with other popular biomechanics tools such as OpenSim. By developing open-source tools, we hope to facilitate the democratisation of markerless motion capture technology and encourage the transparent development of markerless methods. This presents exciting opportunities for biomechanics researchers and practitioners to capture large amounts of high-quality, ecologically valid data both in the laboratory and in the wild.
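The 3D fusion step described in the abstract combines 2D keypoints detected in multiple calibrated camera views into 3D joint positions. A common way to do this is linear (DLT) triangulation; the sketch below is an illustrative example of that general technique, not the authors' implementation, and the function name and inputs are hypothetical.

```python
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Linear (DLT) triangulation of one keypoint observed in several
    calibrated views.

    proj_mats : list of 3x4 camera projection matrices
    points_2d : list of (u, v) pixel coordinates, one per view
    Returns the estimated 3D point in world coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous
    # 3D point X: u * (P[2] @ X) = P[0] @ X and v * (P[2] @ X) = P[1] @ X.
    A = []
    for P, (u, v) in zip(proj_mats, points_2d):
        A.append(u * P[2] - P[0])
        A.append(v * P[2] - P[1])
    A = np.asarray(A)
    # Homogeneous least-squares solution: the right singular vector of A
    # associated with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenise
```

In a full markerless pipeline this would be applied per keypoint per frame, and the resulting 3D trajectories passed to OpenSim's inverse kinematics solver to obtain joint angles.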
Original language: English
Article number: 111338
Number of pages: 9
Journal: Journal of Biomechanics
Early online date: 2 Oct 2022
Publication status: Published - 30 Nov 2022


Keywords

  • Biomechanics
  • Computer vision
  • Deep learning
  • Inverse kinematics
  • Pose estimation
  • Validation

ASJC Scopus subject areas

  • Biophysics
  • Orthopedics and Sports Medicine
  • Biomedical Engineering
  • Rehabilitation


Title: The development and evaluation of a fully automated markerless motion capture workflow
