Database

Strategic

The database that didn't exist yet

Babon is building a large, labelled clinical motion database. The figures below are loaded live from the research database.

Live
Recordings in database: 71,400. Aggregated from public, CC-BY licensed datasets; the base layer for normative comparison and validation.
Unique subjects: 549. From public, CC-BY licensed sources. From practices and clinics: not yet.
Data sources: 3. Public datasets, all CC-BY licensed and fully attributed.
Infrastructure: fr-par. Scaleway Paris; data does not leave the EU.

Updated: 23 April 2026 · source: research.movalytics-db

Composition of the 71,400 recordings

// breakdown per source
AddBiomechanics · 69,804 (97.8%)
GAVD · 1,462 (2%)
LabValidation · 144 (0.2%)
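The per-source shares are simple arithmetic over the recording counts listed on this page; a minimal Python sketch (counts taken from the breakdown above):

```python
# Recording counts per source, as listed on this page.
counts = {
    "AddBiomechanics": 69_804,
    "GAVD": 1_462,
    "LabValidation": 144,
}

total = sum(counts.values())
for name, n in counts.items():
    # ".1%" multiplies by 100 and formats with one decimal place
    print(f"{name}: {n:,} recordings ({n / total:.1%})")
```

Note that the per-source counts sum to slightly more than the rounded headline figure of 71,400; the percentages here are computed against the exact sum.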
Why this database

Context

A large, representative clinical motion database drawn from everyday practice does not currently exist. The only conventional source is a gait lab, where a single measurement costs tens of euros. As a result, analyses such as age norms, recovery patterns after knee surgery, and diagnosis-specific reference data are simply not feasible in practice.

Babon starts with three publicly available, CC-BY licensed datasets, fully attributed below. Because a Movalytics analysis costs cents per recording, the database can be gradually expanded over time through anonymised contributions, with no footage and no patient data.

Source data

With thanks to

Every recording in the research database comes from publicly available, CC-BY licensed datasets. Full attribution below.

[01] Normative reference · healthy population

AddBiomechanics

Werling, K., Bianco, N.A., Raitor, M., Stingel, J., Hicks, J.L., Delp, S.L., Liu, C.K. (2023). AddBiomechanics: Automating model scaling, inverse kinematics, and inverse dynamics from human motion data through sequential optimization. PLoS ONE 18(11): e0295152.

Aggregated marker-based motion capture across 15 contributing studies. Base layer for normative comparison.

+ Contributing studies: 15
  1. Lencioni et al. 2019 · 50 subjects
  2. Carter et al. 2023 · 50 subjects
  3. Santos et al. 2017 · 49 subjects
  4. Camargo et al. 2021 · 22 subjects
  5. Tan et al. 2023 · 17 subjects
  6. Moore et al. 2015 · 12 subjects
  7. Falisse et al. 2016 · 11 subjects
  8. Van der Zee et al. 2022 · 10 subjects
  9. Hamner et al. 2013 · 10 subjects
  10. Uhlrich et al. 2023 · 10 subjects
  11. Tan et al. 2022 · 9 subjects
  12. Wang et al. 2023 · 9 subjects
  13. Han et al. 2023 · 7 subjects
  14. Fregly et al. 2012 · 6 subjects
  15. Li et al. 2021 · 1 subject
273 subjects
69,804 lab-mocap trials
CC BY 4.0 · addbiomechanics.org →
[02] Clinical and atypical gait

GAVD (Gait Abnormality Video Dataset)

Ranjan, R., Ahmedt-Aristizabal, D., Ali Armin, M., Kim, J. (2025). Computer Vision for Clinical Gait Analysis: A Gait Abnormality Video Dataset. IEEE Access 13: 45321–45339. doi:10.1109/ACCESS.2025.3545787

Video-based gait recordings with clinical labels. Supports evaluation of pose extraction on in-the-wild video, beyond lab conditions.

276 subjects
1,462 videos
CC BY 4.0 · IEEE Access →
[03] Video ↔ mocap pairs · validation

LabValidation (OpenCap, Stanford)

Uhlrich, S.D., Falisse, A., Kidziński, Ł., Muccini, J., Ko, M., Chaudhari, A.S., Hicks, J.L., Delp, S.L. (2023). OpenCap: Human movement dynamics from smartphone videos. PLoS Computational Biology 19(10): e1011462.

Calibrated video ↔ mocap pairs as gold standard when comparing Movalytics output against classical gait-lab measurements. Subjects are a subset of AddBiomechanics.

144 video pairs
subjects shared with AddBiomechanics
CC BY 4.0 · opencap.ai →
Research partnership: In conversation with Hogeschool Utrecht (Jaap Jansen, Institute of Movement Studies). The setup is being explored jointly; no data has been contributed yet.

Have your own database or a research question?

Get in touch