Synthetic Human Motion Video Generation Based on Biomechanical Model
B. Lyu (TU Delft - Mechanical Engineering)
Ajay Seth – Mentor (TU Delft - Biomechatronics & Human-Machine Control)
Eline van der Kruk – Graduation committee member (TU Delft - Biomechatronics & Human-Machine Control)
Xucong Zhang – Graduation committee member (TU Delft - Pattern Recognition and Bioinformatics)
Frans C.T. van Der Helm – Graduation committee member (TU Delft - Biomechatronics & Human-Machine Control)
Abstract
Biomechanics studies the underlying mechanisms relating body movements and forces, and accurate motion data are crucial for this research. Researchers currently rely on marker-based motion capture systems to record motion data, but these systems are not widely adopted outside the laboratory because of their financial and time costs and limited portability. Video-based motion capture systems instead take videos recorded with webcams, cameras, or smartphones as input and estimate human motion from them; this simpler setup makes the technology far more accessible for widespread use. However, existing motion capture datasets commonly lack biomechanically accurate annotations, which limits the biomechanical accuracy of existing video-based motion capture methods. The biomechanics community has many validated, biomechanically accurate models and motion recordings, but the corresponding video data are lacking. From these data, a human-like appearance can be constructed and a synthetic human motion video dataset can be generated with 3D graphics software. In this thesis, we propose a pipeline that generates synthetic human motion videos. The pipeline takes a subject-specific OpenSim model and motion as input and uses the SMPL-X model to generate a human-like appearance. We validated the synthetic data generated by the pipeline and demonstrated its biomechanical reliability. Using this pipeline, we created ODAH, a synthetic dataset with biomechanically accurate annotations for neural network training.
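To make the OpenSim-to-SMPL-X data flow concrete, the sketch below shows one way the motion-to-appearance step of such a pipeline could look in Python. It is a minimal sketch, not the thesis implementation: it assumes the publicly available `smplx` package (with PyTorch) and a plain-text OpenSim `.mot` motion file, and the file names, the joint-angle mapping, and the helper functions are illustrative placeholders.

```python
# Minimal sketch: read an OpenSim .mot motion, map one coordinate onto the
# SMPL-X body pose, and export per-frame meshes for later rendering.
# All file names and the joint mapping are hypothetical examples.
import numpy as np
import torch
import smplx


def read_mot(path):
    """Parse a plain-text OpenSim .mot file: skip the header (which ends
    with 'endheader'), return column labels and a (frames x columns) array."""
    with open(path) as f:
        lines = f.readlines()
    start = next(i for i, l in enumerate(lines)
                 if l.strip().lower() == "endheader") + 1
    labels = lines[start].strip().split("\t")
    data = np.array([[float(v) for v in l.split()]
                     for l in lines[start + 1:] if l.strip()])
    return labels, data


def coords_to_smplx_pose(row, labels):
    """Hypothetical mapping from OpenSim coordinate values (degrees) to the
    63-dim SMPL-X body pose (21 joints x axis-angle, radians). A real
    pipeline needs a validated joint-by-joint correspondence."""
    pose = np.zeros(63)
    if "knee_angle_r" in labels:
        # Right knee is body joint 4 in SMPL-X; flexion roughly about x.
        pose[3 * 4 + 0] = np.deg2rad(row[labels.index("knee_angle_r")])
    return torch.tensor(pose, dtype=torch.float32).unsqueeze(0)


labels, data = read_mot("subject_walk.mot")          # placeholder motion file

# Neutral SMPL-X body; shape parameters (betas) could instead be fit to the
# subject's anthropometry to keep the appearance subject-specific.
model = smplx.create("models/", model_type="smplx", gender="neutral")

for i, row in enumerate(data):
    body_pose = coords_to_smplx_pose(row, labels)
    out = model(body_pose=body_pose, return_verts=True)
    verts = out.vertices.detach().numpy()[0]          # (10475, 3) mesh vertices
    # Each frame's mesh would then be rendered in 3D graphics software
    # (e.g. Blender) with cameras and lighting to produce the video frames.
    np.save(f"frame_{i:04d}_verts.npy", verts)
```

In a full pipeline, the exported per-frame meshes would be textured, posed in a scene, and rendered from one or more virtual cameras, with the driving OpenSim kinematics retained as the biomechanically accurate annotation for each frame.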