4D Feet

Registering Walking Foot Shapes Using Attention Enhanced Dynamic-Synchronized Graph Convolutional LSTM Network

Journal Article (2024)
Author(s)

F. Tajdari (TU Delft - Emerging Materials, Eindhoven University of Technology, TU Delft - Intelligent Vehicles)

T. Huysmans (Universiteit Antwerpen, TU Delft - Human Factors)

Xinhe Yao (TU Delft - Emerging Materials)

Jun Xu (TU Delft - Emerging Materials)

Maryam Zebarjadi (University of Minnesota)

Yu (Wolf) Song (TU Delft - Emerging Materials)

Research Group
Emerging Materials
DOI
https://doi.org/10.1109/OJCS.2024.3406645
Publication Year
2024
Language
English
Volume number
5
Pages (from-to)
343-355
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

4D-scans of dynamic, deformable human body parts help researchers better understand spatiotemporal features. However, reconstructing 4D-scans with multiple asynchronous cameras faces two main challenges: 1) finding dynamic correspondences among the frames captured by each camera at that camera's own timestamps (dynamic feature recognition), and 2) reconstructing 3D-shapes from the combined point clouds captured by different cameras at asynchronous timestamps (multi-view fusion). Here, we introduce a generic framework able to 1) find and align dynamic features in the 3D-scans captured by each camera using the non-rigid iterative-closest-farthest-points algorithm; 2) synchronize scans captured by asynchronous cameras through a novel ADGC-LSTM-based network capable of aligning the 3D-scans captured by different cameras to the timeline of a specific camera; and 3) register a high-quality template to the synchronized scans at each timestamp to form a high-quality 3D-mesh model using a non-rigid registration method. With a newly developed 4D foot scanner, we validate the framework and create the first open-access dataset of its kind, named 4D-Feet. It includes 4D-shapes (15 fps) of the right and left feet of 58 participants (116 feet, comprising 5147 3D-frames), covering the significant phases of the gait cycle. The results demonstrate the effectiveness of the proposed framework, especially in synchronizing asynchronous 4D-scans.
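The core of the synchronization step is resampling each camera's frame sequence onto the timeline of a reference camera. As a minimal sketch of that idea (not the paper's learned ADGC-LSTM network, which predicts the intermediate shapes; the function name, data layout, and the use of simple per-point linear interpolation here are illustrative assumptions):

```python
import numpy as np

def sync_to_reference(src_times, src_points, ref_times):
    """Resample a camera's per-frame 3D point tracks onto a reference timeline.

    src_times  : (T,) timestamps of the asynchronous camera's frames
    src_points : (T, N, 3) corresponding 3D positions of N tracked points
    ref_times  : (R,) timestamps of the reference camera

    Returns an (R, N, 3) array of interpolated point positions.
    This linear scheme stands in for the learned synchronization network.
    """
    src_times = np.asarray(src_times, dtype=float)
    src_points = np.asarray(src_points, dtype=float)
    out = np.empty((len(ref_times), src_points.shape[1], 3))
    for n in range(src_points.shape[1]):       # each tracked point
        for d in range(3):                     # each coordinate x, y, z
            out[:, n, d] = np.interp(ref_times, src_times, src_points[:, n, d])
    return out

# Example: two frames at t=0 and t=1; query the reference timestamp t=0.5.
frames = np.stack([np.zeros((4, 3)), np.ones((4, 3))])   # (2, 4, 3)
mid = sync_to_reference([0.0, 1.0], frames, [0.5])
# mid[0] is halfway between the two frames (all 0.5)
```

In the paper's setting, plain interpolation would blur fast, non-linear foot deformations between frames, which is the gap the attention-enhanced graph-convolutional LSTM is designed to close.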