Material-informed Gaussian Splatting for 3D World Reconstruction in Digital Twin

Master Thesis (2025)
Author(s)

A.K.G.H. Huynh (TU Delft - Mechanical Engineering)

Contributor(s)

João Malheiro Silva – Mentor (Siemens Digital Industries Software)

Holger Caesar – Mentor (TU Delft - Intelligent Vehicles)

Son Tong – Mentor (Siemens Digital Industries Software)

J.F.P. Kooij – Graduation committee member (TU Delft - Intelligent Vehicles)

Faculty
Mechanical Engineering
Publication Year
2025
Language
English
Graduation Date
15-11-2025
Awarding Institution
Delft University of Technology
Programme
Mechanical Engineering | Vehicle Engineering | Cognitive Robotics
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

3D reconstruction for Digital Twins has become crucial, as it provides controlled and scalable environments for developing and validating advanced sensor algorithms before real-world deployment. Traditional reconstruction approaches have several limitations: they often overlook the physical rendering properties of materials, lack detailed textures, and rely on lidar sensors that require complex calibration and perform poorly at capturing retro-reflective and transparent surfaces. We propose a modular, camera-centric, and material-informed 3D Gaussian Splatting (3DGS) pipeline for reconstructing large 3D scenes. Our approach extracts semantic material masks using segmentation models, converts the Gaussian representation into explicit mesh surfaces, and automatically projects the 2D material labels onto the 3D geometry. This combines photorealistic reconstruction with physics-based material assignment, enabling accurate sensor simulation and rendering in modern graphics engines and simulators. Evaluation on an internal Siemens autonomous driving dataset demonstrates that our material-informed approach achieves sensor-simulation fidelity comparable to lidar-based ground truth while providing high visual fidelity, as validated through image similarity metrics and lidar reflectivity analysis.
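The label-projection step mentioned in the abstract (mapping 2D material masks onto 3D geometry) can be illustrated with a minimal pinhole-camera sketch. This is not the thesis implementation; the function name, array shapes, and the single-view, per-vertex lookup are illustrative assumptions.

```python
import numpy as np

def project_labels(vertices_cam, K, label_mask):
    """Assign each 3D vertex the material ID of the pixel it projects to.

    vertices_cam: (N, 3) vertex positions in the camera frame (z > 0).
    K:            (3, 3) pinhole camera intrinsics.
    label_mask:   (H, W) integer material-ID image from a 2D segmentation model.
    Returns an (N,) array of material IDs; -1 for vertices projecting
    outside the image (illustrative convention, not from the thesis).
    """
    H, W = label_mask.shape
    # Perspective projection: pixel = K @ (x, y, z), then divide by depth z.
    uvw = vertices_cam @ K.T
    uv = np.rint(uvw[:, :2] / uvw[:, 2:3]).astype(int)
    labels = np.full(len(vertices_cam), -1, dtype=int)
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < W) & (uv[:, 1] >= 0) & (uv[:, 1] < H)
    # Note image indexing is (row, col) = (v, u).
    labels[inside] = label_mask[uv[inside, 1], uv[inside, 0]]
    return labels
```

A full pipeline would additionally handle occlusion (e.g. via a depth buffer) and fuse labels across multiple camera views; the sketch shows only the geometric projection and lookup for one view.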
