Towards Arbitrary Local Editing of Neural 3D Shape Representations
R.C. Webb (TU Delft - Mechanical Engineering)
C.A. Raman – Mentor (TU Delft - Pattern Recognition and Bioinformatics)
Holger Caesar – Mentor (TU Delft - Intelligent Vehicles)
P. Kellnhofer – Graduation committee member (TU Delft - Computer Graphics and Visualisation)
Abstract
The recent explosion of research on 3D generative AI has shown that learning-based editing methods can generate a wide range of shapes intuitively and with low effort. However, current methods offer limited local editing control, both because of the coarse shape representations they build on and because shared geometry information propagates deformations to a larger region. Methods for theoretically unlimited fine editing exist, but they generally lack shape understanding, which is essential for reducing the effort involved in 3D shape editing. We propose a flexible shape representation combined with an editing procedure to produce theoretically arbitrary fine local edits with learned shape understanding. Our work builds upon SPAGHETTI, a learning-based shape representation and generation method that enables meaningful part mixing and interpolation. Our method enables the selection of a local surface region, which can be transferred to a target shape through mixing or interpolation guided by learned semantics. To empirically assess the locality of edits, we propose a new metric, which we use to evaluate two baseline local shape mixing and interpolation methods. Our locality metric and surface displacement visualizations show that our method achieves more localized edits than the baselines, which exhibit significant deformation propagation.
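To give a rough intuition for part-level mixing and interpolation over a learned representation, the sketch below linearly blends only a selected subset of per-part latent codes between a source and a target shape. This is a hypothetical illustration, not the thesis's actual pipeline or SPAGHETTI's API; all function and variable names are assumptions.

```python
import numpy as np

def blend_part_latents(source_parts, target_parts, selected, t):
    """Illustrative sketch: interpolate only the selected part latents,
    leaving the rest of the target shape's latents untouched.

    source_parts / target_parts: (num_parts, dim) latent arrays (hypothetical).
    selected: indices of the parts chosen for the local edit.
    t: interpolation factor in [0, 1]; t=1 fully transfers the source parts.
    """
    blended = target_parts.copy()
    blended[selected] = (1.0 - t) * target_parts[selected] + t * source_parts[selected]
    return blended

# Toy usage: two shapes, each with 4 part latents of dimension 3.
rng = np.random.default_rng(0)
src = rng.normal(size=(4, 3))
tgt = rng.normal(size=(4, 3))
out = blend_part_latents(src, tgt, selected=[1], t=0.5)
```

In such a scheme, locality would hinge on the decoder confining each part latent's influence to its own surface region; the thesis's contribution addresses precisely the deformation propagation that naive blending of this kind can cause.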