PSSNet

Planarity-sensible Semantic Segmentation of large-scale urban meshes

Journal Article (2023)
Authors

Weixiao Gao (TU Delft - Urban Data Science)

L. Nan (TU Delft - Urban Data Science)

Bas Boom (CycloMedia Technology)

H. Ledoux (TU Delft - Urban Data Science)

Research Group
Urban Data Science
Copyright
© 2023 W. Gao, L. Nan, Bas Boom, H. Ledoux
Publication Year
2023
Language
English
Volume number
196
Pages (from-to)
32-44
DOI
https://doi.org/10.1016/j.isprsjprs.2022.12.020
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

We introduce a novel deep learning-based framework to interpret 3D urban scenes represented as textured meshes. Based on the observation that object boundaries typically align with the boundaries of planar regions, our framework achieves semantic segmentation in two steps: planarity-sensible over-segmentation followed by semantic classification. The over-segmentation step generates an initial set of mesh segments that capture the planar and non-planar regions of urban scenes. In the subsequent classification step, we construct a graph that encodes the geometric and photometric features of the segments in its nodes and the multi-scale contextual features in its edges. The final semantic segmentation is obtained by classifying the segments with a graph convolutional network. Experiments and comparisons on two semantic urban mesh benchmarks demonstrate that our approach outperforms state-of-the-art methods in terms of boundary quality, mean IoU (intersection over union), and generalization ability. We also introduce several new metrics for evaluating mesh over-segmentation methods in the context of semantic segmentation, and our over-segmentation approach achieves the best scores on all of them. Our source code is available at https://github.com/WeixiaoGao/PSSNet.
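
To make the classification step concrete, below is a minimal PyTorch sketch of the kind of segment-graph classifier the abstract describes: each node carries a segment's geometric and photometric features, each edge carries contextual features, and a small message-passing network predicts one semantic label per segment. The layer design, feature dimensions, and names such as SegmentGraphConv are illustrative assumptions rather than the authors' actual architecture; see the linked repository for the real implementation.

import torch
import torch.nn as nn

class SegmentGraphConv(nn.Module):
    # One message-passing layer over the segment graph (illustrative, not
    # the paper's layer): each segment aggregates messages from its
    # neighbours, where a message combines the neighbour's node features
    # with the features of the connecting edge.
    def __init__(self, node_dim, edge_dim, out_dim):
        super().__init__()
        self.msg = nn.Linear(node_dim + edge_dim, out_dim)
        self.upd = nn.Linear(node_dim + out_dim, out_dim)

    def forward(self, x, edge_index, edge_attr):
        # x:          (N, node_dim) per-segment geometric + photometric features
        # edge_index: (2, E) directed edges (source, target) between segments
        # edge_attr:  (E, edge_dim) per-edge contextual features
        src, dst = edge_index
        msgs = torch.relu(self.msg(torch.cat([x[src], edge_attr], dim=-1)))
        agg = torch.zeros(x.size(0), msgs.size(-1), device=x.device)
        agg.index_add_(0, dst, msgs)  # sum incoming messages per segment
        return torch.relu(self.upd(torch.cat([x, agg], dim=-1)))

class SegmentClassifier(nn.Module):
    # Two message-passing layers followed by a per-segment classifier head.
    def __init__(self, node_dim, edge_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = SegmentGraphConv(node_dim, edge_dim, hidden_dim)
        self.conv2 = SegmentGraphConv(hidden_dim, edge_dim, hidden_dim)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, edge_index, edge_attr):
        x = self.conv1(x, edge_index, edge_attr)
        x = self.conv2(x, edge_index, edge_attr)
        return self.head(x)  # (N, num_classes): one label per segment

# Toy usage: 4 segments, 3 undirected edges stored as 6 directed ones.
x = torch.randn(4, 16)                    # hypothetical node features
ei = torch.tensor([[0, 1, 1, 2, 2, 3],
                   [1, 0, 2, 1, 3, 2]])   # edge_index
ea = torch.randn(6, 8)                    # hypothetical edge features
model = SegmentClassifier(node_dim=16, edge_dim=8, hidden_dim=32, num_classes=6)
logits = model(x, ei, ea)                 # -> shape (4, 6)

Because labels are predicted per segment rather than per face, every face inside a segment inherits the same class, which is what keeps the predicted boundaries aligned with the planar-region boundaries produced by the over-segmentation step.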