DenseUNets with feedback non-local attention for the segmentation of specular microscopy images of the corneal endothelium with guttae

Journal Article (2022)
Author(s)

Juan P. Vigueras-Guillén (TU Delft - ImPhys/Computational Imaging)

Jeroen van Rooij (Rotterdam Eye Hospital)

Bart T.H. van Dooren (Amphia Hospital, Erasmus MC)

Hans G. Lemij (Rotterdam Eye Hospital)

Esma Islamaj (Rotterdam Eye Hospital)

Lucas J. van Vliet (TU Delft - ImPhys/Computational Imaging)

Koenraad A. Vermeer (TU Delft - ImPhys/Computational Imaging, Novo Research Consultancy, Voorburg)

Research Group
ImPhys/Computational Imaging
DOI
https://doi.org/10.1038/s41598-022-18180-1
Publication Year
2022
Language
English
Issue number
1
Volume number
12
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Corneal guttae, abnormal growths of extracellular matrix in the corneal endothelium, are observed in specular images as black droplets that occlude the endothelial cells. To estimate the corneal parameters (endothelial cell density [ECD], coefficient of variation [CV], and hexagonality [HEX]), we propose a new deep learning method that includes a novel attention mechanism (named fNLA), which helps to infer the cell edges in the occluded areas. The approach first derives the cell edges, then infers the well-detected cells, and finally applies a postprocessing step to correct mistakes. This yields a binary segmentation from which the corneal parameters are estimated. We analyzed 1203 images (500 of which contained guttae) obtained with a Topcon SP-1P microscope. To generate the ground truth, we performed manual segmentation of all images. Several networks were evaluated (UNet, ResUNeXt, DenseUNets, UNet++, etc.), and we found that DenseUNets with fNLA provided the lowest error: a mean absolute error of 23.16 [cells/mm2] in ECD, 1.28 [%] in CV, and 3.13 [%] in HEX. Compared with Topcon’s built-in software, our error was 3–6 times smaller. Overall, our approach handled cells affected by guttae notably well, detecting cell edges partially occluded by small guttae and discarding large areas covered by extensive guttae.
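For readers unfamiliar with the three corneal parameters, the sketch below shows how they are commonly defined once per-cell measurements have been extracted from a binary segmentation. The function name, its inputs, and the exact definitions (ECD from the mean cell area, CV as relative area spread, HEX as the fraction of six-sided cells) are illustrative assumptions for this note, not the paper's implementation.

```python
import numpy as np

def corneal_parameters(cell_areas_um2, neighbor_counts):
    """Estimate ECD, CV, and HEX from per-cell measurements.

    cell_areas_um2  : per-cell areas in square micrometres, taken from
                      the segmented endothelial mosaic (assumed given)
    neighbor_counts : number of neighbouring cells of each cell
    """
    areas = np.asarray(cell_areas_um2, dtype=float)
    # ECD [cells/mm2]: 1 mm2 = 1e6 um2, divided by the mean cell area
    ecd = 1e6 / areas.mean()
    # CV [%]: relative spread of cell areas (polymegethism)
    cv = 100.0 * areas.std() / areas.mean()
    # HEX [%]: fraction of six-sided cells (pleomorphism)
    hex_ = 100.0 * np.mean(np.asarray(neighbor_counts) == 6)
    return ecd, cv, hex_

# Toy example: four equally sized 400-um2 cells, two of them hexagonal
ecd, cv, hex_ = corneal_parameters([400, 400, 400, 400], [6, 6, 5, 7])
# ecd = 2500 cells/mm2, cv = 0 %, hex_ = 50 %
```

In practice the per-cell areas and neighbour counts would come from the labeled segmentation itself (e.g. via connected components and region adjacency), which is where the occlusion handling described in the abstract matters: cells discarded under extensive guttae must be excluded from these statistics.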