One model, denoise them all!
A Comprehensive Investigation of Denoising Transfer Learning
D. Mullaj (TU Delft - Electrical Engineering, Mathematics and Computer Science)
Osman Semih Kayhan – Mentor (TU Delft - Pattern Recognition and Bioinformatics)
J.C. van Gemert – Mentor (TU Delft - Pattern Recognition and Bioinformatics)
M.M. De Weerdt – Graduation committee member (TU Delft - Algorithmics)
Abstract
Deep convolutional neural networks (CNNs) have achieved state-of-the-art performance in image denoising, but they require large datasets for training, and their performance remains limited on smaller real-noise datasets. In this paper, we investigate robust deep learning denoising using transfer learning. We explore the impact of dataset size, CNN parameter updates, and noise distribution similarity. Our findings demonstrate that finetuning the decoder while freezing the encoder of an encoder-decoder network architecture avoids overfitting during transfer learning. Moreover, we introduce the concept of noise similarity in transfer learning, showing that reducing the similarity distance between pre-training and finetuning noise significantly enhances CNN denoising performance. We demonstrate our results on multiple datasets and various camera noise models. Finally, we validate the robustness and applicability of our approach on real-noise images. Our results and analyses hold and generalize across different datasets, underscoring the potential of transfer learning for image denoising.
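The abstract's core recipe is to keep a pre-trained encoder fixed and update only the decoder when finetuning on a small real-noise dataset. Below is a minimal PyTorch sketch of that idea; the `DenoisingAutoencoder` architecture, layer sizes, loss, and checkpoint name are illustrative assumptions, not the network actually used in the paper.

```python
import torch
import torch.nn as nn

# Toy encoder-decoder denoiser, for illustration only; the paper's
# architecture may differ in depth, width, and skip connections.
class DenoisingAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = DenoisingAutoencoder()
# Hypothetical pre-trained weights from the large (e.g. synthetic-noise) dataset:
# model.load_state_dict(torch.load("pretrained_denoiser.pth"))

# Freeze the encoder so its parameters receive no gradient updates.
for param in model.encoder.parameters():
    param.requires_grad = False

# Finetune only the decoder on the small real-noise dataset.
optimizer = torch.optim.Adam(model.decoder.parameters(), lr=1e-4)
criterion = nn.L1Loss()

# One illustrative finetuning step on a (noisy, clean) image pair.
noisy = torch.randn(1, 3, 64, 64)
clean = torch.randn(1, 3, 64, 64)
loss = criterion(model(noisy), clean)
loss.backward()
optimizer.step()
```

Restricting the optimizer to the decoder parameters keeps the learned feature extractor intact, which is the mechanism the abstract credits with avoiding overfitting on small finetuning sets.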