iPINNs: incremental learning for Physics-informed neural networks

Journal Article (2024)
Author(s)

A. Dekhovich (TU Delft - Team Marcel Sluiter, TU Delft - Team Michel Verhaegen)

M.H.F. Sluiter (TU Delft - Team Marcel Sluiter)

David M.J. Tax (TU Delft - Pattern Recognition and Bioinformatics)

M. A. Bessa (Brown University)

Research Group
Team Marcel Sluiter
DOI
https://doi.org/10.1007/s00366-024-02010-1
Publication Year
2024
Language
English
Bibliographical Note
Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project https://www.openaccess.nl/en/you-share-we-take-care. Otherwise as indicated in the copyright section: the publisher is the copyright holder of this work and the author uses Dutch legislation to make this work public.
Issue number
1
Volume number
41 (2025)
Pages (from-to)
389-402
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs). However, finding a set of neural network parameters that satisfies a PDE at the boundary and within the domain of interest can be challenging and non-unique due to the complexity of the loss landscape that needs to be traversed. Although a variety of multi-task learning and transfer learning approaches have been proposed to overcome these issues, no incremental training procedure exists for PINNs. As demonstrated herein, by developing incremental PINNs (iPINNs) we can effectively mitigate such training challenges and learn multiple tasks (equations) sequentially without additional parameters for new tasks. Interestingly, we show that this also improves performance for every equation in the sequence. Our approach learns multiple PDEs starting from the simplest one, creating a subnetwork for each PDE and allowing each subnetwork to overlap with previously learned subnetworks. We demonstrate that previous subnetworks are a good initialization for a new equation if the PDEs share similarities. We also show that iPINNs achieve lower prediction error than regular PINNs in two scenarios: (1) learning a family of equations (e.g., the 1-D convection PDE); and (2) learning PDEs resulting from a combination of processes (e.g., the 1-D reaction–diffusion PDE). The ability to learn all problems with a single network, together with better generalization on more complex PDEs than regular PINNs achieve, opens new avenues in this field.
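
For intuition, the following is a minimal sketch of the subnetwork mechanism the abstract describes: one shared network, a binary mask per equation defining its subnetwork, overlap between the masks, and each new equation starting from the weights trained so far. This is not the authors' implementation; the magnitude-based mask selection, the 1-D convection residual u_t + beta * u_x = 0, and all layer sizes, iteration counts, and names are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedMLP(nn.Module):
    """Shared MLP whose weights are gated by a per-task binary mask."""

    def __init__(self, layers=(2, 50, 50, 1)):
        super().__init__()
        self.linears = nn.ModuleList(
            [nn.Linear(a, b) for a, b in zip(layers[:-1], layers[1:])]
        )
        self.masks = {}  # task_id -> one 0/1 tensor per weight matrix

    def add_task(self, task_id, keep_frac=0.5):
        # Hypothetical mask choice: keep the largest-magnitude weights.
        # (The paper derives subnetworks with its own selection procedure.)
        masks = []
        for lin in self.linears:
            w = lin.weight.detach().abs()
            k = max(1, int(keep_frac * w.numel()))
            thresh = w.flatten().kthvalue(w.numel() - k + 1).values
            masks.append((w >= thresh).float())
        self.masks[task_id] = masks

    def forward(self, xt, task_id):
        h = xt
        for i, lin in enumerate(self.linears):
            # Only the task's subnetwork contributes (and receives gradients).
            h = F.linear(h, lin.weight * self.masks[task_id][i], lin.bias)
            if i < len(self.linears) - 1:
                h = torch.tanh(h)
        return h

def convection_residual(model, task_id, beta):
    # Residual of u_t + beta * u_x = 0 at random collocation points (x, t).
    xt = torch.rand(256, 2, requires_grad=True)
    u = model(xt, task_id)
    g = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
    u_x, u_t = g[:, :1], g[:, 1:]
    return ((u_t + beta * u_x) ** 2).mean()

model = MaskedMLP()
for task_id, beta in enumerate([1.0, 2.0, 5.0]):  # simplest equation first
    model.add_task(task_id)  # new subnetwork, seeded by previous training
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(2000):
        opt.zero_grad()
        loss = convection_residual(model, task_id, beta)
        # A full PINN loss would add initial/boundary-condition terms here.
        loss.backward()
        opt.step()

Note that this simplified sketch lets a later equation freely retrain weights shared with earlier subnetworks; the full iPINN procedure additionally manages overlapping connections so that, per the abstract, performance on earlier equations is preserved and even improved.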

Files

S00366-024-02010-1.pdf
(pdf | 2.49 MB)
- Embargo expired on 22-12-2024
License info not available