Title: Deep limits of residual neural networks
Authors: Thorpe, Matthew (The University of Manchester; The Alan Turing Institute); van Gennip, Y. (TU Delft, Mathematical Physics)
Date: 2023
Abstract: Neural networks have been very successful in many applications; we often, however, lack a theoretical understanding of what the neural networks are actually learning. This problem emerges when trying to generalise to new data sets. The contribution of this paper is to show that, for the residual neural network model, the deep layer limit coincides with a parameter estimation problem for a nonlinear ordinary differential equation. In particular, whilst it is known that the residual neural network model is a discretisation of an ordinary differential equation, we show convergence in a variational sense. This implies that optimal parameters converge in the deep layer limit. This is a stronger statement than saying that, for a fixed parameter, the residual neural network model converges (the latter does not in general imply the former). Our variational analysis provides a discrete-to-continuum Γ-convergence result for the objective function of the residual neural network training step to a variational problem constrained by a system of ordinary differential equations; this rigorously connects the discrete setting to a continuum problem.
Subjects: Deep layer limits; Deep neural networks; Gamma-convergence; Ordinary differential equations; Regularity; Variational convergence
To reference this document use: http://resolver.tudelft.nl/uuid:d6fa0e23-1c2b-48f2-a676-2d843b109342
DOI: https://doi.org/10.1007/s40687-022-00370-y
ISSN: 2522-0144
Source: Research in Mathematical Sciences, 10 (1)
Part of collection: Institutional Repository
Document type: journal article
Rights: © 2023 Matthew Thorpe, Y. van Gennip
Files: s40687_022_00370_y.pdf (815.41 KB)
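The abstract notes that the residual neural network model is a discretisation of an ordinary differential equation. A minimal sketch of this correspondence (not taken from the paper; the tanh residual map `f` and shared layer parameters are illustrative assumptions) shows one residual block acting as a forward-Euler step of the ODE dx/dt = f(x), with the depth N playing the role of the number of time steps:

```python
import numpy as np

def f(x, W, b):
    """Illustrative residual map: a tanh layer with weights W and bias b."""
    return np.tanh(W @ x + b)

def resnet_forward(x0, W, b, N, T=1.0):
    """Propagate x0 through N residual blocks with shared parameters (W, b).

    Each block computes x <- x + h * f(x), i.e. one forward-Euler step of
    size h = T / N for the ODE dx/dt = f(x). As N grows, the output
    approximates the time-T solution of that ODE.
    """
    h = T / N
    x = x0
    for _ in range(N):
        x = x + h * f(x, W, b)  # one Euler step per residual block
    return x
```

Refining the depth (increasing N while shrinking h = T/N) makes the discrete forward pass converge to the continuum ODE flow; the paper's contribution is the stronger variational (Γ-convergence) statement that the *trained* parameters also converge in this deep layer limit.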