Neural networks meet physics-based material models

Accelerating concurrent multiscale simulations of path-dependent composite materials


Abstract

Concurrent (FE2) multiscale modeling is an increasingly popular approach for modeling complex materials. It is especially well suited for composites, whose complex microstructure can be explicitly modeled and nested at each integration point of the macroscale. However, this generality often comes with exceedingly high computational costs for real-scale applications. One way to tackle the issue is to employ a cheaper-to-evaluate surrogate model of the microstructure, built from a few observations of the high-fidelity solution. Neural Networks (NNs) are by far the most popular technique for building such constitutive surrogates. However, conventional NNs assume a unique mapping between strains and stresses, limiting their ability to reproduce path-dependent behavior. Moreover, their purely data-driven nature severely limits their ability to extrapolate beyond their training space. To circumvent these drawbacks, the alternative explored in this work is to reintroduce some of the physics-based knowledge of the problem into the NN. This is done by employing the actual material models used in the full-order micromodel as the activation functions of one of the layers of the network. Path dependency then arises naturally, since every material model in the layer carries its own internal variables. To assess its capabilities, the network is employed as the surrogate model for a composite Representative Volume Element with elastic fibers and an elasto-plastic matrix material. First, for a single micromodel, the performance of the network is compared to that of a state-of-the-art Recurrent Neural Network (RNN) in a number of scenarios that are challenging for data-driven models. Then, the proposed framework is applied to an FE2 example and the results are compared to the full-order solution in terms of accuracy and computational cost.
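The core idea of using material models as stateful activation functions can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a hypothetical 1D linear-hardening elasto-plasticity model as the "activation" of each unit, with fixed (untrained) encoder/decoder weights, purely to show how internal variables in the layer make the response path-dependent.

```python
import numpy as np

class Plasticity1D:
    """Hypothetical 1D elasto-plastic model with linear hardening, used as a
    stateful 'activation function'. Its internal variables (plastic strain,
    accumulated plastic strain) persist between calls."""
    def __init__(self, E=1.0, sigma_y=0.5, H=0.1):
        self.E, self.sigma_y, self.H = E, sigma_y, H
        self.eps_p = 0.0   # internal variable: plastic strain
        self.alpha = 0.0   # internal variable: accumulated plastic strain

    def __call__(self, eps):
        sig = self.E * (eps - self.eps_p)            # elastic trial stress
        f = abs(sig) - (self.sigma_y + self.H * self.alpha)
        if f > 0.0:                                  # plastic step: return mapping
            dgamma = f / (self.E + self.H)
            self.eps_p += dgamma * np.sign(sig)
            self.alpha += dgamma
            sig = self.E * (eps - self.eps_p)
        return sig

class MaterialLayer:
    """Layer whose units are independent material models. Input weights map the
    macroscopic strain to fictitious unit strains; output weights combine the
    resulting unit stresses. Weights are fixed here for illustration."""
    def __init__(self, w_in, w_out):
        self.units = [Plasticity1D() for _ in w_in]
        self.w_in, self.w_out = w_in, w_out

    def __call__(self, eps_macro):
        unit_strains = self.w_in * eps_macro
        unit_stresses = np.array([m(e) for m, e in zip(self.units, unit_strains)])
        return float(self.w_out @ unit_stresses)

# Path-dependency demo: the same macroscopic strain (0.5) gives different
# stresses on first loading versus after loading to 1.0 and unloading,
# even though the layer was never given history as an explicit input.
layer = MaterialLayer(w_in=np.array([1.0, 0.8, 1.2]),
                      w_out=np.array([0.4, 0.3, 0.3]))
path = [0.0, 0.5, 1.0, 0.5]
stresses = [layer(e) for e in path]
```

Because the state lives inside the material models rather than in the data, unloading branches emerge from the embedded constitutive equations themselves, which is why no unloading paths are needed in the training set.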
An important outcome of the physics-infused network is its ability to naturally predict unloading/reloading behavior without ever seeing it during training, in stark contrast with popular but data-hungry models such as RNNs.