
Self-consistency Reinforced minimal Gated Recurrent Unit for surrogate modeling of history-dependent non-linear problems: Application to history-dependent homogenized response of heterogeneous materials.

Authors :
Wu, Ling
Noels, Ludovic
Source :
Computer Methods in Applied Mechanics & Engineering, May 2024, Vol. 424.
Publication Year :
2024

Abstract

Multi-scale simulations can be accelerated by substituting the meso-scale problem resolution with a surrogate trained from off-line simulations. In the context of history-dependent materials, Recurrent Neural Networks (RNNs) have widely been considered for this role, since their hidden variables provide a memory effect. However, defining a training dataset that virtually covers all the possible strain–stress state evolutions encountered during the online phase remains a daunting task. This is particularly true when the strain increment size is expected to vary by several orders of magnitude. Self-consistent recurrent networks were thus introduced by Bonatti and Mohr (2022) to reinforce the self-consistency of the neural network with respect to the input increment size when acting as a surrogate of an elasto-plastic material model. When designing an RNN to act as a surrogate of a meso-scale Boundary Value Problem (BVP) defined by a Representative Volume Element (RVE) of a complex micro-structure, the number of learnable parameters required for existing RNNs to be accurate can remain high, impeding the training performance. In this work, we revisit and design alternative self-consistent recurrent units in order to limit the number of hidden variables required for the neural network to act as a composite material surrogate in multi-scale simulations. The RNNs based on the newly suggested self-consistency reinforced recurrent units have a reduced number of learnable parameters, yielding good training performance, while remaining accurate in multi-scale simulations involving various strain increment sizes.

• Recurrent cells are designed for neural network surrogates in multi-scale analyses.
• Variation of input increment sizes is accounted for through a self-consistent cell.
• Recurrent cell and network are optimized to reduce the number of learnable parameters.
• The neural network is trained with the homogenized response of elasto-plastic composites.
• Training performance and prediction accuracy are shown to have been optimized.
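
To make the idea of increment-size self-consistency concrete, the following is a minimal sketch (in PyTorch) of a one-gate recurrent cell whose update gate is scaled by the norm of the strain increment, so that a vanishing increment leaves the hidden state unchanged and sub-stepping an increment produces a comparable update. This is an illustrative assumption of how such a cell could look, not the authors' published architecture; the class name, layer names, and scaling rule are hypothetical.

```python
# Illustrative sketch only: a minimal gated recurrent cell with an
# increment-size-dependent gate. Not the cell proposed in the paper.
import torch
import torch.nn as nn

class SelfConsistentMinimalGRUCell(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        # Single update/forget gate, as in a minimal GRU.
        self.gate = nn.Linear(input_dim + hidden_dim, hidden_dim)
        self.candidate = nn.Linear(input_dim + hidden_dim, hidden_dim)

    def forward(self, d_eps: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # d_eps: strain increment, shape (batch, input_dim)
        # h:     hidden state,     shape (batch, hidden_dim)
        z = torch.cat([d_eps, h], dim=-1)
        # Scale the gate by the increment norm: a zero increment gives a zero
        # gate, hence an unchanged hidden state. This is one possible way to
        # encourage self-consistency w.r.t. increment size (an assumption,
        # not the paper's rule).
        step = torch.linalg.vector_norm(d_eps, dim=-1, keepdim=True)
        f = torch.sigmoid(self.gate(z)) * torch.tanh(step)
        h_tilde = torch.tanh(self.candidate(torch.cat([d_eps, f * h], dim=-1)))
        return (1.0 - f) * h + f * h_tilde
```

In a surrogate of a homogenized RVE response, such a cell would be applied once per strain increment of the macro-scale solver, with a decoder mapping the hidden state to the homogenized stress; the single-gate structure keeps the learnable parameter count below that of a standard GRU or LSTM of the same hidden size.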

Details

Language :
English
ISSN :
0045-7825
Volume :
424
Database :
Academic Search Index
Journal :
Computer Methods in Applied Mechanics & Engineering
Publication Type :
Academic Journal
Accession number :
176247380
Full Text :
https://doi.org/10.1016/j.cma.2024.116881