AMS-Net: Adaptive Multiscale Sparse Neural Network with Interpretable Basis Expansion for Multiphase Flow Problems
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s): Yating Wang; Wing Tat Leung; Guang Lin
Detail(s)
Original language | English |
---|---|
Pages (from-to) | 618-640 |
Journal / Publication | Multiscale Modeling and Simulation |
Volume | 20 |
Issue number | 2 |
Online published | 23 Jun 2022 |
Publication status | Published - 2022 |
Externally published | Yes |
Abstract
In this work, we propose an adaptive sparse learning algorithm that learns the physical processes and obtains a sparse representation of the solution given a large snapshot space. We assume there is a rich class of precomputed basis functions that can approximate the quantity of interest. For instance, in the simulation of a multiscale flow system, one can adopt mixed multiscale methods to compute velocity bases from local problems and apply the proper orthogonal decomposition method to construct bases for the saturation equation. We then design a neural network architecture to learn the coefficients of solutions in the spaces spanned by these basis functions. Information about the basis functions is incorporated in the loss function, which minimizes the differences between the downscaled reduced-order solutions and reference solutions at multiple time steps. The network contains multiple submodules, and the solutions at different time steps can be learned simultaneously. We propose strategies in the learning framework to identify important degrees of freedom. To find a sparse solution representation, a soft-thresholding operator is applied to enforce the sparsity of the output coefficient vectors of the neural network. To avoid oversimplification and enrich the approximation space, some degrees of freedom can be added back to the system through a greedy algorithm. In both scenarios, i.e., removing and adding degrees of freedom, the corresponding network connections are pruned or reactivated, guided by the magnitude of the solution coefficients obtained from the network outputs. The proposed adaptive learning process is applied to toy examples to demonstrate that it achieves good basis selection and accurate approximation. Further numerical tests on two-phase multiscale flow problems show the capability and interpretability of the proposed method in complicated applications.
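The abstract's sparsification step relies on a soft-thresholding operator applied to the network's output coefficient vector, with degrees of freedom pruned or reactivated according to coefficient magnitude. A minimal sketch of these two operations is shown below; the function names, the threshold value, and the tolerance parameter are illustrative choices, not taken from the paper.

```python
import numpy as np

def soft_threshold(c, lam):
    # Soft-thresholding operator: S_lam(c)_i = sign(c_i) * max(|c_i| - lam, 0).
    # Shrinks every coefficient toward zero and sets small ones exactly to
    # zero, which is what enforces sparsity of the output coefficient vector.
    return np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)

def active_dofs(c, tol=0.0):
    # Indices of degrees of freedom whose coefficient magnitude survives
    # thresholding; the corresponding network connections would be kept
    # active, while the rest are candidates for pruning.
    return np.flatnonzero(np.abs(c) > tol)

# Hypothetical coefficient vector produced by the network's output layer.
coeffs = np.array([0.9, -0.05, 0.3, -0.7, 0.02])
sparse = soft_threshold(coeffs, 0.1)  # small entries zeroed, others shrunk by 0.1
```

In the adaptive loop described in the abstract, a greedy step would then re-add degrees of freedom (reactivate pruned connections) when doing so reduces the loss, so the approximation space is not oversimplified.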
Research Area(s)
- adaptive method, deep neural network, interpretable machine learning, model reduction, multiscale flow dynamics, sparse learning
Citation Format(s)
AMS-Net: Adaptive Multiscale Sparse Neural Network with Interpretable Basis Expansion for Multiphase Flow Problems. / Wang, Yating; Leung, Wing Tat; Lin, Guang.
In: Multiscale Modeling and Simulation, Vol. 20, No. 2, 2022, p. 618-640.