TY - JOUR
T1 - Efficient hybrid explicit-implicit learning for multiscale problems
AU - Efendiev, Yalchin
AU - Leung, Wing Tat
AU - Lin, Guang
AU - Zhang, Zecheng
PY - 2022/10/15
Y1 - 2022/10/15
N2 - The splitting method is a powerful technique for handling application problems by splitting physics, scales, domains, and so on. Many splitting algorithms have been designed for efficient temporal discretization. In this paper, our goal is to use temporal splitting concepts in designing machine learning algorithms and, at the same time, to help splitting algorithms by incorporating data and speeding them up. We propose a machine-learning-assisted splitting scheme that improves efficiency while preserving accuracy. We consider a recently introduced multiscale splitting algorithm in which the multiscale problem is solved on a coarse grid. To approximate the dynamics, only a few degrees of freedom are solved implicitly, while the others are solved explicitly. This splitting concept allows identifying the degrees of freedom that need implicit treatment. In this paper, we use this splitting concept in machine learning and propose several strategies. First, the implicit part of the solution can be learned, since it is more difficult to solve, while the explicit part can be computed directly. This provides a speed-up and allows data incorporation for splitting approaches. Secondly, one can design a hybrid neural network architecture, because handling the explicit parts requires far fewer communications among neurons and can be done efficiently. Thirdly, one can solve the coarse-grid component via PDEs or other approximation methods and construct simpler neural networks for the explicit part of the solution. We discuss these options and implement one of them by interpreting it as a machine translation task. This interpretation of the splitting scheme enables us to use the Transformer, since it can perform model reduction for multiple time series and learn the connections between them. We also find that the splitting scheme is a good platform for predicting the coarse solution with insufficient information about the target model: the target problem is only partially given, and we need to solve it through a known problem that approximates the target. Our machine learning model can incorporate and encode the given information from the two different problems and then solve the target problem. We conduct four numerical examples, and the results show that our method is stable and accurate.
KW - CEM-GMsFEM
KW - Deep learning
KW - GMsFEM
KW - Multiscale
KW - Partially explicit
UR - http://www.scopus.com/inward/record.url?scp=85133854495&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85133854495&origin=recordpage
U2 - 10.1016/j.jcp.2022.111326
DO - 10.1016/j.jcp.2022.111326
M3 - RGC 21 - Publication in refereed journal
SN - 0021-9991
VL - 467
JO - Journal of Computational Physics
JF - Journal of Computational Physics
M1 - 111326
ER -