BAYESIAN DEEP OPERATOR LEARNING FOR HOMOGENIZED TO FINE-SCALE MAPS FOR MULTISCALE PDE

Zecheng ZHANG, Christian MOYA, Wing Tat LEUNG, Guang LIN, Hayden SCHAEFFER

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

1 Citation (Scopus)

Abstract

We present a new framework for computing fine-scale solutions of multiscale partial differential equations (PDEs) using operator learning tools. Obtaining fine-scale solutions of multiscale PDEs can be challenging, but there are many inexpensive computational methods for obtaining coarse-scale solutions. Additionally, in many real-world applications, fine-scale solutions can only be observed at a limited number of locations. In order to obtain approximations or predictions of fine-scale solutions over general regions of interest, we propose to learn the operator mapping from coarse-scale solutions to fine-scale solutions using observations of a limited number of (possibly noisy) fine-scale solutions. The approach is to train multi-fidelity homogenization maps using mathematically motivated neural operators. The operator learning framework can efficiently obtain the solution of multiscale PDEs at any arbitrary point, making our proposed framework a mesh-free solver. We verify our results on multiple numerical examples, showing that our approach is an efficient mesh-free solver for multiscale PDEs. © 2024 by SIAM.
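To make the coarse-to-fine operator map concrete, the following is a minimal sketch in the spirit of a branch-trunk (DeepONet-style) neural operator: the branch network encodes the coarse-scale solution sampled at fixed sensor locations, the trunk network encodes an arbitrary query point, and their combination predicts the fine-scale solution at that point, which is what makes the learned solver mesh-free. This is an illustrative assumption based on the abstract, not the authors' implementation; all layer sizes, names, and the toy training data are hypothetical.

```python
# Minimal sketch (hypothetical, not the paper's code): map a coarse-scale
# solution sampled at m sensors, plus a query point x, to the fine-scale
# solution value at x.
import torch
import torch.nn as nn

class CoarseToFineDeepONet(nn.Module):
    def __init__(self, n_sensors: int, dim: int = 2, width: int = 64, p: int = 32):
        super().__init__()
        # Branch net: encodes coarse-scale solution values at the sensors.
        self.branch = nn.Sequential(
            nn.Linear(n_sensors, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        # Trunk net: encodes the query coordinate, so the model can be
        # evaluated at any point (mesh-free prediction).
        self.trunk = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, coarse_vals: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # coarse_vals: (batch, n_sensors) coarse-scale solution samples
        # x:           (batch, dim) query points
        b = self.branch(coarse_vals)
        t = self.trunk(x)
        return (b * t).sum(dim=-1, keepdim=True) + self.bias

# Toy usage: fit to a handful of (possibly noisy) fine-scale observations.
model = CoarseToFineDeepONet(n_sensors=100)
coarse = torch.randn(8, 100)   # coarse-scale solution at 100 sensor points
x = torch.rand(8, 2)           # locations where fine-scale data is observed
u_fine = torch.randn(8, 1)     # observed (noisy) fine-scale values
loss = nn.functional.mse_loss(model(coarse, x), u_fine)
loss.backward()
```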
Original language: English
Pages (from-to): 956-972
Number of pages: 17
Journal: Multiscale Modeling & Simulation
Volume: 22
Issue number: 3
Online published: 17 Jul 2024
DOIs
Publication status: Published - Sept 2024

Research Keywords

  • neural operator
  • neural homogenization
  • multiscale finite element method
  • discretization invariant
  • multi-fidelity
