Measuring the Performance of Simulation Metamodel
Description

Simulation metamodels are widely used as computationally inexpensive surrogates for complex simulation models. While a metamodel offers computational advantages, it also raises doubt about its fidelity, i.e., whether it is a good representation of the simulation model for the intended purpose. Measuring a metamodel's performance is therefore a necessary step in simulation metamodeling. The problem is challenging because the functional form of the simulation model is typically unknown. In this project, we propose a new measure of simulation metamodel performance that does not require knowledge of the simulation model's functional form. The proposed measure serves as an absolute measure of metamodel performance that can inform the simulation user about the metamodel's fidelity.

We plan to extend the method to the construction of misspecification tests for metamodels and to two further applications: input uncertainty quantification and Monte Carlo model calibration. In input uncertainty quantification, we plan to apply the method to estimate the moments of the simulation response as a function of the input modeling data, from which we may also estimate its quantile and density functions using cumulant-based expansions. In Monte Carlo model calibration, we plan to estimate the error function with the proposed method; minimizing this estimated error function yields the calibrated model parameters. The procedure does not require a closed-form formula and is thus generally applicable. We plan to study it in greater detail and compare it to existing calibration methods to better understand its merits and drawbacks.
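To make the setting concrete, the following is a minimal sketch of assessing metamodel fidelity by comparing metamodel predictions against fresh replications of the simulation at held-out design points. The specific simulation model, the linear metamodel, and the root-mean-squared-error criterion are all illustrative assumptions, not the measure proposed in this project; they merely show the black-box setup in which the simulation's functional form is treated as unknown to the analyst.

```python
import random
import statistics

def run_simulation(x, rng):
    # Hypothetical stochastic simulation model, treated as a black box:
    # the analyst does not know this functional form.
    return x ** 2 + 0.5 * x + rng.gauss(0.0, 0.1)

def metamodel(x):
    # Hypothetical metamodel fitted earlier, e.g. a cheap linear surrogate.
    return 1.5 * x

def rmse_fidelity(test_points, replications=50, seed=0):
    # Estimate fidelity as the root-mean-squared discrepancy between the
    # metamodel and the averaged simulation response at held-out points.
    rng = random.Random(seed)
    sq_errors = []
    for x in test_points:
        sim_mean = statistics.mean(
            run_simulation(x, rng) for _ in range(replications)
        )
        sq_errors.append((metamodel(x) - sim_mean) ** 2)
    return statistics.mean(sq_errors) ** 0.5

error = rmse_fidelity([0.0, 0.5, 1.0, 1.5, 2.0])
```

Note that this plain held-out comparison requires extra simulation runs at the test points; the appeal of a measure such as the one proposed here is to quantify fidelity without exhaustively re-exercising the expensive simulation.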
Effective start/end date: 1/01/19 → 27/06/22