TY - JOUR
T1 - Novel Multitask Conditional Neural-Network Surrogate Models for Expensive Optimization
AU - Luo, Jianping
AU - Chen, Liang
AU - Li, Xia
AU - Zhang, Qingfu
PY - 2022/5
Y1 - 2022/5
N2 - Multiple related tasks can be learned simultaneously by sharing information among tasks to avoid tabula rasa learning and to improve performance over the no-transfer case (i.e., when each task learns in isolation). This study investigates multitask learning with conditional neural process (CNP) networks and proposes two multitask learning network models on the basis of CNPs, namely, the one-to-many multitask CNP (OMc-MTCNP) and the many-to-many MTCNP (MMc-MTCNP). Compared with existing multitask models, the proposed models add an extensible correlation learning layer to learn the correlation among tasks. Moreover, the proposed multitask CNP (MTCNP) networks are regarded as surrogate models and applied to a Bayesian optimization framework to replace the Gaussian process (GP) and avoid the complex covariance calculation. The proposed Bayesian optimization framework simultaneously infers multiple tasks by utilizing the possible dependencies among them to share knowledge across tasks. The proposed surrogate models augment the observed dataset with a number of related tasks to estimate model parameters confidently. Experimental studies under several scenarios indicate that the proposed algorithms are competitive in performance with GP-, single-task-, and other multitask-model-based Bayesian optimization methods.
KW - Task analysis
KW - Correlation
KW - Optimization
KW - Neural networks
KW - Bayes methods
KW - Linear programming
KW - Computational modeling
KW - Evolutionary optimization
KW - Gaussian process (GP)
KW - multitask
KW - neural network
KW - surrogate model
KW - ALGORITHM
KW - SEARCH
UR - http://www.scopus.com/inward/record.url?scp=85123427272&partnerID=8YFLogxK
UR - http://gateway.isiknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=LinksAMR&SrcApp=PARTNER_APP&DestLinkType=FullRecord&DestApp=WOS&KeyUT=000798227800122
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85123427272&origin=recordpage
U2 - 10.1109/TCYB.2020.3014126
DO - 10.1109/TCYB.2020.3014126
M3 - RGC 21 - Publication in refereed journal
SN - 2168-2267
VL - 52
SP - 3984
EP - 3997
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
IS - 5
ER -