TY - JOUR
T1 - Surrogate-assisted neural learning and evolutionary optimization for expensive constrained multi-objective problems
AU - Li, Wenji
AU - Qiu, Yifeng
AU - Wang, Zhaojun
AU - Xu, Biao
AU - Hao, Zhifeng
AU - Zhang, Qingfu
AU - Li, Yun
AU - Fan, Zhun
PY - 2025/8
Y1 - 2025/8
N2 - Expensive constrained multi-objective optimization problems (ECMOPs) present significant challenges due to the high computational cost of evaluating objective and constraint functions, which severely limits the affordable number of function evaluations. To address this issue, we propose an efficient surrogate-assisted constrained multi-objective evolutionary algorithm, named LEMO. LEMO integrates neural learning with a novel constraint screening strategy to dynamically construct surrogate models for the most relevant constraints. During the optimization process, a neural network is designed to learn the mapping between arbitrary weight vectors and their corresponding constrained Pareto optimal solutions. This enables the generation of high-quality solutions while requiring fewer expensive function evaluations. Additionally, a constraint screening mechanism is introduced to dynamically exclude constraints that are irrelevant to the current search phase, thus simplifying the surrogate models and improving the efficiency of the constrained search process. To evaluate the effectiveness of LEMO, we compare its performance against seven state-of-the-art algorithms on three benchmark suites (LIRCMOP, DASCMOP, and MW) as well as a real-world optimization problem. The experimental results demonstrate that LEMO consistently outperforms these algorithms in both computational efficiency and solution quality. © 2025 Published by Elsevier B.V.
AB - Expensive constrained multi-objective optimization problems (ECMOPs) present significant challenges due to the high computational cost of evaluating objective and constraint functions, which severely limits the affordable number of function evaluations. To address this issue, we propose an efficient surrogate-assisted constrained multi-objective evolutionary algorithm, named LEMO. LEMO integrates neural learning with a novel constraint screening strategy to dynamically construct surrogate models for the most relevant constraints. During the optimization process, a neural network is designed to learn the mapping between arbitrary weight vectors and their corresponding constrained Pareto optimal solutions. This enables the generation of high-quality solutions while requiring fewer expensive function evaluations. Additionally, a constraint screening mechanism is introduced to dynamically exclude constraints that are irrelevant to the current search phase, thus simplifying the surrogate models and improving the efficiency of the constrained search process. To evaluate the effectiveness of LEMO, we compare its performance against seven state-of-the-art algorithms on three benchmark suites (LIRCMOP, DASCMOP, and MW) as well as a real-world optimization problem. The experimental results demonstrate that LEMO consistently outperforms these algorithms in both computational efficiency and solution quality. © 2025 Published by Elsevier B.V.
KW - Expensive constrained multi-objective optimization
KW - Neural learning-based solution generation
KW - Surrogate-assisted optimization
UR - https://www.webofscience.com/wos/woscc/full-record/WOS:001520758700001
UR - http://www.scopus.com/inward/record.url?scp=105008656434&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-105008656434&origin=recordpage
U2 - 10.1016/j.swevo.2025.102020
DO - 10.1016/j.swevo.2025.102020
M3 - RGC 21 - Publication in refereed journal
SN - 2210-6502
VL - 97
JO - Swarm and Evolutionary Computation
JF - Swarm and Evolutionary Computation
M1 - 102020
ER -