Surrogate Model-assisted Evolutionary Algorithms and Their Applications
Chinese title: 代理模型輔助的進化算法及其應用 (Surrogate Model-assisted Evolutionary Algorithms and Their Applications)
Student thesis: Doctoral Thesis
Detail(s)
Awarding Institution: City University of Hong Kong
Award date: 11 Jun 2021
Link(s)
Permanent Link: https://scholars.cityu.edu.hk/en/theses/theses(e2193ae2-c9c8-473a-b1e7-60d39aa1049e).html
Abstract
Surrogate model-assisted evolutionary algorithms have achieved great success in solving various optimization problems, especially expensive (costly) optimization problems. To further enhance their efficiency on increasingly complicated real-world optimization problems, this thesis proposes and investigates several surrogate model-assisted evolutionary algorithms for different classes of optimization problems. The major contributions include:
1. A Three-level Radial Basis Function-assisted optimization algorithm (called TLRBF) is proposed for expensive single-objective optimization. It performs three search procedures at each iteration: a global exploration search, which finds a solution by optimizing a global RBF approximation function subject to a distance constraint over the whole search space; a subregion search, which generates a solution by minimizing an RBF approximation function within a subregion determined by fuzzy clustering; and a local exploitation search, which generates a solution by optimizing a local RBF approximation model in the neighborhood of the current best solution. Compared with several state-of-the-art algorithms on five commonly used scalable benchmark problems, ten CEC2015 computationally expensive problems, and a real-world airfoil design optimization problem, TLRBF performs better for expensive optimization. (A minimal sketch of such a three-level loop is given after this list.)
2. A Radial Basis Function-assisted optimization algorithm with a Batch infill Sampling criterion (called RBFBS) is presented for solving expensive single-objective optimization problems. In RBFBS, the quality of the RBF model is tuned by choosing a good shape parameter, which is obtained by solving a less expensive hyperparameter optimization subproblem. Moreover, a batch infill sampling criterion, comprising a bi-objective-based sampling approach and a single-objective-based sampling approach, is proposed to obtain a batch of samples for expensive evaluation. Numerical experiments demonstrate that RBFBS performs much better than several other state-of-the-art evolutionary algorithms. (A sketch of the shape-parameter tuning and batch selection ideas follows the list.)
3. An evolutionary algorithm using Multiple Penalties and Multiple Local Surrogates (called MPMLS) is developed for expensive constrained optimization. In each generation, MPMLS defines and optimizes a number of subproblems. Each subproblem penalizes the constraints of the original problem with a different penalty coefficient and has its own search subregion, and a local surrogate model is built for optimizing each subproblem. MPMLS has two major advantages: 1) it maintains good population diversity, so the search can approach the optimal solution of the original problem from different directions, and 2) it only needs to build local surrogates, which reduces the computational overhead of model building. Numerical experiments demonstrate that MPMLS performs much better than several other state-of-the-art evolutionary algorithms for expensive constrained optimization. (A sketch of the multiple-penalty scheme follows the list.)
4. A Multitask Feature Selection model for Objective Reduction (called MTFSOR) is proposed for many-objective optimization. In the proposed method, each objective is formulated as a positive linear combination of a small number of essential objectives, and sparse regularization is employed to identify redundant objectives. Numerical experiments, comparing it with several state-of-the-art objective reduction methods, show the effectiveness and robustness of the proposed method. (A sketch of the objective-reduction idea follows the list.)
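The following is a minimal Python sketch of a three-level RBF-assisted loop in the spirit of item 1. It is illustrative only, not the thesis implementation: it assumes SciPy's RBFInterpolator as the surrogate, a toy sphere function as the expensive objective, plain random-candidate search in place of the actual sub-optimizers, and a randomly chosen archive point as a stand-in for the fuzzy-clustering centre; the radii, distance threshold, and evaluation budget are arbitrary.

```python
# Minimal sketch of a three-level RBF-assisted loop (illustrative only, not the
# thesis code). Assumptions: box-constrained minimisation, SciPy's
# RBFInterpolator as the surrogate, and random-candidate search in place of the
# sub-optimizers used by TLRBF.
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_objective(x):                 # toy stand-in for the costly function
    return float(np.sum(x ** 2))

rng = np.random.default_rng(0)
dim, lb, ub, budget = 5, -5.0, 5.0, 60
X = rng.uniform(lb, ub, (2 * dim, dim))     # initial design
y = np.array([expensive_objective(x) for x in X])

def propose(center, radius, surrogate, evaluated=None, min_dist=0.0, n=500):
    """Minimise the surrogate over random candidates in a box around `center`,
    optionally enforcing a minimum distance to already evaluated points."""
    low = np.maximum(lb, center - radius)
    high = np.minimum(ub, center + radius)
    cand = rng.uniform(low, high, (n, dim))
    if evaluated is not None and min_dist > 0.0:
        d = np.min(np.linalg.norm(cand[:, None, :] - evaluated[None], axis=2), axis=1)
        if np.any(d >= min_dist):
            cand = cand[d >= min_dist]      # distance constraint of the global step
    return cand[np.argmin(surrogate(cand))]

while len(y) < budget:
    surrogate = RBFInterpolator(X, y)       # global RBF model of all evaluated points
    best = X[np.argmin(y)]
    centre = X[rng.integers(len(X))]        # stand-in for a fuzzy-clustering centre
    new_points = [
        # 1) global exploration over the whole box with a distance constraint
        propose(np.full(dim, (lb + ub) / 2.0), (ub - lb) / 2.0, surrogate,
                evaluated=X, min_dist=0.5),
        # 2) subregion search around a cluster centre
        propose(centre, (ub - lb) / 8.0, surrogate),
        # 3) local exploitation around the current best solution
        propose(best, (ub - lb) / 20.0, surrogate),
    ]
    for x_new in new_points:                # expensive evaluations of the three proposals
        X = np.vstack([X, x_new])
        y = np.append(y, expensive_objective(x_new))

print("best value found:", y.min())
```

The point of the structure is that the three proposals pull the search in different directions: the global step keeps exploring far from evaluated points, the subregion step refines a promising region, and the local step polishes the current best solution.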
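The next sketch, under the same caveats, illustrates the two ingredients of item 2: the shape parameter of a Gaussian RBF is chosen by a cheap leave-one-out cross-validation search (standing in for the hyperparameter optimization subproblem), and a batch of infill points is selected by combining a best-predicted-value pick with picks from a simple non-dominated trade-off between predicted value and distance to evaluated samples (standing in for the single-objective- and bi-objective-based sampling approaches). The toy archive, the shape-parameter grid, the candidate pool size, and the batch size are arbitrary.

```python
# Illustrative sketch, not the thesis code: (a) pick a Gaussian-RBF shape
# parameter by leave-one-out cross-validation, (b) select a batch of infill
# points that trades predicted value against distance to evaluated samples.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
dim, lb, ub = 4, -5.0, 5.0
X = rng.uniform(lb, ub, (15, dim))           # toy archive of evaluated points
y = np.sum(X ** 2, axis=1)                   # toy expensive objective values

def loo_error(eps):
    """Leave-one-out squared error of a Gaussian RBF with shape parameter eps."""
    errs = []
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        model = RBFInterpolator(X[mask], y[mask], kernel="gaussian", epsilon=eps)
        errs.append((model(X[i:i + 1])[0] - y[i]) ** 2)
    return float(np.mean(errs))

eps_best = min([0.1, 0.3, 1.0, 3.0], key=loo_error)   # cheap hyperparameter search
surrogate = RBFInterpolator(X, y, kernel="gaussian", epsilon=eps_best)

cand = rng.uniform(lb, ub, (2000, dim))      # candidate pool
f_hat = surrogate(cand)                      # predicted objective values
dist = np.min(np.linalg.norm(cand[:, None, :] - X[None], axis=2), axis=1)

batch = [cand[np.argmin(f_hat)]]             # single-objective pick: best prediction
dist = np.minimum(dist, np.linalg.norm(cand - batch[0], axis=1))
for _ in range(3):                           # bi-objective picks: prediction vs. diversity
    nd = [i for i in range(len(cand))        # simple non-dominated set under
          if not np.any((f_hat < f_hat[i]) & (dist > dist[i]))]   # (min f_hat, max dist)
    pick = max(nd, key=lambda i: dist[i])    # most isolated non-dominated candidate
    batch.append(cand[pick])
    dist = np.minimum(dist, np.linalg.norm(cand - cand[pick], axis=1))

print("proposed a batch of", len(batch), "points for expensive evaluation")
```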
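The sketch below illustrates the multiple-penalty / multiple-local-surrogate idea of item 3 under strong simplifications: a toy objective with a single toy inequality constraint, three fixed penalty coefficients standing in for the subproblems, the nearest archive points to each subproblem's best solution as its local training set, and Gaussian perturbation sampling in place of the actual subregion search.

```python
# Illustrative sketch of the multiple-penalty / multiple-local-surrogate idea
# (not the thesis code): each subproblem uses its own penalty coefficient and a
# local RBF surrogate of the penalised objective, built only from archive points
# near that subproblem's current best solution.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)
dim, lb, ub = 3, -5.0, 5.0

def objective(x):                                  # toy expensive objective
    return float(np.sum(x ** 2))

def violation(x):                                  # toy constraint: sum(x) >= 1
    return float(max(0.0, 1.0 - np.sum(x)))

X = rng.uniform(lb, ub, (20, dim))
F = np.array([objective(x) for x in X])
G = np.array([violation(x) for x in X])

penalties = [1.0, 10.0, 100.0]                     # one subproblem per penalty coefficient
for _ in range(10):                                # a few surrogate-assisted generations
    for rho in penalties:
        P = F + rho * G                            # penalised objective of this subproblem
        best = X[np.argmin(P)]
        near = np.argsort(np.linalg.norm(X - best, axis=1))[:12]
        local = RBFInterpolator(X[near], P[near])  # local surrogate around this best
        cand = np.clip(best + rng.normal(0.0, 0.5, (300, dim)), lb, ub)
        x_new = cand[np.argmin(local(cand))]       # search only in a local subregion
        X = np.vstack([X, x_new])                  # one expensive evaluation per subproblem
        F = np.append(F, objective(x_new))
        G = np.append(G, violation(x_new))

feasible = G <= 1e-9
print("best feasible value:", F[feasible].min() if feasible.any() else "none found")
```

Each penalty coefficient weights feasibility differently, so the subproblems approach the constrained optimum from different directions while every surrogate stays small and local.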
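Finally, a toy sketch of the objective-reduction idea behind item 4. It is a simplified per-objective stand-in, not the multitask MTFSOR model: each objective is regressed on the remaining objectives using scikit-learn's Lasso with positive=True as the sparse, non-negative combination, and an objective that the others reconstruct almost perfectly is flagged as potentially redundant. The toy objectives, the regularization strength, and the 0.95 threshold are assumptions made for illustration.

```python
# Illustrative stand-in for sparse, non-negative objective reduction (not the
# MTFSOR model): regress each objective on the others with a positive Lasso and
# flag objectives that are reconstructed almost perfectly as redundant.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
X = rng.uniform(0.0, 1.0, (200, 4))                # sampled decision vectors

f1 = np.sum(X ** 2, axis=1)
f2 = np.sum((X - 1.0) ** 2, axis=1)
f3 = 0.5 * f1 + 0.5 * f2                           # deliberately redundant objective
f4 = np.sum(np.sin(3.0 * X), axis=1)               # an unrelated (essential) objective
F = np.column_stack([f1, f2, f3, f4])
F = (F - F.mean(axis=0)) / F.std(axis=0)           # normalise each objective

for i in range(F.shape[1]):
    others = np.delete(F, i, axis=1)               # remaining objectives as "features"
    model = Lasso(alpha=0.01, positive=True).fit(others, F[:, i])
    r2 = model.score(others, F[:, i])
    label = "possibly redundant" if r2 > 0.95 else "treated as essential"
    print(f"objective {i + 1}: R^2 = {r2:.3f}, "
          f"non-negative weights = {np.round(model.coef_, 2)} -> {label}")
```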