Designing Multi-Form Evolutionary Algorithms for Multi-Objective Optimization

Student thesis: Doctoral Thesis

Supervisors/Advisors
  • Shiqi WANG (Supervisor)
  • Kay Chen Tan (External person) (External Co-Supervisor)
  • Tak Wu Sam KWONG (External person) (External Co-Supervisor)
Award date: 20 Dec 2023

Abstract

Multiobjective optimization problems (MOPs) form a widespread class of challenging optimization problems. Beyond involving multiple conflicting objectives, MOPs in many practical applications carry additional complexities arising from factors such as the scale of the decision variables, the nature of the objectives and constraints, and the presence of combinatorial and discrete properties. For such complex MOPs, solving the original task directly can be difficult or demand significant computing resources. The emerging multiform optimization (MFO) framework, which exploits alternate formulations of a single target task to capture a variety of search landscapes and thereby provide search experience for the original problem, has shown great potential for solving these complex MOPs efficiently. Nevertheless, the MFO search paradigm must be customized in different scenarios to cope with the new challenges posed by complex MOPs, including large-scale decision variables, multi-level decision makers, and combinatorial and discrete search spaces. This thesis therefore focuses on studying and designing multiform evolutionary algorithms to address such challenging complex MOPs. The main contributions are summarized as follows:

Firstly, the MFO search paradigm is customized to solve large-scale multiobjective optimization problems (LSMOPs) efficiently by constructing low-dimensional simple problems in a multi-variation manner and performing evolutionary searches concurrently in the original space and multiple simplified spaces. Existing transformation-based methods have shown promising search efficiency on LSMOPs: they transform the original problem into a new simplified problem and perform the optimization in the simplified space to improve performance. However, the problem changes under such a transformation, so there is no guarantee that the original global or near-global optimum is preserved in the newly generated space. In this part, we solve LSMOPs via a multi-variation multifactorial evolutionary algorithm that conducts an evolutionary search in both the original space and multiple simplified spaces concurrently. In this way, useful traits found during the search can be seamlessly transferred from the simplified problem spaces to the original problem space toward efficient problem-solving. Moreover, since the evolutionary search is also performed in the original problem space, preservation of the original global optimum is guaranteed. The experimental results highlight the efficiency and effectiveness of the proposed method compared with state-of-the-art methods for solving LSMOPs.
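The idea of searching the original and simplified spaces concurrently can be sketched in a few lines. The sketch below is a hypothetical, heavily simplified illustration rather than the thesis's algorithm: it substitutes a single-objective sphere function for a real LSMOP, uses one fixed random linear embedding as the "variation" that builds a simplified space, and models knowledge transfer as injecting the best simplified solution back into the original-space population.

```python
import random

# Illustrative sketch (not the thesis's multi-variation multifactorial EA):
# minimize a toy D-dimensional objective in the original space while also
# searching a d-dimensional simplified space (d << D) produced by a fixed
# random linear embedding; promising simplified solutions are mapped back
# and injected into the original-space population.

random.seed(0)
D, d, POP, GENS = 100, 5, 20, 50

def f(x):                      # toy single objective: sphere function
    return sum(v * v for v in x)

# random embedding matrix: maps a d-dim point y to a D-dim point x = A y
A = [[random.gauss(0, 1) for _ in range(d)] for _ in range(D)]
def expand(y):
    return [sum(A[i][j] * y[j] for j in range(d)) for i in range(D)]

def mutate(x, step=0.1):
    return [v + random.gauss(0, step) for v in x]

orig = [[random.uniform(-1, 1) for _ in range(D)] for _ in range(POP)]
simp = [[random.uniform(-1, 1) for _ in range(d)] for _ in range(POP)]

for _ in range(GENS):
    # evolve both populations independently (simple elitist selection)
    orig = sorted(orig + [mutate(x) for x in orig], key=f)[:POP]
    simp = sorted(simp + [mutate(y) for y in simp],
                  key=lambda y: f(expand(y)))[:POP]
    # knowledge transfer: best simplified solution joins the original pool
    orig = sorted(orig + [expand(simp[0])], key=f)[:POP]

best = f(orig[0])
```

Because the original population is retained and evolved alongside the simplified one, the search can never lose the original optimum to an unlucky embedding, which mirrors the preservation argument above.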

Secondly, the applicability of the MFO framework is extended to bilevel multiobjective optimization problems (BLMOPs) by constructing alternative formulations of the original objective function through relaxing the lower-level constraints and simplifying the decision-making process. Many practical MOPs involve more than one decision-making level and can be modeled as BLMOPs with a nested structure of decision variables. In this part, we implement an evolutionary multiform optimization paradigm, namely BLMFO, for bilevel multiobjective optimization. In the proposed framework, alternate formulations of the original problem are derived to facilitate problem-solving and reduce computational overhead. BLMFO then performs the evolutionary search in the original problem space and the auxiliary task space simultaneously, combining the search for feasible solutions with the exploration of promising regions, thus ensuring the effectiveness of the framework. Further, useful information is transferred across the original and auxiliary tasks via explicit knowledge transfer to enable complementary exploration and better optimization performance. To the best of our knowledge, this work is the first attempt in the literature to solve BLMOPs via multiform evolutionary optimization. The experimental results show the effectiveness and superiority of the proposed framework in terms of performance indicators and the quality of the final optimized solutions.
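The role of a relaxed auxiliary formulation in bilevel search can be illustrated with a toy problem. The sketch below is hypothetical and not BLMFO itself: a scalar bilevel task whose lower level has a closed-form optimum, an auxiliary task that drops lower-level optimality and searches both variables jointly (much cheaper per evaluation), and a transfer step that seeds the exact nested search with the auxiliary result.

```python
import random

# Illustrative sketch (hypothetical toy problem, not BLMFO): the original
# formulation evaluates the upper-level objective only at the exact
# lower-level response; the auxiliary formulation relaxes lower-level
# optimality and evolves (xu, xl) jointly, and its best xu is transferred
# to warm-start the nested search.

random.seed(1)

def F(xu, xl):                 # upper-level objective
    return (xu - 1.0) ** 2 + xl ** 2

def lower_optimum(xu):         # exact response of lower level g = (xl - xu)^2
    return xu

def nested_eval(xu):           # "expensive" original (nested) formulation
    return F(xu, lower_optimum(xu))

# --- auxiliary (relaxed) task: evolve (xu, xl) pairs jointly ---
pop = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(20)]
for _ in range(40):
    kids = [(xu + random.gauss(0, 0.1), xl + random.gauss(0, 0.1))
            for xu, xl in pop]
    pop = sorted(pop + kids, key=lambda p: F(*p))[:20]
seed_xu = pop[0][0]            # knowledge transferred to the original task

# --- original task: search over xu, warm-started by the transfer ---
xu_pop = [seed_xu] + [random.uniform(-2, 2) for _ in range(9)]
for _ in range(40):
    kids = [xu + random.gauss(0, 0.05) for xu in xu_pop]
    xu_pop = sorted(xu_pop + kids, key=nested_eval)[:10]

best = nested_eval(xu_pop[0])  # true bilevel optimum: xu = 0.5, F = 0.5
```

Note that the relaxed task's own optimum (xu = 1, xl = 0) is infeasible for the bilevel problem, which is exactly why the original formulation must keep searching alongside the auxiliary one: the auxiliary task supplies promising regions, while the original task enforces lower-level feasibility.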

Lastly, a novel MFO framework is designed to address combinatorial multiobjective optimization problems, taking feature selection (FS) in high-dimensional classification as a case study. The proposed algorithm focuses on effectively acquiring and transferring the task-specific knowledge embedded in different problem formulations. By introducing diverse problem formulations to generate auxiliary tasks, the complementary advantages of each task are leveraged to promote the discovery of high-quality solutions and latent information and to improve the exploration of the fitness landscape. The proposed multi-solver-based multitask optimization scheme employs independent evolutionary solvers with different biases and search preferences for each task, improving optimization efficiency and search performance. Additionally, an explicit task-specific knowledge transfer mechanism is integrated into the scheme to enhance the discovery and transfer of high-quality solutions and task-specific advantageous information during the search, improving the quality and diversity of the transferred knowledge. Extensive empirical results demonstrate that, compared with several state-of-the-art FS methods on high-dimensional datasets, the proposed method obtains feature subsets with better classification performance and smaller size.
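A minimal multiform FS setup can make the "different formulations of one task" idea concrete. The sketch below is a hypothetical illustration, not the thesis's algorithm: task A evolves full-length binary masks over all features, task B works on a reduced pool of features pre-ranked by a crude filter score (a cheaper formulation of the same FS task), and task B's best subset is periodically transferred into task A's population. The nearest-centroid wrapper fitness and the size penalty weight are assumptions for the example.

```python
import random

# Illustrative sketch (hypothetical setup): multiform feature selection
# with two formulations of one FS task and a simple transfer step.

random.seed(2)
N_FEAT, N_SAMP = 30, 60

# synthetic binary-class data: only features 0-2 are informative
X, y = [], []
for i in range(N_SAMP):
    label = i % 2
    row = [random.gauss(2.0 * label, 1.0) if j < 3 else random.gauss(0, 1.0)
           for j in range(N_FEAT)]
    X.append(row); y.append(label)

def accuracy(mask):            # nearest-centroid wrapper fitness
    feats = [j for j in range(N_FEAT) if mask[j]]
    if not feats:
        return 0.0
    cent = {c: [sum(X[i][j] for i in range(N_SAMP) if y[i] == c) /
                sum(1 for i in range(N_SAMP) if y[i] == c) for j in feats]
            for c in (0, 1)}
    hits = 0
    for i in range(N_SAMP):
        d = {c: sum((X[i][j] - cent[c][k]) ** 2
                    for k, j in enumerate(feats)) for c in (0, 1)}
        hits += (min(d, key=d.get) == y[i])
    return hits / N_SAMP

def fitness(mask):             # maximize accuracy, penalize subset size
    return accuracy(mask) - 0.01 * sum(mask)

def flip(mask):                # bit-flip mutation
    m = mask[:]
    j = random.randrange(len(m)); m[j] = 1 - m[j]
    return m

# crude filter score: per-feature class-sum difference (classes balanced)
score = [abs(sum(X[i][j] for i in range(N_SAMP) if y[i] == 1) -
             sum(X[i][j] for i in range(N_SAMP) if y[i] == 0))
         for j in range(N_FEAT)]
pool = sorted(range(N_FEAT), key=lambda j: -score[j])[:8]  # task B's pool

task_a = [[random.randint(0, 1) for _ in range(N_FEAT)] for _ in range(10)]
task_b = [[random.randint(0, 1) for _ in range(8)] for _ in range(10)]

def b_to_full(short):          # map task B's short mask back to full length
    full = [0] * N_FEAT
    for k, j in enumerate(pool):
        full[j] = short[k]
    return full

for _ in range(30):
    task_a = sorted(task_a + [flip(m) for m in task_a],
                    key=fitness, reverse=True)[:10]
    task_b = sorted(task_b + [flip(m) for m in task_b],
                    key=lambda s: fitness(b_to_full(s)), reverse=True)[:10]
    # knowledge transfer: task B's best subset joins task A's population
    task_a = sorted(task_a + [b_to_full(task_b[0])],
                    key=fitness, reverse=True)[:10]

best_mask = task_a[0]
```

The two formulations have complementary biases: the reduced-pool task converges quickly to small, relevant subsets, while the full-mask task can still recover features the filter score mis-ranked; transfer lets each benefit from the other.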

Research areas

  • Multiform Optimization, Multiobjective Optimization, Evolutionary Transfer Optimization, Large-Scale Optimization, Bilevel Optimization, Feature Selection