TY - GEN
T1 - A Dynamic Generalized Opposition-Based Learning Fruit Fly Algorithm for Function Optimization
AU - Feng, Xiaoyi
AU - Liu, Ao
AU - Sun, Weiliang
AU - Yue, Xiaofeng
AU - Liu, Bo
PY - 2018/7
Y1 - 2018/7
N2 - As a novel evolutionary algorithm, the fruit fly optimization algorithm (FOA) has received great attention and wide application in recent years. However, the existing literature has demonstrated that the basic FOA often risks becoming prematurely stuck in local optima. In this paper, an improved FOA, named the dynamic generalized opposition-based learning fruit fly optimization algorithm (DGOBL-FOA), is proposed to mitigate this drawback and thereby improve optimization performance. Three carefully designed operators are incorporated into the basic FOA: a cloud-model-based osphresis search is applied to enhance the local refinement search ability in the osphresis phase; a generalized opposition-based learning operation is adopted to strengthen the global coarse search ability; and a dynamic shrinking parameter strategy is designed to adjust the learning intensity and iteratively narrow the search space, which contributes to a good balance between global exploration and local exploitation. To verify the effectiveness of the proposed algorithm, numerical experiments are conducted on 18 well-studied benchmark functions with a dimension of 30. The computational results and statistical analysis indicate that the proposed DGOBL-FOA achieves significantly better performance compared with other FOA variants and state-of-the-art metaheuristics.
KW - cloud model
KW - dynamic shrinking strategy
KW - fruit fly optimization algorithm
KW - generalized opposition-based learning
UR - http://www.scopus.com/inward/record.url?scp=85056287263&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85056287263&origin=recordpage
U2 - 10.1109/CEC.2018.8477794
DO - 10.1109/CEC.2018.8477794
M3 - RGC 32 - Refereed conference paper (with host publication)
SN - 9781509060184
T3 - IEEE Congress on Evolutionary Computation - Proceedings
BT - 2018 IEEE Congress on Evolutionary Computation, CEC 2018 - Proceedings
PB - IEEE
T2 - 2018 IEEE Congress on Evolutionary Computation, CEC 2018
Y2 - 8 July 2018 through 13 July 2018
ER -