An Improved Differential Evolution Algorithm for Numerical Optimization Problems

Irfan Farda, Arit Thammano

Abstract


The differential evolution (DE) algorithm has gained popularity for solving complex optimization problems because of its simplicity and efficiency. However, it has several drawbacks, including a slow convergence rate, high sensitivity to the values of its control parameters, and a tendency to become trapped in local optima. To overcome these drawbacks, this paper integrates three novel strategies into the original DE. First, a population improvement strategy based on a multi-level sampling mechanism accelerates convergence and increases population diversity. Second, a new self-adaptive mutation strategy balances the exploration and exploitation abilities of the algorithm by dynamically determining appropriate values of the mutation parameters; this improves the search ability and helps the algorithm escape from local optima when it stagnates. Third, a new selection strategy guides the search away from local optima. Twelve benchmark functions with different characteristics are used to validate the performance of the proposed algorithm. The experimental results show that the proposed algorithm performs significantly better than the original DE in terms of the ability to locate the global optimum, convergence speed, and scalability. In addition, the proposed algorithm finds the global optimal solutions on 8 of the 12 benchmark functions, whereas seven other well-established metaheuristic algorithms, namely NBOLDE, ODE, DE, SaDE, JADE, PSO, and GA, achieve this on only 6, 2, 1, 1, 1, 1, and 1 functions, respectively.
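
Since only the abstract is available here, the following minimal Python sketch illustrates the baseline that the paper builds on: classic DE/rand/1/bin with a simple per-individual self-adaptive mutation factor (in the spirit of jDE). All function names, parameter values, and the adaptation rule are illustrative assumptions; the paper's actual multi-level sampling, self-adaptive mutation, and selection strategies are defined only in the full text and are not reproduced here.

```python
import numpy as np

def sphere(x):
    """Example objective: the Sphere benchmark (global optimum 0 at the origin)."""
    return float(np.sum(x ** 2))

def de_self_adaptive(fobj, bounds, pop_size=50, max_gens=1000, cr=0.9, seed=0):
    """Classic DE/rand/1/bin with a per-individual, self-adapted mutation factor F.

    Illustrative sketch only; not the algorithm proposed in the paper.
    """
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lower = np.array([b[0] for b in bounds], dtype=float)
    upper = np.array([b[1] for b in bounds], dtype=float)

    # Random initial population within the search bounds
    pop = lower + rng.random((pop_size, dim)) * (upper - lower)
    fitness = np.array([fobj(ind) for ind in pop])
    F = rng.uniform(0.4, 0.9, pop_size)          # one mutation factor per individual

    for _ in range(max_gens):
        for i in range(pop_size):
            # Self-adaptation (jDE-style assumption): occasionally resample F_i
            if rng.random() < 0.1:
                F[i] = 0.1 + 0.9 * rng.random()

            # Mutation: DE/rand/1 with three distinct random individuals
            a, b, c = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = np.clip(pop[a] + F[i] * (pop[b] - pop[c]), lower, upper)

            # Binomial crossover
            cross = rng.random(dim) < cr
            cross[rng.integers(dim)] = True       # guarantee at least one mutant gene
            trial = np.where(cross, mutant, pop[i])

            # Greedy selection: keep the trial only if it is no worse
            f_trial = fobj(trial)
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial

    best = int(np.argmin(fitness))
    return pop[best], fitness[best]

if __name__ == "__main__":
    x_best, f_best = de_self_adaptive(sphere, bounds=[(-100, 100)] * 10)
    print(f"best fitness: {f_best:.3e}")
```

On a unimodal function such as Sphere, this baseline drives the best fitness close to zero; the benchmark comparisons reported in the paper evaluate the proposed strategies against this kind of standard DE behaviour on twelve functions of differing characteristics.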

 

DOI: 10.28991/HIJ-2023-04-02-014

Full Text: PDF


Keywords


Optimization; Differential Evolution; Self-adaptive; Metaheuristic.

References


Tansui, D., & Thammano, A. (2020). Hybrid Nature-Inspired Optimization Algorithm: Hydrozoan and Sea Turtle Foraging Algorithms for Solving Continuous Optimization Problems. IEEE Access, 8, 65780–65800. doi:10.1109/ACCESS.2020.2984023.

Ahmad, M. F., Isa, N. A. M., Lim, W. H., & Ang, K. M. (2022). Differential evolution: A recent review based on state-of-the-art works. Alexandria Engineering Journal, 61(5), 3831–3872. doi:10.1016/j.aej.2021.09.013.

Bilal, Pant, M., Zaheer, H., Garcia-Hernandez, L., & Abraham, A. (2020). Differential Evolution: A review of more than two decades of research. Engineering Applications of Artificial Intelligence, 90, 103479. doi:10.1016/j.engappai.2020.103479.

de Werra, D., & Hertz, A. (1989). Tabu search techniques - A tutorial and an application to neural networks. OR Spektrum, 11(3), 131–141. doi:10.1007/BF01720782.

Holland, J. H. (1992). Adaptation in Natural and Artificial Systems. MIT Press, Cambridge, United States. doi:10.7551/mitpress/1090.001.0001.

Storn, R., & Price, K. (1997). Differential Evolution - A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. Journal of Global Optimization, 11(4), 341–359. doi:10.1023/A:1008202821328.

Kennedy, J., & Eberhart, R. (1995). Particle swarm optimization. Proceedings of ICNN’95 - International Conference on Neural Networks. doi:10.1109/icnn.1995.488968.

Dorigo, M., Birattari, M., & Stützle, T. (2006). Ant colony optimization. IEEE Computational Intelligence Magazine, 1(4), 28–39. doi:10.1109/mci.2006.329691.

Yang, X.-S. (2021). Firefly Algorithms. Nature-Inspired Optimization Algorithms, 123–139. doi:10.1016/b978-0-12-821986-7.00016-0.

Karaboga, D., & Basturk, B. (2007). A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm. Journal of Global Optimization, 39(3), 459–471. doi:10.1007/s10898-007-9149-x.

Sheng, M., Wang, Z., Liu, W., Wang, X., Chen, S., & Liu, X. (2022). A particle swarm optimizer with multi-level population sampling and dynamic p-learning mechanisms for large-scale optimization. Knowledge-Based Systems, 242, 108382. doi:10.1016/j.knosys.2022.108382.

Ali, M. Z., Awad, N. H., Suganthan, P. N., Shatnawi, A. M., & Reynolds, R. G. (2018). An improved class of real-coded Genetic Algorithms for numerical optimization. Neurocomputing, 275, 155–166. doi:10.1016/j.neucom.2017.05.054.

Chu, X., Cai, F., Gao, D., Li, L., Cui, J., Xu, S. X., & Qin, Q. (2020). An artificial bee colony algorithm with adaptive heterogeneous competition for global optimization problems. Applied Soft Computing Journal, 93, 106391. doi:10.1016/j.asoc.2020.106391.

She, B., Fournier, A., Yao, M., Wang, Y., & Hu, G. (2022). A self-adaptive and gradient-based cuckoo search algorithm for global optimization. Applied Soft Computing, 122, 108774. doi:10.1016/j.asoc.2022.108774.

Wu, J., Wang, Y. G., Burrage, K., Tian, Y. C., Lawson, B., & Ding, Z. (2020). An improved firefly algorithm for global continuous optimization problems. Expert Systems with Applications, 149, 113340. doi:10.1016/j.eswa.2020.113340.

Chen, Y., & Pi, D. (2020). An innovative flower pollination algorithm for continuous optimization problem. Applied Mathematical Modelling, 83, 237–265. doi:10.1016/j.apm.2020.02.023.

Sun, G., Yang, B., Yang, Z., & Xu, G. (2020). An adaptive differential evolution with combined strategy for global numerical optimization. Soft Computing, 24(9), 6277–6296. doi:10.1007/s00500-019-03934-3.

Deng, W., Shang, S., Cai, X., Zhao, H., Song, Y., & Xu, J. (2021). An improved differential evolution algorithm and its application in optimization problem. Soft Computing, 25(7), 5277–5298. doi:10.1007/s00500-020-05527-x.

Zeng, Z., Zhang, M., Chen, T., & Hong, Z. (2021). A new selection operator for differential evolution algorithm. Knowledge-Based Systems, 226, 107150. doi:10.1016/j.knosys.2021.107150.

Meng, Z., & Yang, C. (2022). Two-stage differential evolution with novel parameter control. Information Sciences, 596, 321–342. doi:10.1016/j.ins.2022.03.043.

Kumar, A., Biswas, P. P., & Suganthan, P. N. (2022). Differential evolution with orthogonal array‐based initialization and a novel selection strategy. Swarm and Evolutionary Computation, 68, 101010. doi:10.1016/j.swevo.2021.101010.

Houssein, E. H., Rezk, H., Fathy, A., Mahdy, M. A., & Nassef, A. M. (2022). A modified adaptive guided differential evolution algorithm applied to engineering applications. Engineering Applications of Artificial Intelligence, 113, 104920. doi:10.1016/j.engappai.2022.104920.

Deng, W., Ni, H., Liu, Y., Chen, H., & Zhao, H. (2022). An adaptive differential evolution algorithm based on belief space and generalized opposition-based learning for resource allocation. Applied Soft Computing, 127, 109419. doi:10.1016/j.asoc.2022.109419.

Yi, W., Chen, Y., Pei, Z., & Lu, J. (2022). Adaptive differential evolution with ensembling operators for continuous optimization problems. Swarm and Evolutionary Computation, 69, 100994. doi:10.1016/j.swevo.2021.100994.

Thanathamathee, P., & Sawangarreerak, S. (2022). Discovering Future Earnings Patterns through FP-Growth and ECLAT Algorithms with Optimized Discretization. Emerging Science Journal, 6(6), 1328-1345. doi:10.28991/ESJ-2022-06-06-07.

Farda, I., & Thammano, A. (2022). A Self-adaptive Differential Evolution Algorithm for Solving Optimization Problems. Lecture Notes in Networks and Systems, 453, 68–76. doi:10.1007/978-3-030-99948-3_7.

Rahnamayan, S., Tizhoosh, H. R., & Salama, M. M. A. (2008). Opposition-based differential evolution. IEEE Transactions on Evolutionary Computation, 12(1), 64–79. doi:10.1109/TEVC.2007.894200.

Qin, A. K., Huang, V. L., & Suganthan, P. N. (2009). Differential evolution algorithm with strategy adaptation for global numerical optimization. IEEE Transactions on Evolutionary Computation, 13(2), 398–417. doi:10.1109/TEVC.2008.927706.

Zhang, J., & Sanderson, A. C. (2009). JADE: Adaptive differential evolution with optional external archive. IEEE Transactions on Evolutionary Computation, 13(5), 945–958. doi:10.1109/TEVC.2009.2014613.




Copyright (c) 2023 Irfan Farda, Arit Thammano