Projectiles Optimization: A Novel Metaheuristic Algorithm for Global Optimization

Document Type: Original Article

Authors

Department of Computer Engineering and Information Technology, Razi University, Kermanshah, Iran

Abstract

Metaheuristic optimization algorithms are a relatively new class of optimization methods, widely used for difficult optimization problems to which classical methods cannot be applied. In this study, a new metaheuristic optimization algorithm is presented whose main idea is inspired by models in kinematics. The proposed algorithm obtains better results than other optimization algorithms in this field and is able to explore new paths in its search for desirable points. After introducing the projectiles optimization (PRO) algorithm, a first experiment evaluates it on the benchmark test functions defined by the IEEE Congress on Evolutionary Computation (CEC) and compares it with well-known, powerful algorithms in this field. In a second experiment, the performance of the PRO algorithm is measured in two practical applications: training multi-layer perceptron (MLP) neural networks and pattern recognition with Gaussian mixture models (GMM). The results of these comparisons are presented in various tables and figures. Based on the presented results, the accuracy and performance of the PRO algorithm exceed those of the other methods considered.
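The abstract describes the general pattern such algorithms follow: a population of candidate solutions is repeatedly perturbed and evaluated on a benchmark function, keeping improvements. The sketch below is a minimal, generic population-based search of that kind applied to the sphere benchmark; it is an illustration only and does not reproduce PRO's kinematics-based update rules, which are not given here. The function and parameter names are hypothetical.

```python
import random

def sphere(x):
    # Sphere benchmark: f(x) = sum(x_i^2), global minimum 0 at the origin.
    return sum(v * v for v in x)

def population_search(f, dim=5, pop=30, iters=200, lo=-10.0, hi=10.0, seed=1):
    """Generic population-based metaheuristic (illustrative, not PRO):
    perturb each candidate around the best-so-far with a shrinking step,
    keep a candidate only if it improves on the old one."""
    rng = random.Random(seed)
    swarm = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best = min(swarm, key=f)
    for t in range(iters):
        step = 0.1 * (hi - lo) * (1.0 - t / iters)  # exploration -> exploitation
        for i, x in enumerate(swarm):
            cand = [min(hi, max(lo, b + rng.gauss(0.0, step))) for b in best]
            if f(cand) < f(x):
                swarm[i] = cand
        best = min(swarm + [best], key=f)
    return best, f(best)

best, val = population_search(sphere)
print(val)  # a small value near the global minimum 0
```

Benchmark suites such as the CEC test functions mentioned in the abstract replace `sphere` with much harder multimodal functions, which is where the exploration strategy of an algorithm like PRO matters.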

Keywords

