Projectiles Optimization: A Novel Metaheuristic Algorithm for Global Optimization

Document Type: Original Article


Department of Computer Engineering and Information Technology, Razi University, Kermanshah, Iran


Metaheuristic optimization algorithms are a relatively new class of optimization methods that are widely applied to difficult optimization problems where classical methods cannot be used, and they have become well-established tools for such problems. In this study, a new metaheuristic optimization algorithm is presented whose main idea is inspired by models from kinematics. The proposed algorithm obtains better results than other optimization algorithms in this field and is able to explore new paths in its search for desirable points. After introducing the projectiles optimization (PRO) algorithm, a first experiment evaluates it on the benchmark test functions defined for the IEEE Congress on Evolutionary Computation (CEC) and compares it with well-known, powerful algorithms in the field. In a second experiment, the performance of the PRO algorithm is measured in two practical applications: training multi-layer perceptron (MLP) neural networks and pattern recognition with Gaussian mixture modeling (GMM). The results of these comparisons are presented in various tables and figures. Based on these results, the accuracy and performance of the PRO algorithm are considerably higher than those of existing methods.
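The abstract does not reproduce PRO's actual update equations, but the kinematic model it draws on is standard drag-free projectile motion. The following minimal Python sketch shows only that textbook physics (position over time and horizontal range for a launch speed and angle); it is an illustration of the inspiring model, not an implementation of PRO's search operators:

```python
import math

def projectile_position(v0, theta, t, g=9.81):
    """Ideal drag-free projectile kinematics: (x, y) position at time t
    for launch speed v0 (m/s) and launch angle theta (radians)."""
    x = v0 * math.cos(theta) * t
    y = v0 * math.sin(theta) * t - 0.5 * g * t * t
    return x, y

def projectile_range(v0, theta, g=9.81):
    """Horizontal range on flat ground: R = v0^2 * sin(2*theta) / g,
    maximized at theta = 45 degrees."""
    return v0 ** 2 * math.sin(2 * theta) / g
```

In a projectile-inspired search, such trajectories suggest how a candidate solution could be "launched" from its current position, with launch angle and speed controlling the balance between exploring distant regions and refining nearby ones.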

