## Yongli Liu* and Renjie Li*

Function | Dim | Range | [TeX:] $$f_{\text {min }}$$
---|---|---|---
[TeX:] $$F_{1}(x)=\sum_{i=1}^{n} x_{i}^{2}$$ | 30 | [-100,100] | 0
[TeX:] $$F_{2}(x)=\sum_{i=1}^{n}\left|x_{i}\right|+\prod_{i=1}^{n}\left|x_{i}\right|$$ | 30 | [-10,10] | 0
[TeX:] $$F_{3}(x)=\sum_{i=1}^{n}\left(\sum_{j=1}^{i} x_{j}\right)^{2}$$ | 30 | [-100,100] | 0
[TeX:] $$F_{4}(x)=\max _{i}\left\{\left|x_{i}\right|, 1 \leq i \leq n\right\}$$ | 30 | [-100,100] | 0
[TeX:] $$F_{5}(x)=\sum_{i=1}^{n-1}\left[100\left(x_{i+1}-x_{i}^{2}\right)^{2}+\left(x_{i}-1\right)^{2}\right]$$ | 30 | [-30,30] | 0
[TeX:] $$F_{6}(x)=\sum_{i=1}^{n}\left(\left[x_{i}+0.5\right]\right)^{2}$$ | 30 | [-100,100] | 0
[TeX:] $$F_{7}(x)=\sum_{i=1}^{n} i x_{i}^{4}+\operatorname{rand}[0,1)$$ | 30 | [-1.28,1.28] | 0
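The unimodal functions above are straightforward to transcribe into code; as an illustration (not part of the paper's implementation), here is a minimal Python version of F1 (the sphere function) and F2, with solutions represented as lists of floats:

```python
import math

def f1(x):
    # F1: sphere function; global minimum 0 at the origin
    return sum(xi ** 2 for xi in x)

def f2(x):
    # F2: sum of absolute values plus their product; minimum 0 at the origin
    s = sum(abs(xi) for xi in x)
    p = math.prod(abs(xi) for xi in x)
    return s + p

print(f1([0.0] * 30))   # 0.0 at the global optimum
print(f2([1.0, -2.0]))  # |1| + |-2| + |1|*|-2| = 5.0
```

The remaining unimodal functions in the table follow the same pattern of a single pass over the decision vector.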

Table 2.

Function | Dim | Range | [TeX:] $$f_{\text {min }}$$
---|---|---|---
[TeX:] $$F_{8}(x)=\sum_{i=1}^{n}-x_{i} \sin (\sqrt{\left|x_{i}\right|})$$ | 30 | [-500,500] | -12569.5
[TeX:] $$F_{9}(x)=\sum_{i=1}^{n}\left[x_{i}^{2}-10 \cos \left(2 \pi x_{i}\right)+10\right]$$ | 30 | [-5.12,5.12] | 0
[TeX:] $$\begin{aligned} F_{10}(x) &=-20 \exp (-0.2 \sqrt{\left(\sum_{i=1}^{n} x_{i}^{2}\right) / n}) \\ &-\exp \left(\left(\sum_{i=1}^{n} \cos \left(2 \pi x_{i}\right)\right) / n\right)+20+e \end{aligned}$$ | 30 | [-32,32] | 0
[TeX:] $$F_{11}(x)=\frac{1}{4000} \sum_{i=1}^{n} x_{i}^{2}-\prod_{i=1}^{n} \cos \left(x_{i} / \sqrt{i}\right)+1$$ | 30 | [-600,600] | 0
[TeX:] $$\begin{aligned} F_{12}(x) &=(\pi / n)\left\{10 \sin \left(\pi y_{1}\right)\right.\\ &+\sum_{i=1}^{n-1}\left(y_{i}-1\right)^{2}\left[1+10 \sin ^{2}\left(\pi y_{i+1}\right)\right] \\ &\left.+\left(y_{n}-1\right)^{2}\right\}+\sum_{i=1}^{n} u\left(x_{i}, 10,100,4\right) \end{aligned} \\ y_{i}=1+0.25\left(x_{i}+1\right) \\ u\left(x_{i}, a, k, m\right)=\left\{\begin{array}{cc} k\left(x_{i}-a\right)^{m} & x_{i}>a \\ 0 & -a<x_{i}<a \\ k\left(-x_{i}-a\right)^{m} & x_{i}<-a \end{array}\right.$$ | 30 | [-50,50] | 0
[TeX:] $$\begin{aligned} F_{13}(x)=& 0.1\left\{\sin ^{2}\left(3 \pi x_{1}\right)\right.\\ &+\sum_{i=1}^{n}\left(x_{i}-1\right)^{2}\left[1+\sin ^{2}\left(3 \pi x_{i}+1\right)\right] \end{aligned} \\ \begin{array}{l} \left.+\left(x_{n}-1\right)^{2}\left[1+\sin ^{2}\left(2 \pi x_{n}\right)\right]\right\} \\ +\sum_{i=1}^{n} u\left(x_{i}, 10,100,4\right) \end{array}$$ | 30 | [-50,50] | 0
[TeX:] $$F_{14}(x)=\left(\frac{1}{500}+\sum_{j=1}^{25} \frac{1}{j+\sum_{i=1}^{2}\left(x_{i}-a_{i j}\right)^{6}}\right)^{-1}$$ | 2 | [-65,65] | 1
[TeX:] $$F_{15}(x)=\sum_{i=1}^{11}\left[a_{i}-\frac{x_{1}\left(b_{i}^{2}+b_{i} x_{2}\right)}{b_{i}^{2}+b_{i} x_{3}+x_{4}}\right]^{2}$$ | 4 | [-5,5] | 0.00030
[TeX:] $$F_{16}(x)=4 x_{1}^{2}-2.1 x_{1}^{4}+\frac{1}{3} x_{1}^{6}+x_{1} x_{2}-4 x_{2}^{2}+4 x_{2}^{4}$$ | 2 | [-5,5] | -1.0316
[TeX:] $$\begin{aligned} F_{17}(x)=&\left(x_{2}-\frac{5.1}{4 \pi^{2}} x_{1}^{2}+\frac{5}{\pi} x_{1}-6\right)^{2} \\ &+10\left(1-\frac{1}{8 \pi}\right) \cos \left(x_{1}\right)+10 \end{aligned}$$ | 2 | [-5,5] | 0.398
[TeX:] $$\begin{aligned} F_{18}(x)=&\left[1+\left(x_{1}+x_{2}+1\right)^{2}\left(19-14 x_{1}+3 x_{1}^{2}\right.\right.\\ &\left.\left.-14 x_{2}+6 x_{1} x_{2}+3 x_{2}^{2}\right)\right] \times\left[30+\left(2 x_{1}-3 x_{2}\right)^{2}\right. \end{aligned} \\ \left.\left(18-32 x_{1}+12 x_{1}^{2}+48 x_{2}-36 x_{1} x_{2}+27 x_{2}^{2}\right)\right]$$ | 2 | [-2,2] | 3
[TeX:] $$F_{19}(x)=-\sum_{i=1}^{4} c_{i} \exp \left(-\sum_{j=1}^{3} a_{i j}\left(x_{j}-p_{i j}\right)^{2}\right)$$ | 3 | [1,3] | -3.86
[TeX:] $$F_{20}(x)=-\sum_{i=1}^{4} c_{i} \exp \left(-\sum_{j=1}^{6} a_{i j}\left(x_{j}-p_{i j}\right)^{2}\right)$$ | 6 | [0,1] | -3.32
[TeX:] $$F_{21}(x)=-\sum_{i=1}^{5}\left[\left(X-a_{i}\right)\left(X-a_{i}\right)^{T}+c_{i}\right]^{-1}$$ | 4 | [0,10] | -10.1532
[TeX:] $$F_{22}(x)=-\sum_{i=1}^{7}\left[\left(X-a_{i}\right)\left(X-a_{i}\right)^{T}+c_{i}\right]^{-1}$$ | 4 | [0,10] | -10.4028
[TeX:] $$F_{23}(x)=-\sum_{i=1}^{10}\left[\left(X-a_{i}\right)\left(X-a_{i}\right)^{T}+c_{i}\right]^{-1}$$ | 4 | [0,10] | -10.5363
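The penalized functions F12 and F13 share the piecewise penalty term u(x_i, a, k, m) defined in the table above; a direct Python transcription of that penalty and of the y_i variable change used by F12 (for illustration only):

```python
def u(x, a, k, m):
    # Piecewise penalty from F12/F13: zero inside [-a, a],
    # polynomial growth of order m outside that interval
    if x > a:
        return k * (x - a) ** m
    if x < -a:
        return k * (-x - a) ** m
    return 0.0

def y(x):
    # Variable change used by F12: y_i = 1 + 0.25 * (x_i + 1)
    return 1.0 + 0.25 * (x + 1.0)

print(u(0.0, 10, 100, 4))   # inside the flat region: 0.0
print(u(12.0, 10, 100, 4))  # 100 * (12 - 10)^4 = 1600.0
print(y(-1.0))              # 1.0
```

The penalty is what confines the search to [-a, a] while keeping the objective defined everywhere in the range.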

**Gravitational Search Algorithm**

GSA is inspired by the law of gravity and the interaction between masses. During the execution of GSA, let [TeX:] $$x_{i}^{d}$$ be the position of the i-th search-agent in the d-th dimension and [TeX:] $$X_{i}=\left(x_{i}^{1}, \ldots, x_{i}^{d}, \ldots, x_{i}^{n}\right)$$ be the position of the i-th search-agent, for i = 1, 2, …, N. The interaction force and the position and velocity of the search-agents are calculated as follows.

The force [TeX:] $$F_{i j}^{d}(t)$$ exerted on mass i by mass j and the total force on search-agent i in the d-th dimension at time t are defined as follows:

[TeX:] $$F_{i j}^{d}(t)=G(t) \frac{M_{p i}(t) \times M_{a j}(t)}{R_{i j}(t)+\varepsilon}\left(x_{j}^{d}(t)-x_{i}^{d}(t)\right), \quad F_{i}^{d}(t)=\sum_{j=1, j \neq i}^{N} \operatorname{rand}_{j} F_{i j}^{d}(t)$$

where [TeX:] $$M_{a j}$$ denotes the active gravitational mass associated with search-agent j, [TeX:] $$M_{p i}$$ is the passive gravitational mass associated with search-agent i, [TeX:] $$G(t)=G\left(G_{0}, t\right)$$ is the gravitational constant with initial value [TeX:] $$G_{0}$$, [TeX:] $$\varepsilon$$ is a sufficiently small positive constant, [TeX:] $$R_{i j}(t)=\left\|X_{i}(t), X_{j}(t)\right\|_{2}$$ is the Euclidean distance between search-agents i and j, and [TeX:] $$\text {rand}_{j}$$ is a random number in [0, 1].

The velocity and position of the search-agent at time t are updated as follows:

[TeX:] $$v_{i}^{d}(t+1)=\operatorname{rand} \times v_{i}^{d}(t)+a_{i}^{d}(t), \quad x_{i}^{d}(t+1)=x_{i}^{d}(t)+v_{i}^{d}(t+1)$$

where [TeX:] $$a_{i}^{d}(t)=F_{i}^{d}(t) / \mathrm{M}_{i i}(t)$$, [TeX:] $$M_{i i}(t)$$ is the inertial mass of the i-th search-agent, and rand denotes a uniform random variable in [0, 1].
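One iteration of these update rules can be sketched as follows. This is a simplified, one-dimensional illustration of the force, acceleration, velocity, and position formulas with G(t) held constant and masses supplied directly; it is not the full GSA implementation from [10]:

```python
import random

def gsa_step(pos, vel, mass, G=1.0, eps=1e-9):
    # One GSA update over scalar (1-D) positions:
    #   F_i = sum_j rand_j * G * M_i * M_j / (R_ij + eps) * (x_j - x_i)
    #   a_i = F_i / M_ii ;  v_i = rand * v_i + a_i ;  x_i = x_i + v_i
    n = len(pos)
    new_pos, new_vel = [], []
    for i in range(n):
        force = 0.0
        for j in range(n):
            if j == i:
                continue
            r = abs(pos[i] - pos[j])           # Euclidean distance in 1-D
            f_ij = G * mass[i] * mass[j] / (r + eps) * (pos[j] - pos[i])
            force += random.random() * f_ij    # rand_j weights each pairwise force
        acc = force / mass[i]                  # a_i = F_i / M_ii
        v = random.random() * vel[i] + acc
        new_vel.append(v)
        new_pos.append(pos[i] + v)
    return new_pos, new_vel

random.seed(0)
p, v = gsa_step([0.0, 10.0], [0.0, 0.0], [1.0, 1.0])
# the two agents are pulled toward each other by the simulated gravity
```

With equal masses, the force on each agent points toward the other, so one step moves both agents closer together.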

[TeX:] $$\mathrm{F}_{1}-\mathrm{F}_{7}$$ are unimodal benchmark functions. Since each has a single global optimum, they can be used to evaluate an algorithm's local search capability. As shown in Fig. 4, which plots the average best-so-far over 30 runs (the average of the best solution found up to each iteration), PSA not only converges well on unimodal functions but also converges quickly. Compared with the other algorithms, it approaches the global optimum in fewer iterations, which means PSA performs better on problems that demand short computing times.
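The "average best-so-far" curves of Fig. 4 are simply the running minimum of fitness per iteration, averaged element-wise across runs. A minimal sketch of how such a curve is computed (the optimizer itself is stubbed out here with random sampling, which is an assumption for illustration):

```python
import random

def best_so_far(fitness_history):
    # Running minimum: best fitness found up to each iteration
    curve, best = [], float("inf")
    for f in fitness_history:
        best = min(best, f)
        curve.append(best)
    return curve

def average_curves(curves):
    # Element-wise mean over runs -> the plotted "average best-so-far"
    n_runs = len(curves)
    return [sum(run[t] for run in curves) / n_runs for t in range(len(curves[0]))]

random.seed(1)
runs = [best_so_far([random.uniform(0, 100) for _ in range(50)]) for _ in range(30)]
avg = average_curves(runs)
# the averaged curve is non-increasing by construction
assert all(a >= b for a, b in zip(avg, avg[1:]))
```

Because each per-run curve is non-increasing, the averaged curve is too, which is why the plots in Fig. 4 descend monotonically.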

Statistical results for 30 runs of F1–F7 are listed in Table 3. Best, Avg, and Std denote the best fitness found, the average fitness, and the corresponding standard deviation, respectively. Within each comparison, bold font marks the best result; where no entry is bold, all algorithms perform comparably well.
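In code, the Best/Avg/Std columns of Table 3 reduce to simple statistics over the final fitness of each run. A sketch assuming one final fitness value per run (the sample standard deviation is used here; the paper does not state which convention it follows):

```python
import statistics

def summarize(final_fitnesses):
    # Best = minimum final fitness over all runs,
    # Avg = mean, Std = sample standard deviation
    return {
        "Best": min(final_fitnesses),
        "Avg": statistics.mean(final_fitnesses),
        "Std": statistics.stdev(final_fitnesses),
    }

stats = summarize([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
print(stats)  # Best 2.0, Avg 5.0, Std about 2.14
```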

Table 3.

Function | | PSA | WOA | GSA | PSO
---|---|---|---|---|---
[TeX:] $$F_{1}$$ | Best | 0.0934 | 5.85E-52 | 16.4172 | 0.0413
 | Avg | 15.3222 | 2.33E-35 | 736.3204 | 1667.6904
 | Std | 27.3389 | 9.02E-35 | 428.3625 | 4610.9635
[TeX:] $$F_{2}$$ | Best | 0.6680 | 2.62E-32 | 0.0203 | 10.0903
 | Avg | 2.2314 | 2.84E-27 | 4.0400 | 27.4738
 | Std | 1.5088 | 1.05E-26 | 3.4989 | 15.4874
[TeX:] $$F_{3}$$ | Best | 3.9671 | 47913.1934 | 526.6285 | 4014.6458
 | Avg | 3978.0837 | 74924.1412 | 2532.8313 | 21258.0532
 | Std | 3718.9156 | 18028.0192 | 1080.6073 | 8626.2509
[TeX:] $$F_{4}$$ | Best | 0.0746 | 5.4675 | 9.9723 | 23.4041
 | Avg | 1.1947 | 63.3774 | 15.5021 | 34.9788
 | Std | 1.0316 | 22.6286 | 2.9599 | 7.9321
[TeX:] $$F_{5}$$ | Best | 8.9635 | 27.7872 | 45.2700 | 32.7953
 | Avg | 332.6410 | 28.5011 | 1515.3686 | 6841.2273
 | Std | 705.1589 | 0.2887 | 2128.6598 | 22676.7468
[TeX:] $$F_{6}$$ | Best | 0 | 0 | 297 | 0
 | Avg | 19.8667 | 0 | 1276.2 | 1012
 | Std | 33.4589 | 0 | 562.2334 | 3048.7384
[TeX:] $$F_{7}$$ | Best | 0.0017 | 0.0003 | 0.1072 | 0.0730
 | Avg | 0.0237 | 0.0109 | 0.2579 | 0.9833
 | Std | 0.0170 | 0.0184 | 0.1127 | 2.2264

Clearly, PSA is very competitive compared with the other algorithms on functions [TeX:] $$\mathrm{F}_{3}-\mathrm{F}_{7}.$$ In particular, its results on [TeX:] $$\mathrm{F}_{3}-\mathrm{F}_{5}$$ are significantly better than those of the other algorithms, reaching an accuracy the others find difficult to achieve. Although its results on F1 and F2 are not the best, PSA still maintains considerable accuracy.

To analyze the performance of the algorithms, first note that the search process of most optimization algorithms divides into two main stages: exploration and exploitation [3]. The exploration phase is responsible for locating the approximate region of the global optimum, while the exploitation phase refines the search within that region. PSA also has these two phases. Since PSA itself converges well, it can quickly find the region containing the global optimum; however, it is impractical for most algorithms to excel at exploration and exploitation simultaneously. From Eq. (9), after the Observation step the location of a search agent changes around its current location, which helps it avoid falling into a local optimum. The same step, however, also makes agents drift around the global best, introducing some bias: the Observation step buys excellent exploration performance, and its effect is visible, but at the cost of some exploitation performance. This trade-off explains the results in Table 3. For F1, F2, F6, and F7, PSA performs excellently and stably locates the global optimum across iterations, showing strong exploration; because its exploitation is slightly weaker, though, it cannot refine the global optimum as deeply as WOA does. For F3, F4, and F5, these functions can present low or even zero gradients during the search, and here PSA's exploration performance is fully reflected.
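The exploration effect described above can be illustrated with a hypothetical perturbation rule; the paper's actual Eq. (9) is not reproduced in this section, so the uniform jitter and its radius below are purely assumptions for illustration:

```python
import random

def observe(position, radius=0.5):
    # Hypothetical observation-style move: jitter each coordinate around the
    # current location. Such a move helps escape local optima (exploration)
    # but also scatters agents around the global best, costing some
    # exploitation accuracy, as discussed above.
    return [xi + random.uniform(-radius, radius) for xi in position]

random.seed(2)
x = [1.0, 1.0, 1.0]
x_new = observe(x)
# every coordinate stays within `radius` of its previous value
assert all(abs(a - b) <= 0.5 for a, b in zip(x, x_new))
```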

Finally, considering the convergence of the algorithm, it can be said that PSA performs quite well when dealing with unimodal functions.

[TeX:] $$\mathrm{F}_{8}-\mathrm{F}_{23}$$ are multimodal benchmark functions. Most are structurally complex and have many local optima, which poses a considerable test of an algorithm's convergence performance. These multimodal functions are, however, more closely related to models of real-life problems; in other words, they can be used to evaluate the robustness of meta-heuristic algorithms.

Table 4 shows the multimodal function test results. Compared with the other algorithms, PSA performs remarkably on [TeX:] $$\mathrm{F}_{8} / \mathrm{F}_{12} / \mathrm{F}_{14} / \mathrm{F}_{16} / \mathrm{F}_{21} / \mathrm{F}_{22} / \mathrm{F}_{23}:$$ on these functions it not only finds a better solution but also maintains a lower average value and standard deviation, showing strong stability. Although PSA does not obtain the best result on [TeX:] $$\mathrm{F}_{9} / \mathrm{F}_{10} / \mathrm{F}_{11} / \mathrm{F}_{13} / \mathrm{F}_{15} / \mathrm{F}_{17} /\mathrm{F}_{18} / \mathrm{F}_{19} / \mathrm{F}_{20},$$ it still maintains high accuracy and stability on those tests.

Table 4.

Function | | PSA | WOA | GSA | PSO
---|---|---|---|---|---
[TeX:] $$\mathrm{F}_{8}$$ | Best | -12569.2458 | -12569.3352 | -3126.6594 | -10176.3344
 | Avg | -11648.5512 | -9811.9027 | -2368.9008 | -8807.4347
 | Std | 1230.4314 | 1863.4704 | 438.0826 | 699.2045
[TeX:] $$\mathrm{F}_{9}$$ | Best | 0.5065 | 0 | 20.8941 | 53.8434
 | Avg | 7.3763 | 0 | 43.7314 | 135.6683
 | Std | 9.1989 | 0 | 11.7495 | 42.8937
[TeX:] $$\mathrm{F}_{10}$$ | Best | 0.3610 | 8.88E-16 | 3.3895 | 0.6043
 | Avg | 1.6766 | 6.10E-15 | 6.7118 | 10.1080
 | Std | 0.9929 | 5.09E-15 | 1.9822 | 5.9491
[TeX:] $$\mathrm{F}_{11}$$ | Best | 0.0073 | 0 | 125.3078 | 0.0003
 | Avg | 0.5294 | 0 | 168.2118 | 9.0238
 | Std | 0.6102 | 0 | 22.7005 | 27.4559
[TeX:] $$\mathrm{F}_{12}$$ | Best | 0.0014 | 0.0282 | 1.8786 | 0.7190
 | Avg | 0.1716 | 0.1073 | 6.2125 | 5.2367
 | Std | 0.2706 | 0.0568 | 3.0004 | 2.3537
[TeX:] $$\mathrm{F}_{13}$$ | Best | 0.0104 | 0.2457 | 6.6689 | -0.6639
 | Avg | 1.5458 | 0.9250 | 65.7351 | 11.3909
 | Std | 3.3136 | 0.3604 | 136.0469 | 9.3172
[TeX:] $$\mathrm{F}_{14}$$ | Best | 0.3769 | 0.3769 | 0.3769 | 0.3769
 | Avg | 0.4802 | 0.5973 | 0.7245 | 0.4276
 | Std | 0.1158 | 0.3391 | 0.3414 | 0.1349
[TeX:] $$\mathrm{F}_{15}$$ | Best | 0.0003 | 0.0003 | 0.0010 | 0.0003
 | Avg | 0.0077 | 0.0011 | 0.0063 | 0.0075
 | Std | 0.0224 | 0.0016 | 0.0053 | 0.0093
[TeX:] $$\mathrm{F}_{16}$$ | Best | -1.0316 | -1.0316 | -1.0316 | -1.0316
 | Avg | -1.0316 | -1.0316 | -1.0316 | -1.0316
 | Std | 2.33E-07 | 3.58E-08 | 4.52E-16 | 6.18E-16
[TeX:] $$\mathrm{F}_{17}$$ | Best | 0.3979 | 0.3979 | 0.3979 | 0.3979
 | Avg | 0.3979 | 0.3981 | 0.3979 | 0.3979
 | Std | 1.41E-07 | 0.0003 | 0 | 0
[TeX:] $$\mathrm{F}_{18}$$ | Best | 3 | 3 | 3 | 3
 | Avg | 3 | 5.7120 | 3 | 3
 | Std | 1.36E-05 | 8.2702 | 7.36E-15 | 1.61E-15
[TeX:] $$\mathrm{F}_{19}$$ | Best | -3.8628 | -3.8628 | -3.8628 | -3.8628
 | Avg | -3.8556 | -3.8162 | -3.8603 | -3.8623
 | Std | 0.0153 | 0.1404 | 0.0050 | 0.0020
[TeX:] $$\mathrm{F}_{20}$$ | Best | -3.3080 | -3.3143 | -3.3220 | -3.3220
 | Avg | -3.0403 | -3.2145 | -3.3071 | -3.2134
 | Std | 0.1940 | 0.1050 | 0.0459 | 0.2948
[TeX:] $$\mathrm{F}_{21}$$ | Best | -10.1531 | -10.1495 | -10.1532 | -10.1532
 | Avg | -9.7302 | -7.3740 | -5.9719 | -5.5585
 | Std | 1.1347 | 2.5282 | 3.7303 | 3.2203
[TeX:] $$\mathrm{F}_{22}$$ | Best | -10.4029 | -10.3394 | -10.4029 | -10.4029
 | Avg | -9.8628 | -6.1788 | -9.1374 | -5.1888
 | Std | 1.2894 | 2.5013 | 2.5956 | 3.2755
[TeX:] $$\mathrm{F}_{23}$$ | Best | -10.5364 | -10.5252 | -10.5364 | -10.5364
 | Avg | -9.8189 | -5.5032 | -8.1614 | -6.3000
 | Std | 1.8027 | 2.7365 | 3.5931 | 3.8058

Combined with the unimodal tests, PSA's excellent exploration ability carries over to the multimodal functions. However, to achieve a faster convergence speed, PSA must sacrifice some exploitation ability, so its results on some functions are not the best, though its performance remains excellent.

Functions [TeX:] $$\mathrm{F}_{8}-\mathrm{F}_{13}$$ are ordinary multimodal functions. On [TeX:] $$\mathrm{F}_{8} \text { and } \mathrm{F}_{12}$$ PSA's performance is excellent, while on [TeX:] $$\mathrm{F}_{9}-\mathrm{F}_{11} \text { and } \mathrm{F}_{13}$$ it is not the best but remains at a reasonable level. As can be seen from Table 4, PSA gets very close to the global optimum while maintaining excellent convergence performance, so it can suit computing scenarios that demand short computing times.

Although functions [TeX:] $$\mathrm{F}_{14}-\mathrm{F}_{23}$$ are fixed-dimension multimodal functions, each algorithm's performance on them is roughly the same as on [TeX:] $$\mathrm{F}_{8}-\mathrm{F}_{13};$$ it is worth noting, however, that PSA is better and more stable on [TeX:] $$\mathrm{F}_{21}-\mathrm{F}_{23}.$$

As can be seen from Fig. 5, PSA still converges well on [TeX:] $$\mathrm{F}_{8}-\mathrm{F}_{19}$$ and holds an advantage over the other algorithms. On [TeX:] $$\mathrm{F}_{20},$$ PSA converges rapidly, but its accuracy is not very good. On [TeX:] $$\mathrm{F}_{21}-\mathrm{F}_{23},$$ however, it performs very well: as the figure shows, PSA maintains excellent convergence, accuracy, and convergence speed on every search of these functions.

A unimodal function has only one extreme value (minimum or maximum) within a given range, so there is no need to worry about the algorithm getting stuck at a local extremum while exploring the search space; instead, convergence speed and accuracy should be the primary considerations. Based on these characteristics of unimodal functions, we improve PSA into an algorithm specialized for unimodal functions, which we call UFPSA.

Starting from PSA, the only changes required are to the photon position formula and its position-update formula. The modifications are as follows.

UFPSA updates the location of photon i by the following formula:

where De adjusts the convergence of the search process. Based on several experiments, we set ext = 2 and b = 1.5 in this paper.

The position of the photon is calculated according to the following formula:

where Normrnd(t) denotes a random number drawn from the normal distribution with mean 0 and standard deviation 0.6.
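Normrnd(t) as defined here is simply a draw from N(0, 0.6²); in Python this corresponds to `random.gauss` (an illustration, not the paper's code):

```python
import random
import statistics

def normrnd():
    # Random number from a normal distribution with mean 0 and std dev 0.6,
    # matching the Normrnd(t) definition above
    return random.gauss(0.0, 0.6)

random.seed(3)
samples = [normrnd() for _ in range(10000)]
# the sample mean is close to 0 and the sample std dev close to 0.6
print(round(statistics.mean(samples), 3))
print(round(statistics.stdev(samples), 3))
```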

Table 5.

 | | [TeX:] $$\mathbf{F}_{1}$$ | [TeX:] $$\mathbf{F}_{2}$$ | [TeX:] $$\mathbf{F}_{3}$$ | [TeX:] $$\mathbf{F}_{4}$$ | [TeX:] $$\mathbf{F}_{5}$$ | [TeX:] $$\mathbf{F}_{6}$$ | [TeX:] $$\mathbf{F}_{7}$$
---|---|---|---|---|---|---|---|---
UFPSA | Best | 7.69E-17 | 2.83E-09 | 2.00E-15 | 2.67E-09 | 0.029 | 0 | 6.47E-06
 | Avg | 1.19E-13 | 1.59E-07 | 4.55E-12 | 8.30E-08 | 20.6908 | 0 | 0
 | Std | 2.03E-13 | 1.18E-07 | 1.14E-11 | 8.96E-08 | 10.6485 | 0 | 0
PSA | Best | 0.0826 | 0.4588 | 0.4477 | 0.1689 | 3.0058 | 0 | 0.0059
 | Avg | 13.3908 | 1.7826 | 4588.8400 | 1.1565 | 153.8715 | 32.9667 | 0.0238
 | Std | 16.0383 | 1.3826 | 5382.1010 | 0.9413 | 193.0914 | 45.1736 | 0.0155

**Experiments and Evaluations of UFPSA**

We tested UFPSA and PSA in the same environment (Dim = 30, N = 20, MaxIter = 300). As Fig. 6 and Table 5 show, UFPSA clearly converges faster and maintains higher accuracy than PSA on unimodal functions. With the dimension equal to 30, UFPSA converges to high accuracy within very few iterations, which suits it to optimization tasks where computing time is tightly constrained.

This paper proposes a brand-new meta-heuristic algorithm, PSA, which is based on the quantum properties of the photon. According to the test results on 23 benchmark functions, PSA has excellent convergence and a strong global search ability. Further, PSA always decreases to an acceptable accuracy within a few iterations, on both unimodal and multimodal functions, which implies that the algorithm can reach a good approximate solution in a shorter time. Its strong global search capability also makes PSA competitive against other algorithms. However, PSA still has room for improvement: for instance, while preserving its convergence, its accuracy on multimodal functions could be further improved.

He received his bachelor's degree and doctoral degree in engineering from Beihang University in 2003 and 2010, respectively. He is an associate professor and master's supervisor whose main research fields are information retrieval, data mining, and big data. He is a member of the Association for Computing Machinery (ACM) and the China Computer Federation (CCF), and a backbone teacher of institutions of higher learning in Henan Province.

- 1 J. Kennedy, R. Eberhart, "Particle swarm optimization," in *Proceedings of the IEEE International Conference on Neural Networks*, Perth, Australia, 1995, pp. 1942-1948.
- 2 D. Karaboga, B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," *Journal of Global Optimization*, vol. 39, no. 3, pp. 459-471, 2007.
- 3 S. Mirjalili, A. Lewis, "The whale optimization algorithm," *Advances in Engineering Software*, vol. 95, pp. 51-67, 2016.
- 4 M. Dorigo, M. Birattari, T. Stutzle, "Ant colony optimization," *IEEE Computational Intelligence Magazine*, vol. 1, no. 4, pp. 28-39, 2006.
- 5 P. C. Pinto, T. A. Runkler, J. M. Sousa, "Wasp swarm algorithm for dynamic MAX-SAT problems," in *Adaptive and Natural Computing Algorithms*. Heidelberg: Springer, 2007, pp. 350-357.
- 6 A. Mucherino, O. Seref, "Monkey search: a novel metaheuristic search for global optimization," in *AIP Conference Proceedings*, vol. 953, no. 1, pp. 162-173, 2007.
- 7 A. Sharma, A. Sharma, B. K. Panigrahi, D. Kiran, R. Kumar, "Ageist spider monkey optimization algorithm," *Swarm and Evolutionary Computation*, vol. 28, pp. 58-77, 2016.
- 8 Q. Zhou, Y. Q. Zhou, "Wolf colony search algorithm based on leader strategy," *Application Research of Computers*, vol. 30, no. 9, pp. 2629-2632, 2013.
- 9 C. R. Hwang, "Simulated Annealing: Theory and Applications by P. J. M. Van Laarhoven and E. H. Aarts, 1987," *Acta Applicandae Mathematica*, vol. 12, pp. 108-111, 1988.
- 10 E. Rashedi, H. Nezamabadi-pour, S. Saryazdi, "GSA: a gravitational search algorithm," *Information Sciences*, vol. 179, no. 13, pp. 2232-2248, 2009.
- 11 H. Shah-Hosseini, "Principal components analysis by the galaxy-based search algorithm: a novel metaheuristic for continuous optimization," *International Journal of Computational Science and Engineering*, vol. 6, no. 1-2, pp. 132-140, 2011.
- 12 A. Kaveh, S. Talatahari, "A novel heuristic optimization method: charged system search," *Acta Mechanica*, vol. 213, no. 3-4, pp. 267-289, 2010.
- 13 R. A. Formato, "Central force optimization: a new deterministic gradient-like optimization metaheuristic," *Opsearch*, vol. 46, no. 1, pp. 25-51, 2009.
- 14 A. Hatamlou, "Black hole: a new heuristic optimization approach for data clustering," *Information Sciences*, vol. 222, pp. 175-184, 2013.
- 15 H. Huang, M. Zhu, J. Wang, "An improved artificial bee colony algorithm based on special division and intellective search," *Journal of Information Processing Systems*, vol. 15, no. 2, pp. 433-439, 2019.
- 16 L. Zhao, Y. Long, "An improved PSO algorithm for the classification of multiple power quality disturbances," *Journal of Information Processing Systems*, vol. 15, no. 1, pp. 116-126, 2019.
- 17 X. Song, M. Zhao, Q. Yan, S. Xing, "A high-efficiency adaptive artificial bee colony algorithm using two strategies for continuous optimization," *Swarm and Evolutionary Computation*, vol. 50, article no. 100549, 2019.
- 18 M. R. Chen, J. H. Chen, G. Q. Zeng, K. D. Lu, X. F. Jiang, "An improved artificial bee colony algorithm combined with extremal optimization and Boltzmann selection probability," *Swarm and Evolutionary Computation*, vol. 49, pp. 158-177, 2019.
- 19 M. Li, D. Lei, J. Cai, "Two-level imperialist competitive algorithm for energy-efficient hybrid flow shop scheduling problem with relative importance of objectives," *Swarm and Evolutionary Computation*, vol. 49, pp. 34-43, 2019.
- 20 H. Kragh, "Max Planck: the reluctant revolutionary," *Physics World*, vol. 13, no. 12, pp. 31-36, 2000.
- 21 A. Einstein, "Uber einen die Erzeugung und Verwandlung des Lichtes betreffenden heuristischen Gesichtspunkt," *Annalen der Physik*, vol. 322, no. 6, pp. 132-148, 1905.
- 22 L. de Broglie, "On the theory of quanta," *Annales de Physique*, vol. 10, no. 3, pp. 22-128, 1925.