## Yanjiao Wang, Huanhuan Tao and Zhuang Ma

Table 1.

| Function | D | ρ (%) | LI | NI | NE | a |
|---|---|---|---|---|---|---|
| g01 | 13 | 0.0111 | 9 | 0 | 0 | 6 |
| g02 | 20 | 99.9971 | 0 | 2 | 0 | 1 |
| g03 | 10 | 0.0000 | 0 | 0 | 1 | 1 |
| g04 | 5 | 52.1230 | 0 | 6 | 0 | 2 |
| g05 | 4 | 0.0000 | 2 | 0 | 3 | 3 |
| g06 | 2 | 0.0066 | 0 | 2 | 0 | 2 |
| g07 | 10 | 0.0003 | 3 | 5 | 0 | 6 |
| g08 | 2 | 0.8560 | 0 | 2 | 0 | 0 |
| g09 | 7 | 0.5121 | 0 | 4 | 0 | 2 |
| g10 | 8 | 0.0010 | 3 | 3 | 0 | 6 |
| g11 | 2 | 0.0000 | 0 | 0 | 1 | 1 |
| g12 | 3 | 4.7713 | 0 | 9^3 | 0 | 0 |
| g13 | 5 | 0.0000 | 0 | 0 | 3 | 3 |

The performance of the proposed _SOSMS is tested on 13 well-known benchmark functions [16]. Table 1 lists the details of the 13 benchmark functions: D is the number of decision variables, ρ is the estimated ratio of the feasible region to the search space, LI is the number of linear inequality constraints, NI is the number of nonlinear inequality constraints, NE is the number of nonlinear equality constraints, and a is the number of constraints active at the optimal solution.
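The ratio ρ is conventionally estimated by uniform random sampling of the search space (as in [16]). A minimal sketch, using an illustrative toy constraint rather than one of the g-functions:

```python
import random

def estimate_rho(is_feasible, bounds, n_samples=100_000, seed=0):
    """Estimate rho = |feasible region| / |search space| (in percent)
    by drawing uniform samples within the box `bounds` and counting
    how many satisfy all constraints."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if is_feasible(x):
            hits += 1
    return 100.0 * hits / n_samples

# Toy illustration (not one of the g-functions): the unit disc inside
# [-1, 1]^2, whose exact feasible ratio is pi/4, i.e. about 78.54 %.
rho = estimate_rho(lambda x: x[0] ** 2 + x[1] ** 2 <= 1.0,
                   [(-1.0, 1.0), (-1.0, 1.0)])
print(round(rho, 1))
```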

In order to verify the performance of _SOSMS, it is compared with _SOS, MABC [8], DE [9], IPES [11], and I-ABC [12], where _SOS is an improved SOS algorithm based on the improved constraint-handling method of Section 4.2. The parameters of _SOSMS are set as follows: = 0.0001, n = 1.1, p1 = 0.8, p2 = 0.9. For each test function, each algorithm runs 30 times independently with a population size of 50. DE is stopped when the number of function evaluations reaches $2.0 \times 10^{5}$, and the other algorithms are stopped when the number of function evaluations reaches $2.4 \times 10^{5}$. All experiments are executed on Windows 8.1 with an Intel Core i5-3337U CPU (1.8 GHz) and 4.0 GB of RAM.
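The evaluation protocol above can be sketched as a small harness. Here `toy_optimizer` is a hypothetical stand-in for any of the compared algorithms, not an implementation of _SOSMS:

```python
import random

def run_trials(optimize, n_runs=30, max_fes=240_000, pop_size=50):
    """Run an optimizer independently n_runs times and collect the best
    objective value of each run, mirroring the protocol above."""
    return [optimize(seed=s, max_fes=max_fes, pop_size=pop_size)
            for s in range(n_runs)]

# Stand-in "optimizer": random search on a 1-D quadratic (illustration
# only, NOT one of the compared algorithms). It spends its evaluation
# budget on uniform samples and returns the best value found.
def toy_optimizer(seed, max_fes, pop_size):
    rng = random.Random(seed)
    return min((rng.uniform(-5.0, 5.0) - 1.0) ** 2
               for _ in range(max_fes // pop_size))

vals = run_trials(toy_optimizer, max_fes=10_000)
print(len(vals), min(vals))
```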

The experimental results of _SOSMS and the other algorithms are summarized in Table 2. Best, Worst, Mean, and Variance denote the best value, the worst value, the average value, and the standard deviation of the 30 independent runs, respectively. The first three indicators are used to compare convergence accuracy, and the variance is used to compare stability.
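Under the reading that "variance" in Table 2 denotes the standard deviation of the 30 run results, the four indicators can be computed as:

```python
import statistics

def summarize(run_results):
    """Compute the four indicators of Table 2 from the best objective
    value of each independent run (minimization): Best, Worst, Mean,
    and Variance (reported, per the text above, as standard deviation)."""
    return {
        "best":     min(run_results),
        "worst":    max(run_results),
        "mean":     statistics.mean(run_results),
        "variance": statistics.stdev(run_results),  # sample std dev
    }

# Illustrative run results, not taken from the paper's experiments.
stats = summarize([-15.0, -14.9, -15.0, -14.8])
print(stats)
```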

Table 2.

| Function/Theoretical optimum | Algorithm | Best | Worst | Mean | Variance |
|---|---|---|---|---|---|
| g01/-15.000 | _SOSMS | -15.000 | -15.000 | -15.000 | 7.1E-18 |
| | _SOS | -14.995 | -14.849 | -14.957 | 5.0E-02 |
| | DE | -15.000 | -15.000 | -15.000 | 5.8E-14 |
| | I-ABC | -15.000 | -15.000 | -15.000 | 4.6E-16 |
| | MABC | -15.000 | -15.000 | -15.000 | 0.0E+00 |
| | IPES | -14.999 | -14.999 | -14.999 | 3.7E-15 |
| g02/0.803619 | _SOSMS | -0.799317 | -0.785304 | -0.799015 | 5.3E-04 |
| | _SOS | -0.690979 | -0.564224 | -0.637003 | 3.9E-02 |
| | DE | -0.803619 | -0.792608 | -0.803004 | 2.5E-02 |
| | I-ABC | -0.803619 | -0.778278 | -0.800094 | 7.1E-03 |
| | MABC | -0.801982 | -0.749797 | -0.792412 | 1.2E-02 |
| | IPES | -0.803607 | -0.792771 | -0.769193 | 7.9E-03 |
| g03/-1.000 | _SOSMS | -1.000 | -1.000 | -1.000 | 0.0E+00 |
| | _SOS | -1.000 | -0.937 | -0.989 | 1.9E-02 |
| | DE | -1.000 | -1.000 | -1.000 | 3.9E-06 |
| | I-ABC | -1.000 | -1.000 | -0.999 | 3.4E-06 |
| | MABC | -1.000 | -1.000 | -1.000 | 0.0E+00 |
| | IPES | -1.000 | -1.000 | -1.000 | 9.3E-06 |
| g04/-30665.539 | _SOSMS | -30665.539 | -30665.539 | -30665.539 | 0.0E+00 |
| | _SOS | -30665.539 | -30665.539 | -30665.539 | 4.2E-12 |
| | DE | -30665.539 | -30665.539 | -30665.539 | 2.1E-05 |
| | I-ABC | -30665.539 | -30665.539 | -30665.539 | 1.0E-05 |
| | MABC | -30665.539 | -30665.539 | -30665.539 | 0.0E+00 |
| | IPES | -30665.539 | -30665.539 | -30665.539 | 5.7E-12 |
| g05/5126.498 | _SOSMS | 5126.498 | 5126.498 | 5126.498 | 2.3E-07 |
| | _SOS | 5126.551 | 5468.185 | 5245.576 | 1.1E+02 |
| | DE | 5126.498 | 5126.498 | 5126.498 | 1.7E-05 |
| | I-ABC | 5126.498 | 5148.944 | 5131.861 | 5.5E+00 |
| | MABC | 5126.484 | 5438.387 | 5185.714 | 7.5E+01 |
| | IPES | 5126.498 | 5139.003 | 5197.991 | 2.4E+00 |
| g06/-6961.814 | _SOSMS | -6961.814 | -6961.814 | -6961.814 | 4.1E-13 |
| | _SOS | -6961.747 | -6953.748 | -6959.364 | 2.5E+00 |
| | DE | -6961.814 | -6961.814 | -6961.814 | 2.3E-08 |
| | I-ABC | -6961.814 | -6961.814 | -6961.814 | 1.8E-12 |
| | MABC | -6961.814 | -6961.805 | -6961.813 | 2.0E-03 |
| | IPES | -6961.814 | -6961.814 | -6961.814 | 3.8E-12 |
| g07/24.306 | _SOSMS | 24.306 | 24.306 | 24.306 | 5.2E-03 |
| | _SOS | 24.869 | 34.704 | 28.442 | 3.0E+00 |
| | DE | 24.306 | 24.306 | 24.306 | 6.3E-06 |
| | I-ABC | 24.311 | 24.677 | 24.366 | 6.9E-02 |
| | MABC | 24.330 | 25.190 | 24.473 | 1.9E-01 |
| | IPES | 24.307 | 24.316 | 24.333 | 6.6E-03 |
| g08/-0.095825 | _SOSMS | -0.095825 | -0.095825 | -0.095825 | 0.0E+00 |
| | _SOS | -0.095825 | -0.095825 | -0.095825 | 5.9E-10 |
| | DE | -0.095825 | -0.095825 | -0.095825 | 8.4E-17 |
| | I-ABC | -0.095825 | -0.095825 | -0.095825 | 2.8E-17 |
| | MABC | -0.095825 | -0.095825 | -0.095825 | 0.0E+00 |
| | IPES | 24.307 | 24.316 | 24.333 | 6.6E-03 |
| g09/680.630 | _SOSMS | 680.630 | 680.630 | 680.630 | 7.9E-05 |
| | _SOS | 681.374 | 687.418 | 683.451 | 2.3E+00 |
| | DE | 680.630 | 680.630 | 680.630 | 2.2E-07 |
| | I-ABC | 680.631 | 680.637 | 680.633 | 1.5E-03 |
| | MABC | 680.634 | 680.653 | 680.640 | 4.0E-03 |
| | IPES | 680.630 | 680.673 | 680.639 | 4.3E-02 |
| g10/7049.248 | _SOSMS | 7049.248 | 7158.173 | 7077.542 | 1.3E+01 |
| | _SOS | 7506.216 | 9816.488 | 8583.910 | 1.7E+02 |
| | DE | 7049.248 | 7049.248 | 7049.248 | 9.0E-06 |
| | I-ABC | 7049.321 | 7285.383 | 7124.042 | 5.9E+01 |
| | MABC | 7053.904 | 7604.132 | 7224.407 | 1.3E+02 |
| | IPES | 7051.341 | 7210.360 | 7376.721 | 4.6E+02 |
| g11/0.750 | _SOSMS | 0.750 | 0.750 | 0.750 | 0.0E+00 |
| | _SOS | 0.750 | 0.766 | 0.752 | 5.1E-03 |
| | DE | 0.750 | 0.750 | 0.750 | 6.9E-14 |
| | I-ABC | 0.750 | 0.750 | 0.750 | 2.4E-06 |
| | MABC | 0.750 | 0.750 | 0.750 | 0.0E+00 |
| | IPES | 0.750 | 0.750 | 0.750 | 7.9E-04 |
| g12/-1.000 | _SOSMS | -1.000 | -1.000 | -1.000 | 0.0E+00 |
| | _SOS | -0.293 | -0.293 | -0.293 | 6.8E-08 |
| | DE | -1.000 | -1.000 | -1.000 | 0.0E+00 |
| | I-ABC | -1.000 | -1.000 | -1.000 | 0.0E+00 |
| | MABC | -1.000 | -1.000 | -1.000 | 0.0E+00 |
| | IPES | -1.000 | -1.000 | -1.000 | 0.0E+00 |
| g13/0.053950 | _SOSMS | 0.053950 | 0.053953 | 0.053951 | 4.5E-06 |
| | _SOS | 0.054174 | 0.999820 | 0.314003 | 2.4E-01 |
| | DE | 0.053954 | 0.054884 | 0.069631 | 7.6E-02 |
| | I-ABC | 0.053958 | 0.054144 | 0.055130 | 2.7E-04 |
| | MABC | 0.760 | 1.000 | 0.968 | 5.5E-02 |
| | IPES | 0.053950 | 0.146260 | 0.453029 | 1.1E-01 |

From Table 2, several conclusions can be drawn. First, _SOSMS achieves better results than _SOS on all benchmark functions, which shows that the improved evolutionary strategy of SOS in Section 4.2 is effective. Second, _SOSMS finds the theoretical optimum on 12 of the 13 test functions, failing only on g02. DE fails to find the theoretical optimum only on g13, but the number of fitness evaluations DE requires is significantly larger than that of _SOSMS. I-ABC finds the theoretical optimum on only 9 test functions; compared with I-ABC, _SOSMS obtains a worse result only on g02. MABC finds the theoretical optimum on only 6 benchmark functions. In summary, the proposed _SOSMS is significantly superior to the other four algorithms in the ability to find theoretical optima. Third, except on g10, the variance of _SOSMS is almost zero, and on most benchmark functions the variance of _SOSMS is smaller than that of the other algorithms, which shows that the proposed algorithm is more stable.
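For context, the mutualism phase of the baseline SOS [15], which the improved strategy of Section 4.2 builds on, can be sketched as follows. This is the standard published rule, not the paper's improved one:

```python
import random

def mutualism(x_i, x_j, x_best, rng=random):
    """Standard SOS mutualism phase (Cheng & Prayogo): two organisms
    x_i and x_j move toward the best individual x_best via their mutual
    vector, each with a randomly chosen benefit factor of 1 or 2."""
    bf1, bf2 = rng.choice([1, 2]), rng.choice([1, 2])
    mutual = [(a + b) / 2.0 for a, b in zip(x_i, x_j)]
    new_i = [a + rng.random() * (xb - m * bf1)
             for a, m, xb in zip(x_i, mutual, x_best)]
    new_j = [b + rng.random() * (xb - m * bf2)
             for b, m, xb in zip(x_j, mutual, x_best)]
    return new_i, new_j

# Illustrative call: mutual vector is [1, 1], so each coordinate of x_i
# stays in [-1, 0] and each coordinate of x_j stays in [1, 2].
ni, nj = mutualism([0.0, 0.0], [2.0, 2.0], [1.0, 1.0])
print(ni, nj)
```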

Table 3.

| Function | _SOSMS FES | _SOSMS SR (%) | MABC FES | MABC SR (%) |
|---|---|---|---|---|
| g01 | 70990 | 100 | 105370 | 100 |
| g02 | NA | 0 | 367360 | 16.67 |
| g03 | 72957 | 100 | 98560 | 100 |
| g04 | 24530 | 100 | 74380 | 100 |
| g05 | 97430 | 100 | 308320 | 63.33 |
| g06 | 12190 | 100 | 140160 | 96.67 |
| g07 | 73051 | 100 | NA | 0 |
| g08 | 1890 | 100 | 2750 | 100 |
| g09 | 71763 | 100 | NA | 0 |
| g10 | 176820 | 66.67 | NA | 0 |
| g11 | 32418 | 100 | 4730 | 100 |
| g12 | 4760 | 100 | 5280 | 100 |
| g13 | 46803 | 100 | NA | 0 |

In order to verify the convergence speed of _SOSMS, it is compared with MABC, which has the best optimization performance among the above comparison algorithms. Both algorithms are run 30 times independently, and the results are shown in Table 3. FES denotes the average number of function evaluations needed to first reach the theoretical optimum, SR denotes the success rate of reaching the theoretical optimum over the 30 independent runs, and NA indicates that the algorithm failed to reach the theoretical optimum within the allowed number of function evaluations.
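Assuming FES is averaged over successful runs only (with NA reported when no run succeeds), the two indicators of Table 3 can be computed as:

```python
def fes_and_sr(first_hit_evals, n_runs=30):
    """Compute the Table 3 indicators from per-run results.
    `first_hit_evals` holds, for each SUCCESSFUL run, the evaluation
    count at which the theoretical optimum was first reached; failed
    runs are omitted. Returns (FES, SR%), with FES = None meaning NA."""
    if not first_hit_evals:
        return None, 0.0
    fes = sum(first_hit_evals) / len(first_hit_evals)
    sr = 100.0 * len(first_hit_evals) / n_runs
    return fes, sr

# Illustrative data: 30 successful runs out of 30.
fes, sr = fes_and_sr([1800, 1900, 1970] * 10)
print(fes, sr)
```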

From Table 3, _SOSMS finds the optimal value on 12 benchmark functions, with a success rate of 100% on 11 of them, whereas MABC finds the optimal value on only 9 benchmark functions, with a success rate of 100% on only 6. This shows that _SOSMS is more robust than MABC. In addition, except on g02 and g11, the number of function evaluations _SOSMS needs to find the theoretical optimum is significantly smaller than that of MABC, which shows that _SOSMS is superior to MABC in convergence speed.

The above experiments show that _SOSMS has clear advantages in both optimization ability and convergence speed.

In this paper, an improved symbiotic organisms search algorithm with a mixed strategy based on an adaptive constraint-handling method (_SOSMS) is proposed. First, an adaptive constraint-handling method is proposed to take full advantage of infeasible individuals, which improves the search capability and helps the algorithm avoid local optima. Second, the evolutionary strategy of SOS is improved to exploit the optimal individual, accelerating convergence and strengthening the search. Experiments on 13 benchmark functions show that _SOSMS performs better than the other algorithms.
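The paper's adaptive constraint-handling rule is not reproduced here; as a sketch in the spirit of the ε-constrained methods it relates to [9, 10, 14], the following shows how slightly infeasible individuals can still compete on objective value. The function names, the equality tolerance `delta`, and the fixed `eps` are illustrative assumptions, not the paper's scheme:

```python
def total_violation(g_values, h_values, delta=1e-4):
    """Sum of constraint violations for inequalities g(x) <= 0 and
    equalities h(x) = 0, the latter relaxed to |h(x)| <= delta (a
    common tolerance for the g-function suite)."""
    return (sum(max(0.0, g) for g in g_values)
            + sum(max(0.0, abs(h) - delta) for h in h_values))

def epsilon_better(f1, v1, f2, v2, eps):
    """Generic epsilon-level comparison (Takahama & Sakai style): when
    both violations are within eps, compare by objective value, so good
    but slightly infeasible points still guide the search."""
    if v1 <= eps and v2 <= eps:
        return f1 < f2          # both epsilon-feasible: objectives decide
    if v1 == v2:
        return f1 < f2
    return v1 < v2              # otherwise, smaller violation wins

# A slightly infeasible point with a better objective beats a feasible one:
print(epsilon_better(f1=10.0, v1=0.05, f2=12.0, v2=0.0, eps=0.1))
```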

This work was supported in part by the National Natural Science Foundation of China (No. 61501107), the Education Department of Jilin Province science and technology research project of the "13th Five-Year" in 2016 (No. 95), and the Project of Scientific and Technological Innovation Development of Jilin (No. 201750219, 201750227).

She received her M.S. degree in Signal and Information Processing from Harbin Engineering University in 2010 and her Ph.D. degree in Signal and Information Processing from Harbin Engineering University in 2013. She then joined Northeast Electric Power University as a teacher; her main research interests include evolutionary computing and intelligent information processing.

She is currently pursuing the master's degree in the School of Electrical Engineering, Northeast Electric Power University. Her research interests are evolutionary computing and intelligent information processing.

He is currently pursuing the master's degree in the School of Electrical Engineering, Northeast Electric Power University. His research interests are evolutionary computing and intelligent information processing.

- [1] Z. Y. Li, T. Huang, S. M. Chen, and R. F. Li, "Overview of constrained optimization evolutionary algorithms," *Journal of Software*, vol. 28, no. 6, pp. 1529-1546, 2017.
- [2] D. H. Xia, Y. X. Li, W. Y. Gong, and G. L. He, "An adaptive differential evolution algorithm for constrained optimization problems," *Acta Electronica Sinica*, vol. 44, no. 10, pp. 2535-2542, 2016.
- [3] H. C. Liu and Z. J. Wu, "Differential evolution algorithm using rotation-based learning," *Acta Electronica Sinica*, vol. 31, no. 10, pp. 2040-2046, 2015.
- [4] H. Zhou, H. Zhao, M. Li, and Y. Cai, "Multi-strategy adaptive symbiotic organisms search algorithm," *Journal of Air Force Engineering University (Natural Science Edition)*, vol. 17, no. 4, pp. 101-106, 2016.
- [5] A. K. Ojha and Y. R. Naidu, "Hybridizing particle swarm optimization with invasive weed optimization for solving nonlinear constrained optimization problems," in *Proceedings of the Fourth International Conference on Soft Computing for Problem Solving*, New Delhi, India: Springer, 2015, pp. 599-610.
- [6] W. Gong, Z. Cai, and D. Liang, "Adaptive ranking mutation operator based differential evolution for constrained optimization," *IEEE Transactions on Cybernetics*, vol. 45, no. 4, pp. 716-727, 2014. doi: 10.1109/TCYB.2014.2334692
- [7] W. Long, W. Z. Zhang, Y. F. Huang, and Y. X. Chen, "A hybrid cuckoo search algorithm with feasibility-based rule for constrained structural optimization," *Journal of Central South University*, vol. 21, no. 8, pp. 3197-3204, 2014.
- [8] D. Karaboga and B. Akay, "A modified artificial bee colony (ABC) algorithm for constrained optimization problems," *Applied Soft Computing*, vol. 11, no. 3, pp. 3021-3031, 2011.
- [9] J. G. Zheng, X. Wang, and R. H. Liu, "Epsilon-differential evolution algorithm for constrained optimization problems," *Journal of Software*, vol. 23, no. 9, pp. 2374-2387, 2012.
- [10] X. J. Bi and L. Zhang, "Self-adaptive ε-constrained optimization algorithm," *Systems Engineering and Electronics*, vol. 37, no. 8, pp. 1909-1915, 2015.
- [11] C. G. Cui and X. F. Yang, "Interior penalty rule based evolutionary algorithm for constrained optimization," *Journal of Software*, vol. 26, no. 7, pp. 1688-1699, 2015.
- [12] Y. Liang, Z. Wan, and D. Fang, "An improved artificial bee colony algorithm for solving constrained optimization problems," *International Journal of Machine Learning and Cybernetics*, vol. 8, no. 3, pp. 739-754, 2017. doi: 10.1007/s13042-015-0357-2
- [13] Y. Wang and Z. Cai, "Combining multiobjective optimization with differential evolution to solve constrained optimization problems," *IEEE Transactions on Evolutionary Computation*, vol. 16, no. 1, pp. 117-134, 2012. doi: 10.1109/TEVC.2010.2093582
- [14] T. Takahama and S. Sakai, "Efficient constrained optimization by the ε constrained adaptive differential evolution," in *Proceedings of the IEEE Congress on Evolutionary Computation*, Barcelona, Spain, 2010, pp. 1-8.
- [15] M. Y. Cheng and D. Prayogo, "Symbiotic organisms search: a new metaheuristic optimization algorithm," *Computers & Structures*, vol. 139, pp. 98-112, 2014.
- [16] T. P. Runarsson and X. Yao, "Stochastic ranking for constrained evolutionary optimization," *IEEE Transactions on Evolutionary Computation*, vol. 4, no. 3, pp. 284-294, 2000. doi: 10.1109/4235.873238